A real-life table variable performance problem.

2014-09-30

G’day,

I’m sure we’ve all heard and read about the differences between table variables and temporary tables. In case you haven’t, here’s one of many articles that you can read on the subject.
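(For anyone who hasn’t seen the two side by side, the basic shape of each looks something like this – purely illustrative names, nothing to do with the process I’m about to describe:)

-- Table variable: declared like a variable; scoped to the batch or procedure, no statistics
DECLARE @Orders TABLE (OrderId INT PRIMARY KEY, Total DECIMAL(18, 2));

-- Local temporary table: created in tempdb; scoped to the session, gets statistics and can be indexed
CREATE TABLE #Orders (OrderId INT PRIMARY KEY, Total DECIMAL(18, 2));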

Anyway, there is a business process in one of my databases that calculates results via table variables. The amount of data contained in these table variables is generally fewer than 10 rows. There are quite a few table variables in this process, but again, the amount of data in each of them is pretty small.

Now, one of our business users identified a very subtle edge-case bug in the process. They went to see the developer of the code and together they fixed the bug. During testing, small amounts of data were run through the process – maybe 10 rows – then the volume was increased a little – maybe around 100 rows – but still pretty small, and everything performed well.

When the business people were satisfied that the bug was fully fixed, they decided to reprocess all the data that had potentially fallen victim to it – this was done in test, on a copy of the production database.

The user who was testing this started the process late one evening. They estimated that it would take quite some time and so left it running overnight – there were approximately 370,000 records to reprocess.

The next morning when I got into work, I found myself being asked why this process had not completed. Upon investigation I found that the process was in fact still running.

This was the first I’d heard of this process, and I was curious.

I looked at the T-SQL code that was running – it created at least seven table variables, joined some table variables to themselves and others to permanent tables.

I then asked how much data was being processed and was told approximately 370,000 rows. This immediately set off alarm bells, as I suspected that the amount of data in a table variable was at least contributing to the amazingly slow performance.
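(For anyone wondering why that rang alarm bells: table variables carry no statistics, and at the time – well before SQL Server 2019’s deferred table variable compilation – the optimizer typically assumed a table variable held about one row when it compiled the plan. Here’s a rough sketch you can try yourself to see the effect; the table and column names are made up and nothing like the real process.)

DECLARE @Results TABLE (Id INT PRIMARY KEY, Amount DECIMAL(18, 2));

-- Load a few hundred thousand rows into the table variable
INSERT INTO @Results (Id, Amount)
SELECT TOP (370000)
       ROW_NUMBER() OVER (ORDER BY (SELECT NULL)),
       1.0
FROM sys.all_columns AS a
CROSS JOIN sys.all_columns AS b;

-- With the actual execution plan switched on, the estimated number of rows
-- coming out of @Results is typically 1 rather than 370,000, so the optimizer
-- picks plans suited to tiny inputs.
SELECT r1.Id, r1.Amount
FROM @Results AS r1
INNER JOIN @Results AS r2
        ON r2.Id = r1.Id;

Adding OPTION (RECOMPILE) would at least let the optimizer see the real row count, but a table variable still gets no column statistics – which is part of why the conversion below made such a difference.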

I took a new copy of the original production database and restored it to another server. I then took the script that was being used, converted all the table variables into temp tables, and altered any references to them elsewhere in the code.
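The change itself is mechanical. Very roughly, and with made-up names – the real script is obviously a lot bigger than this:

-- Before: table variable (no statistics)
DECLARE @Staging TABLE (CustomerId INT, Amount DECIMAL(18, 2));
INSERT INTO @Staging (CustomerId, Amount)
SELECT CustomerId, Amount FROM dbo.SourceTable;   -- dbo.SourceTable is a stand-in name

-- After: local temporary table, which gets statistics and can be indexed
CREATE TABLE #Staging (CustomerId INT, Amount DECIMAL(18, 2));
CREATE INDEX IX_Staging_CustomerId ON #Staging (CustomerId);
INSERT INTO #Staging (CustomerId, Amount)
SELECT CustomerId, Amount FROM dbo.SourceTable;

-- ...every later reference to @Staging becomes #Staging...
DROP TABLE #Staging;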

I then ran the script; at first it looked like it was going to take some time.

But then, it finished after 2 minutes 27 seconds – wow.

I left the original version running just to see how long it would take, but eventually killed it after 28 hours when it still had not finished – that’s some major difference.

This was perhaps the biggest performance difference I have ever personally witnessed – down from 28+ hours to 2 minutes 27 seconds. To say the business users were delighted is an understatement.

Have a good day,

Cheers

Martin