How many rows is too many in a table variable?

  • rcnelsonmba

    SSC Eights!

    Points: 813

At what point does a table variable start to impede performance, with any gains lost, so that a temporary table becomes the much better choice?  I know that in our data warehouse/OLAP environment I NEVER use a table variable, but I have seen others use them.  Just curious what everyone thinks about this.

  • EdVassie

    SSC Guru

    Points: 60269

    The answer is: 'It depends'. 

    If your table variable gets too big to hold in memory, SQL will automatically create a temporary table as backing storage.  If your SQL has no WHERE clause and therefore always does a table scan, there probably will be no impact in having a large number of rows in your table variable.

If your SQL has a WHERE clause, it could get better performance by exploiting indexes.  You can create indexes on a temporary table but not on a TV (apart from the PK).  However, SQL will not use an index if the number of pages in the table is so low that a table scan will not harm performance.  Therefore the break point comes when the table grows above about 500KB, when SQL would normally determine that index access is faster than a table scan.  To translate this to an approximate row count, divide 500KB by your row length.  If you know you will always have fewer rows than this number, then a TV will not harm performance; otherwise consider using a temporary table.
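The indexing difference above can be sketched in T-SQL (table and column names here are made up for illustration; the inline INDEX syntax in the table variable declaration requires SQL Server 2014 or later):

```sql
-- A temporary table accepts secondary indexes after creation:
CREATE TABLE #Orders
(
    OrderID    int      NOT NULL PRIMARY KEY,
    CustomerID int      NOT NULL,
    OrderDate  datetime NOT NULL
);
CREATE NONCLUSTERED INDEX IX_Orders_Customer ON #Orders (CustomerID);  -- fine

-- A table variable only gets indexes through its declaration:
DECLARE @Orders TABLE
(
    OrderID    int NOT NULL PRIMARY KEY,   -- index via the PK constraint
    CustomerID int NOT NULL,
    INDEX IX_Customer (CustomerID)         -- inline index: SQL Server 2014+
);
-- CREATE INDEX ... ON @Orders is a syntax error on any version.
```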

Original author: https://github.com/SQL-FineBuild/Common/wiki/ 1-click install and best practice configuration of SQL Server 2019, 2017, 2016, 2014, 2012, 2008 R2, 2008 and 2005.

    When I give food to the poor they call me a saint. When I ask why they are poor they call me a communist - Archbishop Hélder Câmara

  • BarryOC

    SSC Veteran

    Points: 299

    Hello,

My recollection is that if the result set is going to be large, then a temp table may outperform a table variable. If performance is an issue, consider benchmarking each.

    Best wishes,

    Barry O'Connell

  • rcnelsonmba

    SSC Eights!

    Points: 813

This is the best info I've received.  In our data warehousing environment, we are always working with result sets in the tens of thousands of rows, sometimes in the millions.  I've been examining the stored procedures of an application sold to us by a vendor and have been questioning their use of TVs given the reporting volume we have.  Thanks again.

  • BaronVonData

    Newbie

    Points: 1

    This *was* good advice back in 2005, but now on SQL Server 2012 and above, table variables can be indexed, are dynamically indexed by default and fully accessible by the query optimiser etc. We also commonly have RAM for our SQL servers in the hundreds of gigabytes, allowing virtually any size of Table Variable to be stored in RAM.

    I just finished running a stored proc that loads 5.5 million rows into a table variable as part of a cursor loop, then writes all of those rows to a table on dbo, and it ran in 12 minutes vs 30 minutes with a temp table.
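For anyone who wants to try reproducing this kind of comparison, a minimal harness along these lines might do (the row count, column list and commented-out target table are placeholders, not BaronVonData's actual code; run each half separately so the loads don't interfere):

```sql
SET STATISTICS TIME, IO ON;

-- Table variable version
DECLARE @Rows TABLE (ID int NOT NULL PRIMARY KEY, Payload varchar(100) NOT NULL);
INSERT INTO @Rows (ID, Payload)
SELECT TOP (5500000)
       ROW_NUMBER() OVER (ORDER BY (SELECT NULL)), 'x'
FROM sys.all_columns AS a CROSS JOIN sys.all_columns AS b;
-- INSERT INTO dbo.Target SELECT ID, Payload FROM @Rows;

-- Temp table version (same load, run in a separate batch)
CREATE TABLE #Rows (ID int NOT NULL PRIMARY KEY, Payload varchar(100) NOT NULL);
INSERT INTO #Rows (ID, Payload)
SELECT TOP (5500000)
       ROW_NUMBER() OVER (ORDER BY (SELECT NULL)), 'x'
FROM sys.all_columns AS a CROSS JOIN sys.all_columns AS b;
-- INSERT INTO dbo.Target SELECT ID, Payload FROM #Rows;
DROP TABLE #Rows;
```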

  • John Mitchell-245523

    SSC Guru

    Points: 148519

    BaronVonData wrote:

    This *was* good advice back in 2005, but now on SQL Server 2012 and above, table variables can be indexed, are dynamically indexed by default and fully accessible by the query optimiser etc.

    I'm not sure that's completely correct - what do you mean by "dynamically indexed by default", for example?  The most important thing, in my opinion, is that table variables don't have statistics on them.  Have a look here for some straight-from-the-horse's-mouth information.
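The missing-statistics point is easy to see with the actual execution plan turned on (a sketch; the 100,000-row load is arbitrary):

```sql
DECLARE @t TABLE (ID int NOT NULL PRIMARY KEY);

INSERT INTO @t (ID)
SELECT TOP (100000) ROW_NUMBER() OVER (ORDER BY (SELECT NULL))
FROM sys.all_columns AS a CROSS JOIN sys.all_columns AS b;

-- The scan of @t typically shows an estimate of 1 row under the legacy
-- cardinality estimator, because @t carries no statistics.
SELECT COUNT(*) FROM @t WHERE ID > 10;

-- OPTION (RECOMPILE) lets the optimiser see the real row count (100,000),
-- though @t still has no column-level statistics or histograms.
SELECT COUNT(*) FROM @t WHERE ID > 10 OPTION (RECOMPILE);
```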

    BaronVonData wrote:

    I just finished running a stored proc that loads 5.5 million rows into a table variable as part of a cursor loop, then writes all of those rows to a table on dbo, and it ran in 12 minutes vs 30 minutes with a temp table.

    Are you in a position to post some code that we can try, please?

    John

  • Brahmanand Shukla

    Right there with Babe

    Points: 714

    John Mitchell-245523 wrote:

    BaronVonData wrote:

    I just finished running a stored proc that loads 5.5 million rows into a table variable as part of a cursor loop, then writes all of those rows to a table on dbo, and it ran in 12 minutes vs 30 minutes with a temp table.

    Are you in a position to post some code that we can try, please?

    John

    I'm also very excited to know if it's true.

An eye-opening reply for me!  Is it correct that table variables are stored in memory?  I assumed they also use tempdb the way temporary tables do.
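One way to check this for yourself: a table variable is backed by an object in tempdb, which you can observe directly (a sketch; small table variables may stay cached in the buffer pool, but the backing object is created in tempdb either way):

```sql
SELECT COUNT(*) AS tempdb_tables_before FROM tempdb.sys.tables;

DECLARE @t TABLE (ID int NOT NULL);

-- The count goes up by one: the variable's backing table now exists in
-- tempdb under a system-generated hex name.
SELECT COUNT(*) AS tempdb_tables_after FROM tempdb.sys.tables;
```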

  • Jonathan AC Roberts

    SSCoach

    Points: 17091

    BaronVonData wrote:

    This *was* good advice back in 2005, but now on SQL Server 2012 and above, table variables can be indexed, are dynamically indexed by default and fully accessible by the query optimiser etc. We also commonly have RAM for our SQL servers in the hundreds of gigabytes, allowing virtually any size of Table Variable to be stored in RAM.

    I just finished running a stored proc that loads 5.5 million rows into a table variable as part of a cursor loop, then writes all of those rows to a table on dbo, and it ran in 12 minutes vs 30 minutes with a temp table.

    So you think table variables perform better than temporary tables?! I think we all need to see some code that shows this.

    BaronVonData, can you provide a script that demonstrates this?
