Table-Valued inputs for stored procedures

  • Hi all,

    Really quick question - has anyone encountered performance issues using table-valued parameters with stored procedures? In particular with data sets that can get somewhat large - perhaps hundreds, maybe even thousands of rows?

    Basically I have two options available to me - I can pass a table-valued parameter, or I can pass a VARCHAR(MAX) string with two delimiters, such as "1/Kiran/DBA;2/Alex/Artist;6/Scott/Designer", and then do two splits in my stored proc.

    It really depends on what you're doing with that data. If you're doing any kind of filtering or joining against it, the table variable structure that table-valued parameters use can be extremely problematic because there are no statistics on it. That causes the optimizer to create very poor plans, especially if you have hundreds or thousands of rows of data. Without seeing what you're doing with it, it's hard to say whether that's the case here. However, check the execution plans.

    If you really need to pass in the data in a delimited format, make sure you look up Jeff Moden's article here on the site on using a tally table to split the data. It's the single most efficient method I'm aware of. Although, again, if you need to filter based on the incoming data, you may need to load it into temporary tables (not table variables) in order to get statistics that help the optimizer make good plan choices.
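
    For illustration only, here's a minimal sketch of the tally-table split technique against the sample string from the first post. This is not Jeff Moden's tuned DelimitedSplit8K - read his article for the production version:

        DECLARE @list VARCHAR(MAX) = '1/Kiran/DBA;2/Alex/Artist;6/Scott/Designer';

        -- Build a tally of 1..LEN(@list)+1 without a loop; cross joining two
        -- catalog views guarantees enough rows even for long strings.
        WITH Tally AS (
            SELECT TOP (LEN(@list) + 1)
                   ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS n
            FROM sys.all_columns AS c1
            CROSS JOIN sys.all_columns AS c2
        )
        -- Each element starts right after a ';' (or at position 1); the
        -- second, '/'-delimited split just applies the same pattern again.
        SELECT SUBSTRING(@list, n, CHARINDEX(';', @list + ';', n) - n) AS element
        FROM Tally
        WHERE SUBSTRING(';' + @list, n, 1) = ';';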

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

    Piling on to Grant's fine answer: whenever I see a pattern like this, I can't help thinking that something is wrong in the design. If the stored procedure is called from the same database, or even the same server, then it would probably be more efficient to store the values in a normal table and point the procedure to that table.

    😎

    Both table-valued parameters and delimited lists of values are fine when the cardinality / size is small; the problem is that the technique simply does not scale. I've seen beefy servers go bonkers when hit with a few thousand parameter values.

    You could also, of course, simply insert the values from the TVP into a properly clustered temp table, and then process using the temp table.
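
    A minimal sketch of that approach - the TVP name and its columns here are made up for illustration:

        -- Assumed TVP parameter: @Employees (EmployeeId, EmployeeName, Title)
        CREATE TABLE #Employees (
            EmployeeId   INT          NOT NULL PRIMARY KEY CLUSTERED,
            EmployeeName VARCHAR(100) NOT NULL,
            Title        VARCHAR(100) NOT NULL
        );

        INSERT INTO #Employees (EmployeeId, EmployeeName, Title)
        SELECT EmployeeId, EmployeeName, Title
        FROM @Employees;

        -- Unlike the TVP, #Employees gets statistics, so joins against it
        -- can produce realistic cardinality estimates.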

    SQL DBA, SQL Server MVP (07, 08, 09)
    A socialist is someone who will give you the shirt off *someone else's* back.

    Perhaps I should have given a bit more insight - the input for the stored procedure is a TVP because the caller is a .NET application. Rather than sending the records one at a time, I'm sending them in as one batch. I could probably write a bulk insert instead, but given that the number of records shouldn't be *that* large, I felt that the TVP would be more appropriate.
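
    For anyone following along, the SQL-side plumbing for that pattern looks roughly like this (type, proc, table, and column names are placeholders, not my actual schema):

        CREATE TYPE dbo.EmployeeList AS TABLE (
            EmployeeId   INT          NOT NULL PRIMARY KEY,
            EmployeeName VARCHAR(100) NOT NULL,
            Title        VARCHAR(100) NOT NULL
        );
        GO
        CREATE PROCEDURE dbo.SaveEmployees
            @Employees dbo.EmployeeList READONLY  -- TVPs must be declared READONLY
        AS
        -- The .NET caller fills an ADO.NET DataTable and passes it as a single
        -- SqlDbType.Structured parameter - one round trip, one batch.
        INSERT INTO dbo.Employee (EmployeeId, EmployeeName, Title)
        SELECT EmployeeId, EmployeeName, Title
        FROM @Employees;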

  • kramaswamy (1/11/2016)


    Perhaps I should have given a bit more insight - the input for the stored procedure is a TVP because the caller is a .NET application. Rather than sending the records one at a time, I'm sending them in as one batch. I could probably write a bulk insert instead, but given that the number of records shouldn't be *that* large, I felt that the TVP would be more appropriate.

    And assuming you take those records and just do an insert with them, it'll be just fine. The problems may come up when you decide that you want to UPDATE or DELETE based on those records and now have to do a JOIN operation against the TVP. Let's say there are fewer than 100 rows. It's going to scan that table, but that might not be a big deal. The question is, what does it do with the table you're JOINing to? If it scans that table as well... you could be in trouble.
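
    To make that concrete, here's a hypothetical UPDATE of the kind that can bite (table and column names invented for the example):

        UPDATE e
        SET    e.Title = t.Title
        FROM   dbo.Employee AS e
        JOIN   @Employees   AS t
               ON t.EmployeeId = e.EmployeeId;
        -- The TVP has no statistics, so the optimizer typically estimates 1 row
        -- from it. Check the actual plan: a scan of the small TVP is fine, but
        -- a scan of dbo.Employee where you expected a seek is the warning sign.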

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

    Fair point - I'll make sure that the UPDATE parts do, at the very least, index scans, if not index seeks.

    I love what TVPs bring to the party, especially when coupled with .NET, since you can seamlessly pass an ADO.NET DataTable to a stored proc - how great is that? Performance with a TVP has generally been shown to be better than methods like passing XML documents or delimited strings. (Deep Dive)

    One other performance tip: if you need to join to a TVP inside your proc, you can add OPTION (RECOMPILE) to the relevant queries. The option gives the optimizer a chance to base its estimates on the TVP's actual contents, instead of defaulting to 1 as the estimated number of rows in the TVP. The recompilation introduces some overhead, but it's usually a trade worth making: predictable plans each time you call the proc, versus random slowdowns. Your mileage may vary, but that should help remedy the biggest disadvantage of using a TVP. If you have large data volumes, or are running into horrible performance due to scans, do what Scott mentioned and bring the data into a full-fledged #temp table with proper indexing.
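
    A quick sketch of where the hint goes, reusing the hypothetical names from earlier in the thread:

        SELECT e.EmployeeId, e.EmployeeName, t.Title
        FROM   dbo.Employee AS e
        JOIN   @Employees   AS t
               ON t.EmployeeId = e.EmployeeId
        OPTION (RECOMPILE);  -- recompiling here lets the optimizer see the
                             -- TVP's real row count instead of guessing 1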

    There are no special teachers of virtue, because virtue is taught by the whole community.
    --Plato
