SQL Server 2008 Table Valued Parameters Performance

  • m.scott.bryant


    Points: 5

    Scott - Thanks for the article. Reaffirms some of the information I found while investigating this option for some of our performance improvements.

One other thing I discovered through my testing is that table valued parameters are not a good solution if you need to pass rows of relatively long text. In one instance we had five strings that needed to be passed into the stored procedure for each new row. Using table valued parameters to pass that data showed a huge degradation in performance compared with the alternative solutions, while passing numeric values in table valued parameters showed a huge increase in performance.

    The result of my testing showed (as always) that you need to consider all the alternatives, but if you are passing purely numeric data it is an excellent solution. As you add varchar() and nvarchar() columns, you need to analyze the performance and consider alternatives.
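The distinction above can be made concrete with two table types. These are illustrative sketches (the type and column names below are not from the article): a narrow numeric type where TVPs tend to perform well, and a wide type carrying several long strings per row, which is the case worth benchmarking against alternatives first.

```sql
-- Hypothetical narrow, numeric-only type: the favorable TVP case.
CREATE TYPE dbo.OrderIdList AS TABLE
(
    OrderID   INT            NOT NULL,
    Quantity  INT            NOT NULL,
    UnitPrice DECIMAL(10, 2) NOT NULL
);

-- Hypothetical wide type with several long string columns per row:
-- measure before committing to a TVP here.
CREATE TYPE dbo.NoteList AS TABLE
(
    OrderID INT           NOT NULL,
    Note1   NVARCHAR(MAX) NULL,
    Note2   NVARCHAR(MAX) NULL,
    Note3   NVARCHAR(MAX) NULL
);
```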

  • Florian Reischl


    Points: 37299

    Hi Scott

    m.scott.bryant (7/9/2010)

    ... rows of relatively long text...

Probably you want to have a look at the new SQL Server 2008 .WRITE clause for VARCHAR(MAX) and VARBINARY(MAX):

    Concatenation of Text and Binary Data in T-SQL
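For reference, the .WRITE clause is part of the UPDATE statement and lets you append or patch a MAX-type column in place instead of rewriting the whole value. A minimal sketch (table, column, and variable names are illustrative):

```sql
-- Append @Chunk to the end of a VARCHAR(MAX) column.
-- A NULL @Offset means "append"; the length argument is ignored in that case.
UPDATE dbo.Documents
SET Body .WRITE(@Chunk, NULL, NULL)
WHERE DocumentID = @DocumentID;
```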



  • Scott Zurolo

    SSC Enthusiast

    Points: 117

    Thank you. It is true. As the size of your data row grows, the performance boost from table valued parameters may decrease. If the data passed in contains several large fields such as varchar(Max), the single row insert approach may perform better. Thanks for pointing that out.

  • emmchild

    Ten Centuries

    Points: 1010

Great article! Phil, this technique is great when used with MERGE. Another thing to consider is the use of TVPs for optional-parameter queries.
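The TVP-plus-MERGE combination mentioned above can be sketched as a single upsert procedure. This assumes the article's PODetailTableType; the procedure name, target table, and the OrderQty column are illustrative assumptions, not from the article:

```sql
-- Sketch: upsert a batch of rows passed in as a TVP via MERGE.
CREATE PROCEDURE dbo.UpsertPODetails
    @Rows dbo.PODetailTableType READONLY   -- TVP parameters must be READONLY
AS
BEGIN
    MERGE dbo.PODetail AS target
    USING @Rows AS source
        ON target.PurchaseOrderDetailID = source.PurchaseOrderDetailID
    WHEN MATCHED THEN
        UPDATE SET target.OrderQty = source.OrderQty
    WHEN NOT MATCHED THEN
        INSERT (PurchaseOrderDetailID, OrderQty)
        VALUES (source.PurchaseOrderDetailID, source.OrderQty);
END;
```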

  • jude.pieries

    Default port

    Points: 1438

I’m sorry I don’t have a real example, but I know trouble when I see it at a distance.

Please don’t get me wrong; it’s not that I don’t like this feature, just that it will not be effective for a large table that expects 80-90% read operations under “read committed”.

Updates like this will not help replication (transactional replication) either. If I remember right, transactions may get stuck at the distributor and may even cause deadlocks at the subscriber. The more granular the transaction, the better its chance of completing without causing significant pain to the database.

Let me know if you want to take this conversation offline; I would love to share what I know.

  • Scott Zurolo

    SSC Enthusiast

    Points: 117

Thank you for your input. The primary benefit of table valued parameters in my example is eliminating the overhead of multiple calls to the database. For each call there is overhead from ADO.NET to set up the connection and configure the parameters and the procedure call. Table valued parameters reduce the repetitive creation of that ADO.NET plumbing for each call. For a true bulk insert operation, this feature may not be the optimal solution. Like most tools in the SQL Server toolbox, table valued parameters are a good fit in certain scenarios. I am using stored procedures with table valued parameters in production on a database that is part of transactional replication, and I have not seen any adverse effect on latency. It is a good thought to consider what side effects a new technology can have on something like replication. Thanks.

  • Slick84


    Points: 5602


Did you decide to skip the section where you populated the table valued parameter? In the script you create it, but I don't see where it's populated. Please let us know.




  • Scott Zurolo

    SSC Enthusiast

    Points: 117

    In the article right below the declaration of the UpdateTableViaTableValuedParameter stored procedure I populated the table valued parameter with data from a table named PODetailtest:

DECLARE @ProcTableParamExample PODetailTableType

INSERT INTO @ProcTableParamExample
SELECT TOP 750 PurchaseOrderDetailID,
FROM PODetailtest

    You may just need to scroll down to see it. I hope this helps clarify. Thanks for your interest.
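Putting the pieces together, the flow might look like the following. This is a sketch, not the article's exact script: the column list after PurchaseOrderDetailID is abbreviated (the real script selects the remaining columns of the type), and the EXEC line assumes the procedure takes the TVP as its first parameter:

```sql
-- Declare a variable of the table type defined in the article.
DECLARE @ProcTableParamExample PODetailTableType;

-- Populate it from the staging table (column list abbreviated here).
INSERT INTO @ProcTableParamExample
SELECT TOP 750 PurchaseOrderDetailID   -- plus the type's remaining columns
FROM dbo.PODetailtest;

-- Pass the whole batch to the procedure in a single call.
EXEC dbo.UpdateTableViaTableValuedParameter @ProcTableParamExample;
```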

  • Slick84


    Points: 5602


I think the inconsistently named parameters confused me.

You used @TableParamExample in one place and @ProcTableParamExample in another.

    Anyhow, thanks.




  • j nuñez


    Points: 2115

Many thanks to everybody for your contributions on this subject.

This is a problem we had with our applications and SQL Server 2000.

We are now considering the upgrade to SQL Server 2008.

We will make an extensive analysis of the performance of this kind of solution.

Again, many thanks!

  • skra


    Points: 1614


Did you run any test measuring whole-operation performance (CPU and memory usage, I/O reads, and execution time) for:

I. preparing the XML in an external application

II. executing a procedure with that XML as a parameter

versus:

I. preparing (populating with data) a DataTable on the application side

II. executing a procedure with that table as a parameter (which also includes sending the whole data table over the network connection to the server and inserting the rows into the table on the server side [II-B])

Time and resources used in step II-B may be significant.
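On the server side, the comparison described above can be instrumented with SET STATISTICS. A minimal sketch, assuming two procedures exist for the two approaches (the procedure and variable names are placeholders):

```sql
-- Capture CPU time, elapsed time, and logical reads for each approach.
SET STATISTICS TIME ON;
SET STATISTICS IO ON;

EXEC dbo.LoadViaXml @Payload = @XmlData;   -- XML-parameter approach
EXEC dbo.LoadViaTvp @Rows = @TableData;    -- TVP approach

SET STATISTICS TIME OFF;
SET STATISTICS IO OFF;
```

Note this only measures the server-side step; the client-side cost of building the XML string or DataTable, and the network transfer in step II-B, still need to be timed in the application.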

    With best regards

Viewing 11 posts - 16 through 26 (of 26 total)
