Long Parameter Lists

  • Anyone fancy writing an article comparing the different ways of handling long parameter lists for stored procedures?

    Possible ways I can think of (I'm sure there are others):

    CSV

    XML (including schema-bound)

    Table-valued parameters (SQL Server 2008 and later)

    Multiple parameters

    By long lists I mean at least hundreds. It would be interesting to see how these compare in terms of performance, maintenance, etc.
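    As a rough illustration of two of those shapes, here is a minimal client-side sketch (Python, with made-up values) of building a CSV payload and an XML payload, each of which would travel to the stored procedure as a single parameter:

```python
# Sketch with hypothetical values: two of the payload shapes the article
# would compare, assembled client-side before the proc call.
import xml.etree.ElementTree as ET

values = [17, 42, 99]  # stand-ins for hundreds of parameter values

# CSV: one NVARCHAR(MAX) parameter, split apart again on the server.
csv_payload = ",".join(str(v) for v in values)

# XML: one XML parameter; the server would shred it with .nodes()/.value().
root = ET.Element("params")
for v in values:
    ET.SubElement(root, "p").text = str(v)
xml_payload = ET.tostring(root, encoding="unicode")

print(csv_payload)   # 17,42,99
print(xml_payload)   # <params><p>17</p><p>42</p><p>99</p></params>
```

    Either way, the call itself carries one parameter instead of hundreds; the cost moves to building and parsing the payload.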

    Thanks

    ____________________________________________________

    Deja View - The strange feeling that somewhere, sometime you've optimised this query before

    How to get the best help on a forum

    http://www.sqlservercentral.com/articles/Best+Practices/61537
  • I guess my take on such a thing (and it IS just an opinion, not an argument) is that if someone is passing "at least hundreds" of parameters in such a list, there's a pretty strong chance they have a problem with their design.

    I went through such a thing two years ago with a company that really didn't know how to use T-SQL. They would take 4 inputs from the user and then generate a couple thousand rows of data to be inserted into a table. The problem was that they were generating those rows on the application side of the house and then trying to pass them to SQL Server. They first tried passing the information as individual INSERT/VALUES statements. You can just imagine how badly that choked. They then tried passing them as one big ol' delimited string and parsing it on the server side. While that certainly worked better, it still choked the pipe between the sender and the server.

    I showed them how to pass the 4 parameters that defined the data they wanted to build and then how to build that data on the server side. The whole process dropped from a 40-minute, pipe-choking, server-clogging, resource-hogging process to a server-side run of about 3 seconds.
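    The principle in that story can be sketched in a few lines (Python here purely for illustration; the names and the generation rule are made up, and on the server the expansion would be done in T-SQL, e.g. with a tally table):

```python
# Illustrative sketch: instead of shipping thousands of pre-built rows,
# ship only the few parameters that define them and expand on the
# receiving side. Function name and generation rule are hypothetical.
def expand_rows(start, stop, step, label):
    """Rebuild the full row set from the 4 defining parameters."""
    return [(label, n) for n in range(start, stop, step)]

# The client sends just 4 values...
params = (0, 2000, 1, "batchA")

# ...and the receiving side regenerates the couple thousand rows.
rows = expand_rows(*params)
print(len(rows))  # 2000
```

    The payload across the wire stays tiny no matter how many rows the parameters describe.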

    Heh... I told you that story so I could tell you this one. 🙂 Because of my extreme bias in passing such large amounts of data to a stored proc (even inside the server), I'd be ill suited to write such a thing. I will, however, be happy to tech review such an article if someone wanted to take it on.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • I'm generally with Jeff on this topic. I think you have a fundamental design problem if you have to pass HUNDREDS of parameters. I have never had a case where this made sense and I can't in good conscience recommend it.

    CEWII

  • There are cases where a lot of data has to be ingested quickly - for example, I might have some measuring equipment giving me large numbers (say tens of thousands) of readings per second, each reading being tens of 16-bit numbers. Then the sensible way to do it isn't to pass the data as parameters to a stored procedure, but to stage the readings through files and pass a filename with each call of the stored procedure - a proc call per reading would require crazy (currently unavailable) performance levels. And the data isn't produced by some known algorithm; it's empirical, so generating it inside the database from a small set of parameters (as in the cases Jeff referred to) is not possible.

    This sort of data input rate turns up regularly in radio astronomy, but until fairly recently no-one dreamed of trying to store the data in a relational database (purpose-built data-handling software, written for example in Forth, was the order of the day). I think anyone looking at SQL to do it even now is probably looking at the wrong tool (so I agree with Jeff and Elliot), but in future I hope that SQL will be able to handle this sort of challenge, and I think we should be looking at how, with current facilities, it could be approached in SQL.
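    The staging idea above can be sketched quickly (Python, with a hypothetical file layout): write a batch of 16-bit readings to a file, and hand the database only the path, e.g. for a BULK INSERT or OPENROWSET(BULK ...) step inside the proc.

```python
# Sketch of staging through files: pack a batch of unsigned 16-bit samples
# into a binary file; the stored procedure call then carries one short
# parameter (the filename) instead of the readings themselves.
import os
import struct
import tempfile

readings = [100, 200, 300, 65535]  # hypothetical 16-bit samples
path = os.path.join(tempfile.gettempdir(), "readings.bin")

with open(path, "wb") as f:
    # "<%dH" = little-endian, N unsigned 16-bit integers
    f.write(struct.pack("<%dH" % len(readings), *readings))

print(os.path.getsize(path))  # 8 bytes: four 2-byte samples
```

    One proc call per file amortizes the per-call overhead across tens of thousands of readings.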

    Tom

  • For the case specified - basically a long string of related numbers - I would probably handle them as a CSV string and use a splitter ("breaker") to pivot them into rows, or take a filename. Getting the data loaded is perhaps your up-front bottleneck, but using SQL for the analysis is probably the way to go. I still haven't seen a case where hundreds of input parameters are either a good thing or something I would consider - not even in this one. But I like it; it would be an interesting case to tackle.
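    The "breaker" step can be sketched as follows (Python for illustration; a T-SQL splitter such as one built on a tally table would return the same shape):

```python
# Minimal sketch of the splitter idea: pivot one CSV parameter into
# (position, value) rows - the shape a server-side splitter would return.
def split_csv(s):
    return [(i + 1, int(v)) for i, v in enumerate(s.split(","))]

print(split_csv("10,20,30"))  # [(1, 10), (2, 20), (3, 30)]
```

    The ordinal position matters when the readings are positional rather than independent values.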

    CEWII

  • If you had massive amounts of data like L' Eomot Inversé's example, I'd think that using .NET/ADO.NET to bulk insert would be the solution: accumulate the data locally for a while, and then bulk copy.

    Try
        ' Hypothetical loader that accumulates the readings into a DataTable.
        Dim dt As DataTable = Some_Process_That_Loads_A_Zillion_Rows()
        Dim DestinationTable As String = "InstrumentMeasurements"

        ' Using blocks ensure the connection and bulk-copy objects are
        ' closed and disposed even if WriteToServer throws.
        Using myConn As New SqlConnection(Me.ConnectionString)
            myConn.Open()

            Using myBulkCopy As New SqlBulkCopy(myConn)
                myBulkCopy.DestinationTableName = "[" & DestinationTable & "]"
                myBulkCopy.WriteToServer(dt)
            End Using
        End Using
    Catch ex As Exception
        Debug.Print(ex.Message)
    End Try

    Lowell


    --Help us help you! If you post a question, make sure you include a CREATE TABLE... statement and an INSERT INTO... statement for that table to give the volunteers here representative data. With your description of the problem, we can provide a tested, verifiable solution to your question. Asking the question the right way gets you a tested answer the fastest way possible!

  • Storing spectrometer data with 1024 channels is a problem in SQL Server unless you pivot (which has too much baggage to be useful at that scale IMO) since the storage engine limits us to 1024 columns per table. The problem can be handled quite nicely however using a SQLCLR Type while providing solid performance and type-safety. SQLCLR types open up many possibilities and help us overcome many of the limitations L' Eomot Inversé alluded to with storing scientific data (many readings per entity per time) in a relational store.
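    The SQLCLR type itself is C# compiled into the server, but the core idea - storing all 1024 channels as one typed column value instead of 1024 columns - can be sketched with a simple binary serialization (Python here; the layout is made up, not what any particular SQLCLR type uses):

```python
# Sketch of the one-value-per-reading idea: serialize 1024 16-bit channels
# into a single binary blob that would live in one column, then round-trip
# it back. Format "<1024H" = little-endian, 1024 unsigned 16-bit ints.
import struct

channels = list(range(1024))             # one spectrometer reading
blob = struct.pack("<1024H", *channels)  # 2048 bytes, one column value

restored = list(struct.unpack("<1024H", blob))
print(restored == channels)  # True
```

    A real SQLCLR type adds what the raw blob lacks: type-safety, validation, and methods the engine can call in queries.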

    There are no special teachers of virtue, because virtue is taught by the whole community.
    --Plato
