Long Parameter Lists
Posted Sunday, February 26, 2012 7:56 AM


SSCommitted


Group: General Forum Members
Last Login: Tuesday, April 8, 2014 6:13 AM
Points: 1,694, Visits: 19,550

Anyone fancy writing an article comparing the different ways of handling long parameter lists for stored procedures?

Possible ways I can think of (I'm sure there are others):

CSV
XML (including schema bound)
table valued parameters (SQL Server 2008 and later)
multiple parameters

By long lists I mean at least hundreds. It would be interesting to see how these compare in terms of performance, maintenance etc.
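As a strawman for comparison, here's roughly what the table-valued-parameter option looks like (a minimal sketch for SQL Server 2008 and later; the type, procedure, and column names are just illustrative):

-- Caller fills a table type instead of passing hundreds of scalar parameters.
CREATE TYPE dbo.IdList AS TABLE (Id INT NOT NULL PRIMARY KEY);
GO
CREATE PROCEDURE dbo.ProcessIds
    @Ids dbo.IdList READONLY   -- table-valued parameters must be READONLY
AS
    SELECT t.Id FROM @Ids AS t;
GO
DECLARE @list dbo.IdList;
INSERT INTO @list (Id) VALUES (1), (2), (3);
EXEC dbo.ProcessIds @Ids = @list;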


Thanks



____________________________________________________

How to get the best help on a forum

http://www.sqlservercentral.com/articles/Best+Practices/61537

Never approach a goat from the front, a horse from the rear, or a fool from any direction.
Post #1257942
Posted Sunday, February 26, 2012 1:08 PM


SSC-Dedicated


Group: General Forum Members
Last Login: Today @ 4:03 PM
Points: 36,983, Visits: 31,508
I guess my take on such a thing (and it IS just an opinion, not an argument) is that if someone is passing "at least hundreds" in such a list, there's a pretty strong chance that someone might have a problem with their design.

I went through such a thing two years ago with a company that really didn't know how to use T-SQL. They would take 4 inputs from the user and then generate a couple thousand rows of data to be inserted into a table. The problem was that they were generating those rows on the application side of the house and then trying to pass them to SQL Server. They first tried passing the information as individual INSERT/VALUES statements. You can just imagine how badly that choked. They then tried passing them as one big ol' delimited string and parsing them on the server side. While that certainly worked better, it still choked the pipe between the sender and the server.

I showed them how to pass the 4 parameters that defined the data they wanted to build and then how to build that data on the server side. The whole process dropped from a 40 minute, pipe-choking, server-clogging, resource-mongering process to a server-side run of about 3 seconds.
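In sketch form, that server-side approach looks something like this (the parameter names and target table are hypothetical; the real proc obviously did more):

-- Pass only the few parameters that define the data, then generate the
-- rows set-based on the server with a "tally" (numbers) technique.
CREATE PROCEDURE dbo.BuildRows
    @StartValue INT, @RowCount INT, @Step INT, @Label VARCHAR(20)
AS
WITH E1(n) AS (SELECT 1 FROM (VALUES (1),(1),(1),(1),(1),(1),(1),(1),(1),(1)) v(n)),
     Tally(n) AS (SELECT TOP (@RowCount)
                         ROW_NUMBER() OVER (ORDER BY (SELECT NULL))
                  FROM E1 a CROSS JOIN E1 b CROSS JOIN E1 c CROSS JOIN E1 d) -- up to 10,000 rows
INSERT INTO dbo.TargetTable (Value, Label)
SELECT @StartValue + (n - 1) * @Step, @Label
FROM Tally;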

Heh... I told you that story so I could tell you this one. Because of my extreme bias in passing such large amounts of data to a stored proc (even inside the server), I'd be ill suited to write such a thing. I will, however, be happy to tech review such an article if someone wanted to take it on.


--Jeff Moden
"RBAR" is pronounced "ree-bar" and is a "Modenism" for "Row-By-Agonizing-Row".

First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column."

(play on words) "Just because you CAN do something in T-SQL, doesn't mean you SHOULDN'T." --22 Aug 2013

Helpful Links:
How to post code problems
How to post performance problems
Post #1257958
Posted Sunday, February 26, 2012 8:24 PM


SSCertifiable


Group: General Forum Members
Last Login: Thursday, May 15, 2014 5:11 PM
Points: 6,067, Visits: 5,283
I'm generally with Jeff on this topic. I think you have a fundamental design problem if you have to pass HUNDREDS of parameters. I have never had a case where this made sense and I can't in good conscience recommend it.

CEWII
Post #1257990
Posted Monday, February 27, 2012 6:19 AM


SSCrazy Eights


Group: General Forum Members
Last Login: Today @ 11:26 AM
Points: 8,718, Visits: 9,266
There are cases where a lot of data has to be got in quickly. For example, I might have some measuring equipment that is giving me large numbers (say 10s of 1000s) of readings (each reading is 10s of 16-bit numbers) per second. Then the sensible way to do it isn't to pass the data as parameters to a stored procedure, but to stage them through files and pass a filename for each call of the stored procedure: a proc call per reading would require crazy (currently unavailable) performance levels. And the data isn't generated by some known algorithm, it's empirical, so generating it inside the database from a small subset (as in the cases Jeff referred to) is not possible.
These sorts of data input rates turn up regularly in radio astronomy, but until fairly recently no-one dreamed of trying to store the data in a relational database (purpose-built data handling software, written for example in Forth, was the order of the day). I think anyone looking at SQL to do it even now is probably looking at the wrong tool (so I agree with Jeff and Elliot), but in future I hope that SQL will be able to handle this sort of challenge, and I think we should be looking at how, with current facilities, it could be approached in SQL.
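In sketch form, staging through files and passing a filename per call might look like this (the table name and file format are assumptions; BULK INSERT won't take a variable for the path, hence the dynamic SQL):

CREATE PROCEDURE dbo.LoadReadings
    @FileName NVARCHAR(260)
AS
BEGIN
    -- Double up any embedded quotes before splicing the path into the statement.
    DECLARE @sql NVARCHAR(MAX) =
        N'BULK INSERT dbo.Readings FROM ''' + REPLACE(@FileName, N'''', N'''''') + N'''
          WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'', TABLOCK);';
    EXEC (@sql);
END;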


Tom
Post #1258190
Posted Monday, February 27, 2012 8:22 AM


SSCertifiable


Group: General Forum Members
Last Login: Thursday, May 15, 2014 5:11 PM
Points: 6,067, Visits: 5,283
For the case specified (basically a long string of related numbers) I would probably handle them as a CSV string and use a splitter to pivot them into rows, or take a filename. Getting the data loaded is perhaps your up-front bottleneck, but using SQL for the analysis is probably the way to go. I still haven't seen a case where hundreds of input parameters are either a good thing or something I would consider, not even in this one. But I like it; it would be an interesting case to tackle.
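For illustration, one splitter approach (an XML-based split; names are illustrative, and it assumes the values contain nothing that needs XML escaping):

DECLARE @csv VARCHAR(MAX) = '101,102,103,104';
-- Turn the delimiters into XML element boundaries, then shred to rows.
DECLARE @x XML = CAST('<r>' + REPLACE(@csv, ',', '</r><r>') + '</r>' AS XML);
SELECT reading = t.c.value('.', 'INT')
FROM @x.nodes('/r') AS t(c);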

CEWII
Post #1258280
Posted Monday, February 27, 2012 8:36 AM


SSChampion


Group: General Forum Members
Last Login: Today @ 5:28 AM
Points: 12,910, Visits: 32,013
If you had massive amounts of data like L' Eomot Inversé's example, I'd think that using .NET or ADO to bulk insert would be the solution:
accumulate the data locally for a while, and then bulk copy.

' Requires Imports System.Data and Imports System.Data.SqlClient
Try
    Dim destinationTable As String = "InstrumentMeasurements"
    ' Some_Process_That_Loads_A_Zillion_Rows stands in for however the
    ' readings get accumulated into a DataTable locally.
    Dim dt As DataTable = Some_Process_That_Loads_A_Zillion_Rows()
    ' Using blocks make sure the connection and bulk copy are disposed.
    Using myConn As New SqlConnection(Me.ConnectionString)
        myConn.Open()
        Using myBulkCopy As New SqlBulkCopy(myConn)
            myBulkCopy.DestinationTableName = "[" & destinationTable & "]"
            myBulkCopy.WriteToServer(dt)
        End Using
    End Using
Catch ex As Exception
    Debug.Print(ex.Message)
End Try



Lowell

--There is no spoon, and there's no default ORDER BY in sql server either.
Actually, Common Sense is so rare, it should be considered a Superpower. --my son
Post #1258287
Posted Tuesday, February 28, 2012 12:02 PM


SSCertifiable


Group: General Forum Members
Last Login: Saturday, August 23, 2014 8:34 AM
Points: 7,097, Visits: 12,601
Storing spectrometer data with 1024 channels is a problem in SQL Server unless you pivot (which has too much baggage to be useful at that scale, IMO), since the storage engine limits us to 1024 columns per table. The problem can, however, be handled quite nicely using a SQLCLR type, while providing solid performance and type safety. SQLCLR types open up many possibilities and help us overcome many of the limitations L' Eomot Inversé alluded to with storing scientific data (many readings per entity per time) in a relational store.

__________________________________________________________________________________________________
There are no special teachers of virtue, because virtue is taught by the whole community. --Plato
Post #1259084