SQLServerCentral is supported by Red Gate Software Ltd.
 
The Zero to N Parameter Problem (Sql Server 2005 and Up -- Update)
Posted Tuesday, December 10, 2013 12:03 AM


SSC-Enthusiastic


Group: General Forum Members
Last Login: Friday, April 4, 2014 7:31 AM
Points: 117, Visits: 176
Comments posted to this topic are about the item The Zero to N Parameter Problem (Sql Server 2005 and Up -- Update)


Post #1521404
Posted Tuesday, December 10, 2013 10:04 AM
SSC-Enthusiastic


Group: General Forum Members
Last Login: Yesterday @ 2:46 PM
Points: 165, Visits: 510
Interesting idea to use XML as a means of passing the parameters; it could certainly be very practical.

I am interested in the blanket statement that 'dynamic SQL is a bad idea'. Yes, if handled improperly there is the possibility of a SQL injection attack; however, this is mitigated by using a parameter list. The biggest obstacle I have found depends on the number of applicable predicate conditions. If the list is short, say ten conditions or fewer, then code maintenance isn't that bad. As the list grows, I will admit that it becomes a hassle to maintain.

Erland Sommarskog provides probably the best documentation on how to handle dynamic search conditions. He presents multiple solutions and evaluates the pros and cons of each:

http://www.sommarskog.se/dyn-search.html

Then choose the version of the document that matches the version of SQL Server you are running.
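As a sketch of the mitigation mentioned above (the table, column, and parameter names are illustrative, not from the article): when dynamic SQL is executed through sp_executesql with a parameter list, the user-supplied values never become part of the SQL string, so they cannot inject code.

```sql
-- Hypothetical dynamic search; only the predicate text varies,
-- while values always travel as parameters.
DECLARE @sql nvarchar(max);
DECLARE @CustomerID nchar(5);
DECLARE @FromDate datetime;

SET @CustomerID = N'VINET';
SET @FromDate   = '19970101';

SET @sql = N'SELECT OrderID, CustomerID, OrderDate FROM dbo.Orders WHERE 1 = 1';

IF @CustomerID IS NOT NULL
    SET @sql = @sql + N' AND CustomerID = @CustomerID';
IF @FromDate IS NOT NULL
    SET @sql = @sql + N' AND OrderDate >= @FromDate';

EXEC sys.sp_executesql
     @sql,
     N'@CustomerID nchar(5), @FromDate datetime',
     @CustomerID = @CustomerID,
     @FromDate   = @FromDate;
```

Maintenance cost still grows with the number of predicates, as noted above, but injection is off the table.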
Post #1521590
Posted Tuesday, December 10, 2013 10:54 AM


SSC-Enthusiastic


Group: General Forum Members
Last Login: Friday, April 4, 2014 7:31 AM
Points: 117, Visits: 176
My comment "dynamic SQL isn't a good idea" is directly related to injection attacks, as you state.

What I like about this solution is that everything (client side with a strongly typed dataset, and this procedure on the server side) is "strongly typed", and there isn't a lot of string concatenation.

Client side, I would write something like this (typed-dataset code, so the row type is a class nested in the dataset):

ParameterDS ds = new ParameterDS();

ParameterDS.ScalarRow srow = ds.Scalar.NewScalarRow();
srow.FromDate = DateTime.Now;
srow.ToDate = DateTime.Now;

ds.Scalar.AddScalarRow(srow);

something like that.

So I get strong typing.
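Server side, the XML produced from such a dataset can be shredded with the nodes() and value() methods. This is only a sketch; the element names mirror the hypothetical client code above, not the article's actual schema:

```sql
DECLARE @parameters xml;
SET @parameters = N'
<ParameterDS>
  <Scalar>
    <FromDate>2013-12-10T00:00:00</FromDate>
    <ToDate>2013-12-10T23:59:59</ToDate>
  </Scalar>
</ParameterDS>';

-- One row per Scalar element, with strongly typed columns.
SELECT  s.value('(FromDate/text())[1]', 'datetime') AS FromDate,
        s.value('(ToDate/text())[1]',   'datetime') AS ToDate
FROM    @parameters.nodes('/ParameterDS/Scalar') AS p(s);
```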

It's not the only way to skin a cat, but it's my preferred and still serves me well after 12 or so years.

Thanks for the feedback.



Post #1521601
Posted Tuesday, December 10, 2013 10:55 AM


SSC-Enthusiastic


Group: General Forum Members
Last Login: Friday, April 4, 2014 7:31 AM
Points: 117, Visits: 176
Note: this article is an update of a previous (pre-SQL Server 2005) version, here:

http://www.sqlservercentral.com/articles/Stored+Procedures/thezerotonparameterproblem/2283/




Post #1521604
Posted Monday, December 16, 2013 1:41 AM
Forum Newbie


Group: General Forum Members
Last Login: Wednesday, March 26, 2014 9:07 AM
Points: 1, Visits: 14
This is very nice,
but there's a better solution for the same issue.

http://www.madeira.co.il/fully-dynamic-search-conditions-vs-sql-injection
Post #1523119
Posted Monday, December 16, 2013 3:02 AM
SSC Rookie


Group: General Forum Members
Last Login: Yesterday @ 7:49 AM
Points: 43, Visits: 480
Thanks for the article.

Consider using strongly typed XML and discarding fuzzy checks like DATALENGTH > 10.

Your SP's input parameter might then be something like @parameters XML(DOCUMENT dbo.MyXSD):

http://technet.microsoft.com/en-us/library/ms184277.aspx
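A sketch of the typed-XML idea (the schema and element names are invented for illustration, not taken from the article): once the parameter is bound to an XML schema collection, SQL Server rejects malformed documents before the procedure body ever runs, so the fuzzy length checks disappear.

```sql
CREATE XML SCHEMA COLLECTION dbo.MyXSD AS N'
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="Parameters">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="FromDate" type="xs:dateTime" minOccurs="0"/>
        <xs:element name="ToDate"   type="xs:dateTime" minOccurs="0"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>';
GO
CREATE PROCEDURE dbo.SearchByDateRange
    @parameters xml(DOCUMENT dbo.MyXSD)   -- invalid XML is rejected here
AS
-- Note: typed XML disallows text() in paths, so the elements are
-- addressed directly.
SELECT  @parameters.value('(/Parameters/FromDate)[1]', 'datetime') AS FromDate,
        @parameters.value('(/Parameters/ToDate)[1]',   'datetime') AS ToDate;
GO
```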

You might also consider user-defined table type (UDTT) parameters passed as table-valued parameters (TVPs), if your calling code is already manipulating datasets.

http://technet.microsoft.com/en-us/library/bb510489(v=sql.105).aspx
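A sketch of the TVP alternative (SQL Server 2008 and up; the type and procedure names are invented for the example): the parameters arrive as a real table, so there is no XML to shred at all.

```sql
CREATE TYPE dbo.ParameterList AS TABLE
(
    Name  sysname       NOT NULL PRIMARY KEY,
    Value nvarchar(max) NULL
);
GO
CREATE PROCEDURE dbo.SearchWithTVP
    @parameters dbo.ParameterList READONLY
AS
SELECT Name, Value FROM @parameters;
GO

-- Caller fills the table and passes it:
DECLARE @p dbo.ParameterList;
INSERT @p (Name, Value) VALUES (N'FromDate', N'2013-12-10');
EXEC dbo.SearchWithTVP @parameters = @p;
```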

Post #1523133
Posted Monday, December 16, 2013 10:01 AM
SSC-Enthusiastic


Group: General Forum Members
Last Login: Thursday, September 25, 2014 8:49 AM
Points: 147, Visits: 196
I always get a warm and fuzzy feeling when I see XML and T-SQL, so kudos for your solution. Since you are using XML, you can also pass in multi-hierarchy documents and/or document fragments, which gives you even more power. Please note the table-valued parameter comment someone made; even the XSD idea can probably be perceived as a superior alternative.

I do wonder, though, whether the sproc scales, given that the same parameters may not always come in. I ask because I recently came across a situation where a sproc had overloaded input parameters -- not every call set all of them -- and this caused the execution plan to be somewhat less than optimal.

Good article and thanks for sharing!
Post #1523330
Posted Monday, December 16, 2013 10:13 AM
SSCrazy


Group: General Forum Members
Last Login: Monday, July 28, 2014 7:12 PM
Points: 2,148, Visits: 487
Not a bad tool to have when flexibility trumps all other design objectives. Thank you for taking the time to contribute.
Post #1523335
Posted Monday, December 16, 2013 11:12 AM


SSC-Dedicated


Group: General Forum Members
Last Login: Today @ 8:12 AM
Points: 35,770, Visits: 32,434
I appreciate every article because it is an exchange of ideas and it does take an author time to put together. I would like to suggest a couple of improvements, though. Some of the suggestions have already been mentioned but I've included my thoughts on them in the following for completeness. I hope you take them as helpful rather than any form of personal attack.

The NorthWind database is an SQL Server 2000 sample database. Since the title of your article clearly states "Sql {sic}Server 2005 and Up --Update", I strongly recommend that you modify your examples to work with the newer sample databases available as of SQL Server 2005.

In that same vein, I also recommend that you update the comments in the code because "Query Analyzer" does not exist in "SQL Server 2005 and Up".

The code is set up to shred some XML, but the parameters in the example usage of the stored procedure are not in XML form in the article. That may be more the fault of the tool used to submit the article, but a posted correction on your part would be mighty handy, because those examples would show the form the parameters must take for the stored procedure to work.

As someone else mentioned, I strongly oppose making blanket statements such as the following from the article.
A comma delimited list just does not seem to cut it anymore, and dynamic SQL isn't a good idea.


If you want to see just how good an idea, and how injection-proof, dynamic SQL can actually be, please see the following article.
http://sqlinthewild.co.za/index.php/2009/03/19/catch-all-queries/
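The pattern from that article, in sketch form (object names are illustrative): a single static query whose predicates switch themselves off when their parameter is NULL, paired with OPTION (RECOMPILE) so the optimizer builds a plan for the actual values supplied.

```sql
CREATE PROCEDURE dbo.SearchOrdersCatchAll
    @CustomerID nchar(5) = NULL,
    @FromDate   datetime = NULL
AS
SELECT  OrderID, CustomerID, OrderDate
FROM    dbo.Orders
WHERE   (@CustomerID IS NULL OR CustomerID = @CustomerID)
  AND   (@FromDate   IS NULL OR OrderDate >= @FromDate)
OPTION (RECOMPILE);   -- plan reflects the current parameter values
GO
```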

Then, to follow up such blanket statements with the following...

This method is not a "cure all" method. There is a performance penalty for using this method. If you have a procedure where performance is the main goal, and you have a lot of records to go through, this solution may not be the answer.
...{snip}...
Good luck. The method is useful at times. Please be aware of possible scalability issues, since there is a performance hit by making the procedure so flexible.
...{snip}...
Please heed my warning about scalability, and do not consider this a cure-all.


... seems to indicate that maybe the methods you summarily dismissed in blanket fashion might not be so bad after all. You also didn't mention that passing table-valued parameters is a highly effective method that avoids the pain of shredding XML, nor what else one might use if the code does, in fact, produce scalability issues.

The following statement is also a bit misleading to the uninitiated who may be looking for a solution to the "0 to N" parameter problem...


...or said another way, I want to specify N parameters, where N is 1 through infinity.


Although the XML data type can certainly accommodate a rather large number of parameters, it's limited to 2 GB, and "infinity" isn't actually possible. Although I thought the related application was poorly designed, I have had to accommodate passing more than 2 GB of parameters from an application, and a single XML variable won't cut it despite the claim of "infinity". Further, passing a table-valued parameter might, in fact, allow one to exceed that 2 GB limit (as crazy as that may be).

You also wrote…

I learned this method from a previous colleague. The example is an original one coded against the Northwind database.


To add some additional credibility to your article, I recommend that you review the code and apply some common best practices: use the two-part naming convention, use table aliases on all column references where a statement touches more than one table, use consistent casing, avoid multiple DECLAREs on one line, and maybe even reduce line lengths so readers don't have to scroll right to see the end of a line.

You might also want to "modernize" some of the techniques. For example, doing a SELECT COUNT(*) to figure out the row counts of the table variables that you just INSERTed into could be replaced by the much more efficient SELECT @CounterVariable = @@ROWCOUNT.
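For instance (the table and variable names are invented for the sketch), @@ROWCOUNT captures the count from the INSERT itself, avoiding a second pass over the table variable:

```sql
DECLARE @Scalar TABLE (FromDate datetime, ToDate datetime);
DECLARE @CounterVariable int;

INSERT @Scalar (FromDate, ToDate)
VALUES (GETDATE(), GETDATE());

-- No extra scan: @@ROWCOUNT still reflects the INSERT above.
SELECT @CounterVariable = @@ROWCOUNT;
```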

And consider getting rid of all the empty "/* */" in the code. At best they're annoying, and proper use of white space would make the code more pleasurable to read.

This article could be a fantastic article but it needs some tweaking.


--Jeff Moden
"RBAR is pronounced "ree-bar" and is a "Modenism" for "Row-By-Agonizing-Row".

First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column."

(play on words) "Just because you CAN do something in T-SQL, doesn't mean you SHOULDN'T." --22 Aug 2013

Helpful Links:
How to post code problems
How to post performance problems
Post #1523368
Posted Monday, December 16, 2013 11:15 AM


SSC-Enthusiastic


Group: General Forum Members
Last Login: Friday, April 4, 2014 7:31 AM
Points: 117, Visits: 176
gary.strange-sqlconsumer (12/16/2013)
Thanks for the article.

Consider using strongly typed XML and discarding fuzzy checks like DATALENGTH > 10.

Your SP's input parameter might then be something like @parameters XML(DOCUMENT dbo.MyXSD):

http://technet.microsoft.com/en-us/library/ms184277.aspx

You might also consider user-defined table type (UDTT) parameters passed as table-valued parameters (TVPs), if your calling code is already manipulating datasets.

http://technet.microsoft.com/en-us/library/bb510489(v=sql.105).aspx



If, from call to call, the parameters are more often different than the same, then consider the

OPTION (RECOMPILE)

hint.



Post #1523371