Parsing Parameters in a Stored Procedure
Posted Friday, November 2, 2012 7:56 AM


SSCommitted


Group: General Forum Members
Last Login: 2 days ago @ 9:00 AM
Points: 1,595, Visits: 4,585
CELKO (8/12/2010)
jhood (8/10/2010)
I see what you are saying Jeff, but would you want to do validation if the application did not allow the end user to send invalid data? Perhaps using a drop-down list...


LOL! Why of course nobody would ever send garbage to a database! All users are perfect and the DB programmer should never have to worry about data integrity in his schema. Want to get rid of all constraints while we are at it?

I did an article on using a long parameter list instead of trying to write a home-made parser in T-SQL or contaminating the schema with XML and CLR code. I am still working on Part II with some examples of how the home-made parsers produce different results than the T-SQL compiler parameter parser.

So true. The user (client, customer, or whatever) may always be right when one is in the business of marketing, website design or bussing tables at Waffle House. However, from a database engineer's perspective, what gets stored in a database on the back end should be objective, not subjective.

Joe, it's been a couple of years since your comment above about XML contaminating the schema or the Transact-SQL processing of a database. Has that stance softened at all, at least in terms of using XML for passing input parameters?
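For the sake of discussion, here is a minimal sketch of the kind of XML input parameter I mean (the element names and types are made up purely for illustration):

-- Hypothetical example: pass a list of state codes as a single XML parameter
-- and shred it into rows with nodes()/value()
DECLARE @states XML = N'<states><s>AZ</s><s>ME</s><s>NY</s></states>';

SELECT s.node.value('.', 'CHAR(2)') AS Abbreviation
FROM @states.nodes('/states/s') AS s(node);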
Post #1380380
Posted Friday, November 2, 2012 7:56 AM
Forum Newbie


Group: General Forum Members
Last Login: Wednesday, November 27, 2013 9:55 PM
Points: 1, Visits: 6
This seems fraught with error, as well as inefficient.

What if you're doing high volume transaction processing? How will you keep one set of parameters separate from others? Another column with a GUID? Ask the calling code to pass in a unique identifier of its own (user ID or something)?

Is each instance going to create a new temporary table and do its own inserts, read them right back, and drop the table?

How can this possibly be more efficient than simply processing the values in a loop and using no table (temporary or otherwise) at all?

Also, you're going to need validation somewhere, like stripping leading and/or trailing spaces and making sure there are no missing parameters (AZ,,ME), which would be easier to catch in a "for each parameter" loop rather than a "for each row" loop.
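For example, here is a rough sketch of the kind of per-item validation I mean, assuming the split output has already landed in a table variable (the names and values are only illustrative):

-- Illustrative only: trim each item and flag empty entries such as the middle one in 'AZ,,ME'
DECLARE @items TABLE (ItemNumber INT, Item VARCHAR(10));
INSERT INTO @items VALUES (1, 'AZ'), (2, ''), (3, ' ME');

SELECT ItemNumber,
LTRIM(RTRIM(Item)) AS CleanItem,
CASE WHEN LTRIM(RTRIM(Item)) = '' THEN 1 ELSE 0 END AS IsMissing
FROM @items;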

Do you recommend making a solution "pure SQL" simply because procedural code "has cooties" or something?
Post #1380381
Posted Friday, November 2, 2012 8:34 AM


SSCommitted


Group: General Forum Members
Last Login: 2 days ago @ 9:00 AM
Points: 1,595, Visits: 4,585
Here is what I've done in the past for supplying complex multi-valued parameters to a reporting stored procedure.

The user specifies various options for ad-hoc reporting into a form. The application inserts a row into a REPORTRUN table (one row for each report run) and into related tables like REPORTRUN_RULE, which contains one row for each filtering option the user chose from a dropdown listbox. What gets passed to the stored procedure that queries the resultset for the report is a single @reportrun_id parameter. I also keep things like the start time, end time, and even the actual SQL (which is generated dynamically because it's an ad-hoc report) in REPORTRUN columns. From an operational perspective, that gives me a historical record of useful stuff like how a report arrived at its result, utilization, and runtime duration.
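A rough sketch of the shape of that design, with the names and columns trimmed down for illustration (the real tables carry more detail):

-- Illustrative sketch only; column list simplified
CREATE TABLE REPORTRUN (reportrun_id INT IDENTITY(1,1) PRIMARY KEY, report_name VARCHAR(100), start_time DATETIME, end_time DATETIME, generated_sql NVARCHAR(MAX));
CREATE TABLE REPORTRUN_RULE (reportrun_id INT NOT NULL, rule_name VARCHAR(50), rule_value VARCHAR(100));
GO

-- The reporting procedure receives only the run id and joins back to the saved rules
CREATE PROCEDURE usp_RunReport (@reportrun_id INT)
AS
BEGIN
SELECT rr.rule_name, rr.rule_value
FROM REPORTRUN_RULE rr
WHERE rr.reportrun_id = @reportrun_id;
END
GO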
Post #1380409
Posted Friday, November 2, 2012 8:55 AM
Forum Newbie


Group: General Forum Members
Last Login: Monday, March 31, 2014 3:37 PM
Points: 3, Visits: 55
There is another article from 2008 that covers the same scenario: Multivalue Parameters With SQL SP

In addition to the two methods posted here (comma-separated text and a SQL Server table), the other article also covers bitwise numbers and XML.
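For anyone unfamiliar with the bitwise-number approach, a tiny illustrative sketch (the flag values here are made up):

-- Illustrative only: each option is a power of two and the caller ORs together the ones it wants
DECLARE @IncludeAZ INT = 1, @IncludeME INT = 2, @IncludeNY INT = 4;
DECLARE @options INT = @IncludeAZ | @IncludeNY;  -- caller asked for AZ and NY

IF @options & @IncludeAZ <> 0 PRINT 'AZ requested';
IF @options & @IncludeME <> 0 PRINT 'ME requested';
IF @options & @IncludeNY <> 0 PRINT 'NY requested';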


Post #1380421
Posted Friday, November 2, 2012 9:03 AM
Grasshopper


Group: General Forum Members
Last Login: Monday, July 7, 2014 7:48 AM
Points: 12, Visits: 71
...
Post #1380427
Posted Friday, November 2, 2012 9:35 AM


SSC-Dedicated


Group: General Forum Members
Last Login: Today @ 1:55 PM
Points: 36,749, Visits: 31,197
Peter-359208 (11/2/2012)
There is another article from 2008 that covers the same scenario: Multivalue Parameters With SQL SP

In addition to the two methods posted here (comma-separated text and a SQL Server table), the other article also covers bitwise numbers and XML.




I have to warn you that fnTextToTable is a multi-line table-valued function with a WHILE loop (it's a "nibbler" splitter). That's going to be relatively slow (three times as slow, in fact). Please see the following article for a much faster 8K splitter.
http://www.sqlservercentral.com/articles/Tally+Table/72993/
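The rough idea is an inline table-valued function driven by a tally (numbers) table instead of a WHILE loop, so the whole split happens as a single set-based query. Here is a much-simplified sketch of the technique; it is NOT the article's actual DelimitedSplit8K code, which is considerably more tuned:

-- Simplified illustration of a tally-based splitter; see the article above for the real thing
CREATE FUNCTION dbo.SimpleTallySplit (@string VARCHAR(8000), @delim CHAR(1))
RETURNS TABLE
AS
RETURN
WITH E1(n) AS (SELECT 1 FROM (VALUES (1),(1),(1),(1),(1),(1),(1),(1),(1),(1)) t(n)),
Tally(n) AS (SELECT TOP (ISNULL(DATALENGTH(@string), 0)) ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) FROM E1 a, E1 b, E1 c, E1 d)
SELECT ItemNumber = ROW_NUMBER() OVER (ORDER BY n),
Item = SUBSTRING(@string, n, ISNULL(NULLIF(CHARINDEX(@delim, @string, n), 0), DATALENGTH(@string) + 1) - n)
FROM Tally
WHERE n = 1 OR SUBSTRING(@string, n - 1, 1) = @delim;
GO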




--Jeff Moden
"RBAR is pronounced "ree-bar" and is a "Modenism" for "Row-By-Agonizing-Row".

First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column."

(play on words) "Just because you CAN do something in T-SQL, doesn't mean you SHOULDN'T." --22 Aug 2013

Helpful Links:
How to post code problems
How to post performance problems
Post #1380456
Posted Friday, November 2, 2012 1:25 PM
SSC-Enthusiastic


Group: General Forum Members
Last Login: Wednesday, March 26, 2014 7:13 AM
Points: 108, Visits: 192
Here is a function to do the same thing:
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
-- Multi-statement table-valued function that splits @string on @separator using a WHILE loop
CREATE FUNCTION [dbo].[fnParseStringTSQL]
(@string NVARCHAR(MAX), @separator NCHAR(1))
RETURNS @parsedString TABLE (string NVARCHAR(MAX))
AS
BEGIN
DECLARE @position INT
SET @position = 1
-- Append a trailing separator so the final element is picked up by the loop
SET @string = @string + @separator
WHILE CHARINDEX(@separator, @string, @position) <> 0
BEGIN
-- Capture everything from the current position up to (but not including) the next separator
INSERT INTO @parsedString
SELECT SUBSTRING(@string, @position, CHARINDEX(@separator, @string, @position) - @position)
-- Advance the scan position to just past the separator that was found
SET @position = CHARINDEX(@separator, @string, @position) + 1
END
RETURN
END
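For reference, a call to it looks like this (values are only illustrative):

-- Illustrative usage
SELECT string FROM dbo.fnParseStringTSQL(N'AZ,ME,NY', N',');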
Post #1380572
Posted Friday, November 2, 2012 2:16 PM
SSC-Addicted


Group: General Forum Members
Last Login: Sunday, September 29, 2013 1:24 AM
Points: 429, Visits: 1,721
If anyone still cares...

I used Jeff's test data (#PostalArea and #CsvTest) and did the split (output = 1,000,000 rows) using the DelimitedSplit8K function and an XML splitter function. The XML method took 2 mins 23 secs whereas the DelimitedSplit8K method took just 19 secs. This was on my local machine, which is not a speed demon.

Total Execution Time stats in ms (avg of 5 runs)

Rows       XML splitter   DelimitedSplit8K
1,000             218.4               58.8
10,000           1299.6              131.0
100,000          9140.3             1116.8

Of course, these times will vary from machine to machine.


/* this query took 2 mins 23 secs to run for 1,000,000 rows */

IF OBJECT_ID('tempdb..#CsvSplit2') IS NOT NULL
DROP TABLE #CsvSplit2

SELECT csv.RowNum, split.[ID] AS ItemNumber, split.[Value] AS Abbreviation
INTO #CsvSplit2
FROM #CsvTest csv
CROSS APPLY
(
SELECT [ID],[Value]
FROM dbo.tvfParseDelimitedString(csv.CsvParameter,',')
WHERE [ID] > 0
) split
WHERE
csv.RowNum > 0

SELECT * FROM #CsvSplit2



/* this query took 19 secs to run for 1,000,000 rows */

IF OBJECT_ID('tempdb..#CsvSplit3') IS NOT NULL
DROP TABLE #CsvSplit3

SELECT csv.RowNum, split.ItemNumber, split.Item AS Abbreviation
INTO #CsvSplit3
FROM #CsvTest csv
CROSS APPLY
(
SELECT ItemNumber, Item
FROM dbo.tvfDelimitedSplit(csv.CsvParameter,',')
) split

SELECT * FROM #CsvSplit3
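
For anyone wondering what an XML-based splitter generally looks like, here is a simplified sketch of the usual CAST/REPLACE pattern; it is not the actual tvfParseDelimitedString used in the test above:

-- Simplified sketch of the common XML-splitter pattern (not the function tested above);
-- note this breaks if the input contains XML special characters such as & or <
CREATE FUNCTION dbo.SimpleXmlSplit (@string VARCHAR(8000), @delim CHAR(1))
RETURNS TABLE
AS
RETURN
SELECT ItemNumber = ROW_NUMBER() OVER (ORDER BY (SELECT NULL)),
Item = x.node.value('.', 'VARCHAR(8000)')
FROM (SELECT CAST('<i>' + REPLACE(@string, @delim, '</i><i>') + '</i>' AS XML) AS doc) d
CROSS APPLY d.doc.nodes('/i') AS x(node);
GO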


Post #1380586
Posted Friday, November 2, 2012 3:18 PM


SSC-Dedicated


Group: General Forum Members
Last Login: Today @ 1:55 PM
Points: 36,749, Visits: 31,197
HextallFanForLife (11/2/2012)
Here is a function to do the same thing:
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
Create FUNCTION [dbo].[fnParseStringTSQL]
(@string NVARCHAR(MAX),@separator NCHAR(1))
RETURNS @parsedString TABLE (string NVARCHAR(MAX))
AS
BEGIN
DECLARE @position int
SET @position = 1
SET @string = @string + @separator
WHILE charindex(@separator,@string,@position) <> 0
BEGIN
INSERT into @parsedString
SELECT substring(@string, @position, charindex(@separator,@string,@position) - @position)
SET @position = charindex(@separator,@string,@position) + 1
END
RETURN
END


If it has a WHILE loop in it, it's going to be slow.
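
A quick, rough way to see the difference on your own box (illustrative only; the numbers will vary):

-- Rough timing harness for the WHILE-loop splitter posted above
DECLARE @csv NVARCHAR(MAX) = N'AZ' + REPLICATE(CAST(N',ME' AS NVARCHAR(MAX)), 5000);

SET STATISTICS TIME ON;
SELECT COUNT(*) FROM dbo.fnParseStringTSQL(@csv, N',');
SET STATISTICS TIME OFF;

Run the same harness against a set-based splitter and compare the CPU and elapsed times reported in the Messages tab.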


--Jeff Moden
"RBAR is pronounced "ree-bar" and is a "Modenism" for "Row-By-Agonizing-Row".

First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column."

(play on words) "Just because you CAN do something in T-SQL, doesn't mean you SHOULDN'T." --22 Aug 2013

Helpful Links:
How to post code problems
How to post performance problems
Post #1380602
Posted Friday, November 2, 2012 3:22 PM


SSC-Dedicated


Group: General Forum Members
Last Login: Today @ 1:55 PM
Points: 36,749, Visits: 31,197
peter.row (11/2/2012)
Hey Diego, nice article. I understand what you meant about the implied performance concerns, and I found Jeff to be very abrasive, virtually to the point of being offensive.


It would be interesting to know why you actually thought that. Care to explain further? Seriously... I don't like to offend folks but I do also like to get points across about not using RBAR to do such things.


--Jeff Moden
"RBAR is pronounced "ree-bar" and is a "Modenism" for "Row-By-Agonizing-Row".

First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column."

(play on words) "Just because you CAN do something in T-SQL, doesn't mean you SHOULDN'T." --22 Aug 2013

Helpful Links:
How to post code problems
How to post performance problems
Post #1380604