SQLServerCentral is supported by Red Gate Software Ltd.
Parsing Parameters in a Stored Procedure
Posted Monday, November 5, 2012 8:16 AM
Just when we're trying to find a way to improve our data import, this article comes along. Perhaps it is a godsend.
While your article does not answer all of our questions, it may provide a jumping-off point toward a solution -- once we put our heads to our challenge. Our current method ties up the computer for an hour and a half bringing in the data, and we'd like something faster.

Here is our challenge:
We receive multiple tables of data in a pipe-delimited text file.
Each table begins with 2 rows that identify the columns (one row with a column "name", the second row with a "field-code" for the column).
Thereafter come the data rows until we encounter the next TABLE headers.
The method in this article may pave the way to (1) parse the table headers to build temporary tables, and then (2) insert the data rows into those temporary tables. It may also lead us to a way to extract selected columns for insertion into the final data tables.
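As a rough sketch of step (1), a parsed header row could drive a dynamic CREATE TABLE. Everything here is illustrative -- the header line, the table name ##ImportTable, and the VARCHAR(255) column type are assumptions, not anything from the article:

```sql
-- Illustrative only: build a temp table from one pipe-delimited header row.
-- @HeaderLine would come from the staged file; all names are hypothetical.
DECLARE @HeaderLine VARCHAR(8000) = 'CustomerID|CustomerName|Region';

DECLARE @Sql NVARCHAR(MAX) =
      'CREATE TABLE ##ImportTable ('
    + REPLACE(@HeaderLine, '|', ' VARCHAR(255), ')  -- each pipe becomes ", "
    + ' VARCHAR(255));';

EXEC (@Sql);
-- Produces: CREATE TABLE ##ImportTable (CustomerID VARCHAR(255),
--           CustomerName VARCHAR(255), Region VARCHAR(255));
```

Step (2) would then insert the subsequent data rows into ##ImportTable, splitting each line on the same delimiter.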

Thanks, again, for this timely article (at least for us it's timely).



Norm Johnson

"Keep smiling ... it gives your face something happy to do
        ... and it makes people wonder what you're up to!"
   
Post #1381103
Posted Monday, November 5, 2012 10:32 AM


Peter Di (11/5/2012)
Jeff, I agree with you. Having thousands or millions of calls to a procedure like this will be a problem. However, stored procedures with multiple parameters are not widely used, and the part after the parsing is usually much slower. Also, as you can see in the article, there are other ways to send values instead of comma-separated strings ... so optimizing the parsing is not the thing I would start with.

Anyway, you are right that there is a faster way to parse the values, and it would be stupid to ignore it (especially since you have already tested and prepared the functions for easy implementation). If you don't mind, I will update my article to include one of the methods from your test and a link to the full article.


Thanks for the feedback, Peter.

We must work in different worlds, though. Most of our stored procedures do contain multiple parameters, and round-trip time for the GUI has been greatly improved by reworking how they're handled, both in their parsing and in the post-parse processing. In fact, such has been the case at the last 4 companies I've worked for.

The other thing about parsing is that if someone finds a parsing "routine" that works well for one row, they might try to use it on thousands of rows without knowing the performance ramifications. So I bring that fact up even when something is "guaranteed" to only be used for one row.

It's all positive proof that "It Depends".
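For anyone following along, here is a minimal set-based splitter of the general kind under discussion. It is a simplified sketch for illustration only -- not the tuned DelimitedSplit8K function from the linked article -- and it assumes sys.all_columns has at least as many rows as the string has characters:

```sql
-- Simplified set-based splitter: find each delimiter position with a
-- tally of character positions, then cut out the element that follows it.
DECLARE @List VARCHAR(8000) = 'AZ,CA,NY';

WITH Tally AS (
    SELECT TOP (LEN(@List))
           ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS N
    FROM sys.all_columns            -- any row source with enough rows
)
SELECT SUBSTRING(@List, N,
       CHARINDEX(',', @List + ',', N) - N) AS Item
FROM Tally
WHERE SUBSTRING(',' + @List, N, 1) = ',';   -- N marks the start of an element
-- Returns one row per element: AZ, CA, NY
```

The point of the thread stands: a splitter that looks fine on one short list can behave very differently when called thousands of times per second.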


--Jeff Moden
"RBAR is pronounced "ree-bar" and is a "Modenism" for "Row-By-Agonizing-Row".

First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column."

(play on words) "Just because you CAN do something in T-SQL, doesn't mean you SHOULDN'T." --22 Aug 2013

Helpful Links:
How to post code problems
How to post performance problems
Post #1381199
Posted Monday, November 5, 2012 10:36 AM


N.Johnson (11/5/2012)
Just at the time when we're trying to find a way to improve our data import this article comes along. [...]

Hi Norm,

I recommend that you actually start a separate thread on this subject. Parsing input files is usually quite different from shredding a line of parameters. For example, the use of BULK INSERT would probably serve you much better. Once you start the separate thread, please feel free to PM me with the URL and I'll try to take a look at it. For sure, there will be many others responding with some great ideas as well.
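As a hedged sketch of the BULK INSERT idea (the file path and table name here are placeholders, not anything from the thread): stage each raw line whole, then split on the pipes afterwards.

```sql
-- Stage the raw file one line per row; no FIELDTERMINATOR is given, so
-- each pipe-delimited line lands intact and can be shredded in a later step.
CREATE TABLE #RawLines (LineText VARCHAR(8000));

BULK INSERT #RawLines
FROM 'C:\Import\feed.txt'          -- placeholder path
WITH (ROWTERMINATOR = '\n');
```

This keeps the slow file I/O in one fast, minimally-logged operation and leaves the header detection and column extraction to set-based T-SQL against #RawLines.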


--Jeff Moden
Post #1381201
Posted Monday, December 24, 2012 12:20 PM
Nice article.
Post #1399963
Posted Tuesday, December 25, 2012 8:12 AM
Nice article!
Post #1400055
Posted Wednesday, September 25, 2013 7:26 PM
The validation fails when the list contains at least one valid parameter. I changed Set @sParameter = 'AZ,XX,YY,ZZ'; in this case XX, YY, and ZZ are classified as valid when they're not.
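Without seeing the failing code, one common way this kind of bug arises is validating the string as a whole instead of each element. The sketch below is a hypothetical reconstruction of the symptom, not the article's actual code:

```sql
-- Hypothetical reconstruction: if the check only asks whether the list
-- contains SOME valid code, invalid tokens like XX, YY, ZZ slip through.
DECLARE @sParameter VARCHAR(100) = 'AZ,XX,YY,ZZ';

IF EXISTS (SELECT 1
           FROM (VALUES ('AZ'), ('CA'), ('NY')) v(Code)   -- sample valid codes
           WHERE ',' + @sParameter + ',' LIKE '%,' + Code + ',%')
    PRINT 'list accepted';   -- fires because AZ matches, despite XX/YY/ZZ

-- Safer pattern: split the list first, then validate each element
-- individually against the table of valid codes.
```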
Post #1498644