Sending multiple rows to the Database from an Application: Part I
Posted Friday, October 15, 2010 9:07 AM
Forum Newbie


Group: General Forum Members
Last Login: Friday, October 22, 2010 9:21 PM
Points: 1, Visits: 8
I've used the XML approach many times over for bulk loading. It is probably the most efficient way I've found to do this. However, I use the OPENXML statement to parse the XML for me; then I can treat the XML as if it were a table.

I saw a post about user-defined table types for passing tables into stored procedures. I would advise against this. It sounds great on paper, and I've used it myself (with regrets now). Everything works fine when you create the setup: you define the table type in SQL, build your sproc to accept the table-valued parameter, pass the table from your code, and it works. Fast-forward six months: you have to change that table structure and discover that you can't modify a user-defined table type. You have to drop it and recreate it, but you can't drop it because your sproc is still referencing it. You find yourself squirreling around with the system just to make a simple change.
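For anyone who hasn't hit this yet, the maintenance problem looks like this (object names invented for illustration). There is no ALTER for a table type, so a schema change forces a drop-and-recreate cascade:

```sql
-- There is no ALTER TYPE ... AS TABLE, so changing a column means:
DROP PROCEDURE dbo.SaveOrders;       -- first drop (or re-point) every referencing sproc
DROP TYPE dbo.OrderList;             -- only now will the DROP succeed
CREATE TYPE dbo.OrderList AS TABLE   -- recreate with the new shape
(
    OrderID INT   NOT NULL,
    Qty     INT   NOT NULL,
    Price   MONEY NOT NULL           -- the newly added column
);
-- ...then recreate dbo.SaveOrders against the new type.
```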

With XML, you make the changes in your class and you make the changes in your sproc. Nice and simple.
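For anyone who hasn't used the OPENXML pattern, a minimal sketch (table, element, and attribute names are invented):

```sql
DECLARE @xml NVARCHAR(MAX) = N'<Orders>
  <Order ID="1" Qty="10" />
  <Order ID="2" Qty="25" />
</Orders>';

DECLARE @hDoc INT;
EXEC sp_xml_preparedocument @hDoc OUTPUT, @xml;

-- Treat the XML as if it were a table
INSERT INTO dbo.Orders (OrderID, Qty)
SELECT OrderID, Qty
FROM OPENXML(@hDoc, '/Orders/Order', 1)   -- 1 = attribute-centric mapping
     WITH (OrderID INT '@ID', Qty INT '@Qty');

EXEC sp_xml_removedocument @hDoc;         -- always release the handle
```

The sp_xml_removedocument call matters: each prepared document holds server memory until its handle is released.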
Post #1005282
Posted Friday, October 15, 2010 10:17 AM
Grasshopper


Group: General Forum Members
Last Login: Monday, November 4, 2013 9:02 AM
Points: 24, Visits: 147
I'd like to second Stephen Byrne's question from the first page of this thread: what would be the best way to report errors on specific insert/update/delete operations? Stephen used the example of a 1000-row operation where an error occurs on the 999th row. I'll expand on that: what if errors occurred on rows 7, 345, and 999? What would you suggest for reporting the errors specific to those rows while allowing the entire 1000-row operation to complete?
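One hedged sketch of that pattern, assuming the incoming rows are first staged in a temp table (all names and validation rules invented): insert the rows that pass validation, and route the failures to an error table with a reason, so the good rows commit while rows 7, 345, and 999 get reported.

```sql
-- Stage the incoming 1000 rows in #Incoming, then split good from bad in one pass.
INSERT INTO dbo.Target (ID, Amount)
SELECT i.ID, i.Amount
FROM #Incoming AS i
WHERE i.Amount >= 0                       -- whatever rule would cause the failure
  AND NOT EXISTS (SELECT 1 FROM dbo.Target AS t WHERE t.ID = i.ID);

-- Report the rows (e.g. 7, 345, 999) that were filtered out, with a reason.
INSERT INTO dbo.LoadErrors (ID, Reason)
SELECT i.ID,
       CASE WHEN i.Amount < 0 THEN 'negative amount'
            ELSE 'duplicate key' END
FROM #Incoming AS i
WHERE i.Amount < 0
   OR EXISTS (SELECT 1 FROM dbo.Target AS t WHERE t.ID = i.ID);
```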
Post #1005341
Posted Friday, October 15, 2010 11:33 AM
Forum Newbie


Group: General Forum Members
Last Login: Wednesday, October 5, 2011 1:07 AM
Points: 3, Visits: 61
I've had some success using temp tables to pass multi-row data into stored procs. This has worked well since SQL Server 2000 and is best suited to passing smaller numbers of rows.
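A bare-bones sketch of that hand-off (names invented); it works because a #temp table created in the calling scope is visible inside the called procedure:

```sql
-- Caller: create and fill the temp table, then call the proc.
CREATE TABLE #Rows (ID INT NOT NULL, Name VARCHAR(50) NOT NULL);
INSERT INTO #Rows (ID, Name)
SELECT 1, 'alpha' UNION ALL        -- the UNION ALL form works back to SQL 2000
SELECT 2, 'beta';
EXEC dbo.ProcessRows;

-- The proc simply reads #Rows from the session's scope:
-- CREATE PROCEDURE dbo.ProcessRows AS
--     INSERT INTO dbo.Target (ID, Name)
--     SELECT ID, Name FROM #Rows;
```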
Post #1005409
Posted Friday, October 15, 2010 12:06 PM


SSC-Dedicated


Group: General Forum Members
Last Login: Yesterday @ 6:12 PM
Points: 35,808, Visits: 32,482
As yet another alternative... use with your favorite splitter...
http://www.sqlservercentral.com/articles/T-SQL/63003/

I also recommend NOT using a recursive CTE for such a thing because it's just too slow.
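The linked article has the real (heavily optimized) splitter; the bare-bones numbers/Tally-table idea behind it looks roughly like this, assuming a pre-built dbo.Tally table of sequential integers:

```sql
DECLARE @list VARCHAR(8000) = '10,20,30';

SELECT Item = SUBSTRING(@list, t.N, CHARINDEX(',', @list + ',', t.N) - t.N)
FROM dbo.Tally AS t                          -- integers 1..8000
WHERE t.N <= LEN(@list)
  AND SUBSTRING(',' + @list, t.N, 1) = ','; -- N marks the character after each delimiter
```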


--Jeff Moden
"RBAR is pronounced "ree-bar" and is a "Modenism" for "Row-By-Agonizing-Row"."

First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column."

(play on words) "Just because you CAN do something in T-SQL, doesn't mean you SHOULDN'T." --22 Aug 2013

Helpful Links:
How to post code problems
How to post performance problems
Post #1005448
Posted Saturday, October 16, 2010 8:17 PM


Old Hand


Group: General Forum Members
Last Login: Sunday, December 14, 2014 10:06 PM
Points: 379, Visits: 1,991
For anyone using SQL Server 2008, there is another option that is extremely efficient and does not require a splitter function:

http://www.sqlservercentral.com/articles/SQL+Server+2008/66554/
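For readers who don't want to click through: the linked article covers Table-Valued Parameters, whose basic shape (names invented here) is:

```sql
CREATE TYPE dbo.IntList AS TABLE (Value INT NOT NULL);
GO
CREATE PROCEDURE dbo.SumList
    @List dbo.IntList READONLY      -- TVPs must be declared READONLY
AS
    SELECT Total = SUM(Value) FROM @List;
GO
DECLARE @ids dbo.IntList;
INSERT INTO @ids (Value) VALUES (1), (2), (3);
EXEC dbo.SumList @List = @ids;      -- one round trip, no string splitting
```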





SQL# - http://www.SQLsharp.com/
Post #1005805
Posted Sunday, October 17, 2010 3:39 PM


SSC-Dedicated


Group: General Forum Members
Last Login: Yesterday @ 6:12 PM
Points: 35,808, Visits: 32,482
olimp23 (10/15/2010)
Nice article with an interesting approach. I would use it on SQL Server 2000, but since SQL Server 2005 has good XML support, I would choose XML there. In fact, I have used an XML parameter to pass 50k+ rows from my Java application to SQL Server. I wonder which approach is faster.


Heh... you should test it and let us know.


--Jeff Moden
Post #1005925
Posted Sunday, October 17, 2010 3:41 PM


SSC-Dedicated


Group: General Forum Members
Last Login: Yesterday @ 6:12 PM
Points: 35,808, Visits: 32,482
Daniel Macey (8/18/2009)
Good to see an article on a tricky and quite important piece of code.

Although, with SQL Server 2008 around, I am surprised that you did not mention the table user-defined type and the MERGE statement.


Since you brought it up, do you have an example you'd like to share?
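In the meantime, a minimal sketch of the combination Daniel describes, assuming a pre-existing dbo.OrderList table type (all names invented): the TVP carries the rows in, and MERGE upserts them in one statement.

```sql
CREATE PROCEDURE dbo.UpsertOrders
    @Orders dbo.OrderList READONLY
AS
MERGE dbo.Orders AS tgt
USING @Orders AS src
    ON tgt.OrderID = src.OrderID
WHEN MATCHED THEN
    UPDATE SET tgt.Qty = src.Qty
WHEN NOT MATCHED BY TARGET THEN
    INSERT (OrderID, Qty)
    VALUES (src.OrderID, src.Qty);   -- MERGE requires the terminating semicolon
```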


--Jeff Moden
Post #1005926
Posted Sunday, October 17, 2010 3:51 PM


SSC-Dedicated


Group: General Forum Members
Last Login: Yesterday @ 6:12 PM
Points: 35,808, Visits: 32,482
Gosh... my first take on this is all the people in this discussion citing XML as a good way to do this. Would you folks mind providing a little code showing how? Then we'll find out whether or not it's a good way to pass large numbers of parameters.
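For reference, the XML-parameter pattern being described looks roughly like this on 2005+ (names invented); whether it holds up at large row counts is exactly the open question:

```sql
CREATE PROCEDURE dbo.SaveRows
    @Data XML
AS
INSERT INTO dbo.Target (ID, Name)
SELECT r.value('@ID',   'INT'),
       r.value('@Name', 'VARCHAR(50)')
FROM @Data.nodes('/Rows/Row') AS x(r);   -- shreds each <Row> element into a row

-- Call: EXEC dbo.SaveRows @Data = N'<Rows><Row ID="1" Name="a"/><Row ID="2" Name="b"/></Rows>';
```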

My second take on it is that a SQLCLR splitter would probably be the best way to split up large "matrices" of data passed to the database. Of course, right after that I ask, "Except for certain process-monitoring devices (which should be captured in a file to begin with), why would anyone need to pass so much data (i.e., a million rows) from an app to a database?"


--Jeff Moden
Post #1005927
Posted Sunday, October 17, 2010 4:04 PM


Old Hand


Group: General Forum Members
Last Login: Sunday, December 14, 2014 10:06 PM
Points: 379, Visits: 1,991
Jeff Moden (10/17/2010)
My second take on it is that a SQLCLR splitter would probably be the best way to split up large "matrices" of data passed to the database. Of course, right after that I ask, "Except for certain process-monitoring devices (which should be captured in a file to begin with), why would anyone need to pass so much data (i.e., a million rows) from an app to a database?"


Jeff, the speed of large sets in SQLCLR depends on which version of SQL Server you are using. I have noticed that in SQL Server 2005, splitting a million items from a comma-delimited list starts out fast but slows down over time, so for SQL Server 2005 a T-SQL UDF based on a numbers table is the fastest method I have seen. However, that CLR issue was fixed in SQL Server 2008, where a SQLCLR-based split function is just as fast as the T-SQL / numbers-table version. Of course, if someone is splitting a complex set rather than a simple delimited list, then CLR might well be the way to go in either case.

As for why anyone would need to send a million rows: it could be reading in log files to parse the data and update one or more tables, which is not a simple BCP operation. In that case, as I noted above, the fastest way to send in large sets of rows is the streaming function combined with a Table-Valued Parameter. That allows a set-based approach that can start processing with the first row received, as opposed to having to receive the entire set of data first, which can be several MB when you are talking about a million-plus rows.

Take care,
Solomon...





SQL# - http://www.SQLsharp.com/
Post #1005930
Posted Sunday, October 17, 2010 6:18 PM


SSC-Dedicated


Group: General Forum Members
Last Login: Yesterday @ 6:12 PM
Points: 35,808, Visits: 32,482
I'm not a GUI/App programmer so I'll have to find another way to test what you propose, Solomon. Shouldn't be too difficult, though.

If I were an app programmer, I'd write code to make whatever log file you speak of fit a simple BCP import more easily. Normalization of data isn't reserved for tables in a database, nor is it any less important outside of one.


--Jeff Moden
Post #1005951