Sending multiple rows to the Database from an Application: Part I

sql-1055196
Forum Newbie

Group: General Forum Members
Points: 1 Visits: 5
One alternative that I have used is app-server-generated SQL that uses "select X, X union all" to create a table of values to be inserted, updated, or deleted.

Example:
insert into foo ( a, b, c)
select 1, 'bob', 'smith'
union all
select 3, 'John', 'Doe'
union all
select 93, 'Jane', 'Jones'
union all ....

Note you MUST BIND the values in the SELECT(s) to ensure you do not have SQL injection happening.
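To sketch what the bound form looks like on the server side (the actual binding would normally happen through the app server's database driver; the table and parameter names here are illustrative):

EXEC sp_executesql
    N'insert into foo (a, b, c)
      select @a1, @b1, @c1
      union all
      select @a2, @b2, @c2',
    N'@a1 int, @b1 varchar(50), @c1 varchar(50),
      @a2 int, @b2 varchar(50), @c2 varchar(50)',
    @a1 = 1, @b1 = 'bob',  @c1 = 'smith',
    @a2 = 3, @b2 = 'John', @c2 = 'Doe';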

This is very efficient and quick. When I first used it we needed to do up to ~1500 three-column inserts from a web app. This was taking 5+ minutes to complete. After the change it took less than 5 seconds (it might have been even lower; I don't remember for sure). One limit we found with SQL 2000, I believe (it might have been 7), was that 1024 unions was the maximum it would process in a single statement. So we split the data and did an additional INSERT INTO for every 1000 records. So do test the limit with SQL 2005/2008 before using my method.
peter-757102
Ten Centuries

Group: General Forum Members
Points: 1077 Visits: 2559
I can vouch for the method "sql" describes; it works well and is far easier to implement in many cases, as there is no tricky code involved.

Another method is to declare a table variable (make sure you set the collation on varchars correctly) and then insert into this table using separate inserts within the same batch. Finally, process the data in the table any way you like. A minimal sketch of that pattern follows.
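This sketch assumes illustrative table and column names:

DECLARE @rows TABLE
(
    id   int NOT NULL,
    name varchar(50) COLLATE DATABASE_DEFAULT NOT NULL  -- pin the collation explicitly
);

-- separate inserts, all sent within the same batch
INSERT INTO @rows (id, name) SELECT 1, 'bob';
INSERT INTO @rows (id, name) SELECT 2, 'jane';

-- finally, process the collected rows as a set
INSERT INTO dbo.foo (a, b)
SELECT id, name FROM @rows;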
timothyawiseman
UDP Broadcaster

Group: General Forum Members
Points: 1464 Visits: 920
A good article and an interesting approach. Another approach, for when dealing with truly large amounts of data, is to have the application output it into a CSV file and then import that through a procedure which uses bulk insert. Of course, that is generally not the best approach for just a few rows, but it works nicely when there is a very large number of them.
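Such a procedure might wrap BULK INSERT roughly like this (the file path, staging table, and format options are all illustrative assumptions):

CREATE PROCEDURE dbo.usp_ImportFromCsv
AS
BEGIN
    -- load the application-generated CSV straight into a staging table
    BULK INSERT dbo.Staging
    FROM 'C:\exports\rows.csv'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');
END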

---
Timothy A Wiseman
SQL Blog: http://timothyawiseman.wordpress.com/
Scott Roberts-430649
SSC Rookie

Group: General Forum Members
Points: 48 Visits: 81
I guess I just haven't come across a situation where this is necessary. If you are only inserting a few rows (OLTP), then what is wrong with simply concatenating multiple INSERT statements? If you are inserting many rows, then BCP is the way to go. If you are using .NET, then the SqlBulkCopy class makes bulk-inserting rows almost trivial.
R.P.Rozema
UDP Broadcaster

Group: General Forum Members
Points: 1465 Visits: 1701
SQL provides us with a very powerful tool to do this too: the UNION ALL statement. You can use it to generate, for example, a single statement that inserts multiple rows like this:
insert into dbo.table( col1, col2, col3)
select 'some', 'value', 10
union all select 'another', 'value', 11

and an update statement can be constructed like this:

update tbl
set
    col1 = t.col1,
    col2 = t.col2
from dbo.table tbl
inner join (
    select 'some' as col1, 'value' as col2, 10 as col3
    union all select 'another', 'value', 11
) t on (t.col3 = tbl.col3)

One big advantage can be that this fires any triggers on the table only once for the entire set, not over and over for each row. Having this sort of trick available can really save your day.

edit: sorry to duplicate sql's idea, I hadn't opened the 2nd page of results yet.



Posting Data Etiquette - Jeff Moden
Posting Performance Based Questions - Gail Shaw
Hidden RBAR - Jeff Moden
Cross Tabs and Pivots - Jeff Moden
Catch-all queries - Gail Shaw


If you don't have time to do it right, when will you have time to do it over?
Galina Likhovetsky
SSC Rookie

Group: General Forum Members
Points: 48 Visits: 75
Hello,
I read your article with interest, because I had to solve a similar problem recently. Both solutions are interesting, and both were considered when I was going to write that part of my code.
I found some problems with both of them. First: I have potentially very long strings with rich-text tags, so I could not think of a delimiter that could be used safely. Also, I know that T-SQL is not very efficient at string processing, and, as I can have hundreds of records to save, it would be slow.
Second: XML is great and easy when you have a dataset to save. In my case I have objects, which means I have to build the XML myself. I tried it, by the way, and found it is not very quick either.
So our team came up with a third solution. Inside my .NET code we build an SQL string to create and populate temp tables, based on the properties of my objects, using reflection. Then we execute that in SQL Server, and then call a stored procedure to process the data from the temp table (validate, do some calculations, and save to the real tables).
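In rough outline, the generated batch would look something like this (all names are illustrative; the real statements are built from the objects' properties via reflection):

-- created and populated by the generated SQL string
CREATE TABLE #Incoming
(
    id   int NOT NULL,
    body nvarchar(max) NULL  -- long rich-text values, so no delimiter is needed
);

INSERT INTO #Incoming (id, body) VALUES (1, N'first rich-text value');
INSERT INTO #Incoming (id, body) VALUES (2, N'second rich-text value');

-- the procedure can read #Incoming because it runs on the same connection
EXEC dbo.usp_ProcessIncoming;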
mohd.nizamuddin
SSC-Addicted

Group: General Forum Members
Points: 460 Visits: 198
First of all, thanks for your valuable comments and the different approaches.

R.P.Rozema, sql:
Actually, in my scenario we can access the database only via stored procedures. We must not use any inline query, so we cannot create a SELECT query using UNION ALL.

timothyawiseman, Scott Roberts:
As far as bulk insert is concerned, I think it is very efficient for very large datasets. Here we are dealing with only some 100-500 odd rows.

Galina Likhovetsky:
Actually, our application uses Java as its language. Yes, I agree; if it were on the Microsoft platform, I definitely would have suggested and considered your approach.

Darrell Bauer:
The table-valued function [dbo].[TableFormDelimetedStringWithoutNumberList] will choke on lists longer than 100 elements. Here's the error:

Oops, I missed it ;-). Good catch.
;WITH CTETable (start, stop)
........
........
........
WHERE stop > 0
OPTION(MAXRECURSION 0)
You can add this hint to the SQL in the specified function.
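For readers wondering where the hint goes, here is a generic recursive-CTE splitter with it applied. This is only a sketch with made-up sample values, not the article's function, whose body is elided above:

DECLARE @list varchar(max), @d char(1);
SELECT @list = 'a,b,c', @d = ',';

;WITH CTETable (start, stop) AS
(
    SELECT 1, CHARINDEX(@d, @list + @d)
    UNION ALL
    SELECT stop + 1, CHARINDEX(@d, @list + @d, stop + 1)
    FROM CTETable
    WHERE stop > 0                   -- stop recursing past the last delimiter
)
SELECT SUBSTRING(@list, start, stop - start) AS value
FROM CTETable
WHERE stop > 0
OPTION (MAXRECURSION 0);             -- lift the default 100-level recursion cap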

Stephen Byrne:
I think it is not inserting one record at a time; rather, it inserts the whole set. So you will get a generic error (data would be truncated, FK or unique constraint violated, etc.).

Lee Forst:
I haven't tested it with that many records, but our application works fine dealing with up to 400-500 records in an XML.
I will test with more records and post the results.
paulhumphris
Mr or Mrs. 500

Group: General Forum Members
Points: 578 Visits: 1023
The code I use to split out a delimited list is:

create function [dbo].[Split](@value nvarchar(max), @delimiter varchar(20))
returns @results table (id int identity(1, 1) primary key, value nvarchar(max))
as
begin
    declare @p1 int,
            @p2 int

    -- ensure value ends with the delimiter character
    if right(@value, len(@delimiter)) != @delimiter
        set @value = @value + @delimiter

    set @p1 = 1
    set @p2 = charindex(@delimiter, @value)

    while @p2 > @p1
    begin
        insert into @results (value)
        select substring(@value, @p1, @p2 - @p1)

        set @p1 = @p2 + len(@delimiter)
        set @p2 = charindex(@delimiter, @value, @p1)
    end

    return
end
go

My initial tests show it performs faster than your CTE-based solution, but I would welcome other people's results.
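For example, to split a comma-delimited list with it:

SELECT id, value
FROM dbo.Split('1,2,3', ',');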



venablc
Grasshopper

Group: General Forum Members
Points: 22 Visits: 21
Hi,

We have an application which requires anywhere between 10,000 and 500,000 new records to be uploaded into our database at any one time. For each record we have to go through the process of checking whether it already exists and, if so, getting an ID for it, but if not, creating a new record before retrieving its ID.

After a long time trying all sorts of different methods (including the delimited string method), we now use the approach of creating a "clean" formatted text file on the local box, which is then uploaded to a predefined location by the application. The application then calls an SP which triggers a BCP from the uploaded file straight into a temporary table in our database. Now that the data is in the database, we call a second SP which loops through the records in the temporary table doing the necessary work (a sketch of that second step follows below).
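The exists-or-insert logic of that second SP can also be expressed as two set-based statements instead of a loop (table and column names here are invented for the sketch):

-- insert only the values not already present...
INSERT INTO dbo.Lookup (name)
SELECT DISTINCT s.name
FROM #Staging s
WHERE NOT EXISTS (SELECT 1 FROM dbo.Lookup l WHERE l.name = s.name);

-- ...then retrieve an ID for every staged record, new or pre-existing
SELECT s.name, l.id
FROM #Staging s
INNER JOIN dbo.Lookup l ON l.name = s.name;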

Hope this inspires one or two of you :-D
timothyawiseman
UDP Broadcaster

Group: General Forum Members
Points: 1464 Visits: 920
venablc, thanks for the post. I use a similar technique for certain large imports. It does not make a lot of sense for small inserts, but it is one of the best ways to deal with very large ones.

---
Timothy A Wiseman
SQL Blog: http://timothyawiseman.wordpress.com/