
Sending multiple rows to the Database from an Application: Part I
Posted Tuesday, August 18, 2009 8:50 AM
Forum Newbie


Group: General Forum Members
Last Login: Friday, April 1, 2011 9:05 AM
Points: 1, Visits: 5
One alternative that I have used is app-server-generated SQL that uses "select X, X union all ..." to create a table of values to be inserted, updated, or deleted.

Example:
insert into foo ( a, b, c)
select 1, 'bob', 'smith'
union all
select 3, 'John', 'Doe'
union all
select 93, 'Jane', 'Jones'
union all ....

Note you MUST BIND the values in the SELECT(s) to ensure you do not have SQL injection happening.

This is very efficient and quick. When I first used it we needed to do up to ~1500 three-column inserts from a web app. This was taking 5+ minutes to complete. After the change it took less than 5 seconds (it might have been even lower, I don't remember for sure). One limit we found with SQL 2000, I believe (it might have been 7.0), was that 1024 unions was the maximum it would process in a single statement. So we split the data and did an additional INSERT INTO for every 1000 records. So do test the limit with SQL 2005/2008 before using my method.
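
For illustration, here is a minimal sketch of what the bound form might look like if the generated batch is sent through sp_executesql; the table foo matches the example above, but the parameter names are hypothetical:

exec sp_executesql
    N'insert into foo (a, b, c)
      select @a1, @b1, @c1
      union all
      select @a2, @b2, @c2',
    N'@a1 int, @b1 varchar(50), @c1 varchar(50),
      @a2 int, @b2 varchar(50), @c2 varchar(50)',
    @a1 = 1, @b1 = 'bob',  @c1 = 'smith',
    @a2 = 3, @b2 = 'John', @c2 = 'Doe'
-- every value travels as a parameter, so nothing is concatenated into the SQL text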
Post #772788
Posted Tuesday, August 18, 2009 9:32 AM
Old Hand


Group: General Forum Members
Last Login: Yesterday @ 3:18 AM
Points: 323, Visits: 2,200
I can vouch for the method "sql" describes; it works well and is far easier to implement in many cases, as there is no tricky code involved.

Another method is to declare a table variable (make sure you set the collation on varchars correctly) and then insert into this table using separate inserts within the same batch, as sketched below. Finally, process the data in the table any way you like.
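
A minimal sketch of that pattern, with made-up table and column names:

-- collation set explicitly so comparisons against database tables behave predictably
declare @rows table (
    id   int not null,
    name varchar(100) collate database_default not null
)

-- separate inserts, all in the same batch as the final processing
insert into @rows (id, name) values (1, 'bob')
insert into @rows (id, name) values (2, 'jane')

-- finally, process the accumulated set however you like
insert into dbo.people (id, name)
select id, name
from @rows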


Post #772835
Posted Tuesday, August 18, 2009 9:53 AM


Right there with Babe


Group: General Forum Members
Last Login: Friday, April 4, 2014 4:40 PM
Points: 751, Visits: 917
A good article and an interesting approach. Another approach, when dealing with truly large amounts of data, is to have the application output it into a CSV file and then import that through a procedure which uses bulk insert. Of course, that is generally not the best approach for just a few rows, but it works nicely if there are a very large number.
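
A rough sketch of the database side, assuming the application drops the CSV in an agreed location; the staging table and procedure names are invented for illustration:

create procedure dbo.ImportCsv
    @path varchar(260)
as
begin
    -- BULK INSERT does not accept a variable as the file name,
    -- so the statement has to be built dynamically
    declare @sql nvarchar(max)
    set @sql = N'bulk insert dbo.StagingRows
                 from ''' + replace(@path, '''', '''''') + N'''
                 with (fieldterminator = '','', rowterminator = ''\n'')'
    exec (@sql)
end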

---
Timothy A Wiseman
SQL Blog: http://timothyawiseman.wordpress.com/
Post #772855
Posted Tuesday, August 18, 2009 11:14 AM
SSC Rookie


Group: General Forum Members
Last Login: Friday, October 15, 2010 7:58 AM
Points: 26, Visits: 81
I guess I just haven't come across a situation where this is necessary. If you are only inserting a few rows (OLTP) then what is wrong with simply concatenating multiple INSERT statements? If you are inserting many rows then BCP is the way to go. If you are using .Net then the SqlBulkCopy class makes bulk inserting of rows almost trivial.
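
For the few-rows case, the concatenated batch is nothing more exotic than this (table name hypothetical; the application should still parameterize the values):

-- one round trip to the server, several statements
insert into dbo.foo (a, b, c) values (1, 'bob', 'smith');
insert into dbo.foo (a, b, c) values (3, 'John', 'Doe');
insert into dbo.foo (a, b, c) values (93, 'Jane', 'Jones');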
Post #772898
Posted Tuesday, August 18, 2009 5:11 PM
SSC-Addicted


Group: General Forum Members
Last Login: Tuesday, July 22, 2014 2:22 PM
Points: 411, Visits: 1,399
SQL provides us with a very powerful tool to do this too: the UNION ALL statement. You can use this to generate, for example, a single statement that inserts multiple rows, like this:
insert into dbo.table( col1, col2, col3)
select 'some', 'value', 10
union all select 'another', 'value', 11

and an update statement can be constructed like this:

update tbl
set
col1 = t.col1,
col2 = t.col2
from dbo.table tbl
inner join (
select 'some' as col1, 'value' as col2, 10 as col3
union all select 'another', 'value', 11
) t on (t.col3 = tbl.col3)

One big advantage can be that this fires any triggers on the table only once for the entire set, and not over and over for each row. Having these sorts of tricks available can really save your day.

Edit: sorry to duplicate sql's idea; I hadn't opened the 2nd page of results yet.




Posting Data Etiquette - Jeff Moden
Posting Performance Based Questions - Gail Shaw
Hidden RBAR - Jeff Moden
Cross Tabs and Pivots - Jeff Moden
Catch-all queries - Gail Shaw


If you don't have time to do it right, when will you have time to do it over?
Post #773132
Posted Tuesday, August 18, 2009 5:32 PM
SSC Rookie


Group: General Forum Members
Last Login: Tuesday, September 22, 2009 9:03 PM
Points: 48, Visits: 75
Hello,
I read your article with interest, because I had to solve a similar problem recently. Both solutions are interesting, and both were considered when I was writing this part of my code.
I found some problems with both of them. First: I have potentially very long strings with rich text tags, so I could not think of a delimiter which could be used safely. Also, I know that T-SQL is not very efficient at string processing, and, as I can potentially have hundreds of records to save, it would be slow.
Second: XML is great and easy when you have a dataset to save. In my case, I have objects, and that means I have to build the XML myself. I tried it, by the way, and found it is not very quick either.
So, our team came up with a third solution. Inside my .NET code we build an SQL string to create and populate temp tables, based on the properties of my objects, using reflection. Then we execute that in SQL Server, and then call a stored procedure to process the data from the temp table (validate, do some calculations, and save to the real tables).
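
In rough outline, the generated batch looks something like this (all names here are invented; the real ones come from reflection over the object's properties):

-- create the staging table; columns mirror the object's properties
create table #import (id int, firstname varchar(100), lastname varchar(100))

insert into #import values (1, 'bob', 'smith')
insert into #import values (3, 'John', 'Doe')
-- ...one insert per object...

-- the procedure can read #import because it runs on the same connection
exec dbo.ProcessImport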

Post #773140
Posted Tuesday, August 18, 2009 11:07 PM


Old Hand


Group: General Forum Members
Last Login: Friday, July 29, 2011 4:14 AM
Points: 318, Visits: 198
First of all, thanks for your valuable comments with different approaches.

R.P.Rozema, sql:
Actually, in my scenario we can access the database only through stored procedures. Since we must not use any inline queries, we cannot create a SELECT query using UNION ALL.

timothyawiseman, Scott Roberts:
As far as bulk insert is concerned, I think it is most efficient for very large datasets. Here we are dealing with only some 100-500 odd rows.

Galina Likhovetsky:
Actually, our application uses Java. I agree that if it were on the Microsoft platform, I would definitely have suggested and considered your approach.

Darrell Bauer:
The table-valued function [dbo].[TableFormDelimetedStringWithoutNumberList] will choke on lists longer than 100 elements. Here's the error:
The statement terminated. The maximum recursion 100 has been exhausted before statement completion.

Oops, I missed it. Good catch. You can add the MAXRECURSION hint to the SQL in the specified function:
;WITH CTETable (start, stop)
........
........
........
WHERE stop > 0
OPTION (MAXRECURSION 0)

Stephen Byrne:
I think it does not insert one record at a time; rather, it inserts as a set. So you will get a generic error (data would be truncated, FK or unique constraint violated, etc.).

Lee Forst:
I haven't tested it with that many records, but our application works fine dealing with up to 400-500 records in an XML document.
I will test with more records and post the results.
Post #773196
Posted Wednesday, August 19, 2009 3:23 AM
SSC-Addicted


Group: General Forum Members
Last Login: Thursday, February 20, 2014 8:25 AM
Points: 468, Visits: 1,023
The code I use to split out a delimited list is:

create function [dbo].[Split](@value nvarchar(max), @delimiter varchar(20))
returns @results table (id int identity(1, 1) primary key, value nvarchar(max))
as
begin
declare @p1 int, -- start position of the current element
        @p2 int  -- position of the next delimiter

-- ensure value ends with the delimiter character
if right(@value, len(@delimiter)) != @delimiter
    set @value = @value + @delimiter

set @p1 = 1
set @p2 = charindex(@delimiter, @value)

-- walk the string, emitting one row per element
-- (checking @p2 > 0 rather than @p2 > @p1 so empty elements are kept)
while @p2 > 0
begin
    insert into @results (value)
    select substring(@value, @p1, @p2 - @p1)

    set @p1 = @p2 + len(@delimiter)
    set @p2 = charindex(@delimiter, @value, @p1)
end

return
end
go

My initial tests show it performing faster than your CTE-based solution, but I would welcome other people's results.
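
For anyone who wants to compare, a quick usage example (assuming the function above has been created):

select id, value
from dbo.Split(N'alpha,beta,gamma', ',')
-- returns (1, 'alpha'), (2, 'beta'), (3, 'gamma')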



Post #773304
Posted Wednesday, August 19, 2009 8:23 AM
Forum Newbie


Group: General Forum Members
Last Login: Friday, October 15, 2010 9:32 AM
Points: 6, Visits: 21
Hi,

We have an application which requires anywhere from 10,000 to 500,000 new records to be uploaded into our database at any one time. For each record we have to go through the process of checking whether it already exists, and if so get an ID for it, but if not create a new record before retrieving its ID.

After a long time trying all sorts of different methods (including the delimited string method), we now use the approach of creating a "clean" formatted text file on the local box, which the application then uploads to a predefined location. The application then calls an SP which triggers a BCP from the uploaded file straight into a temporary table in our database. Once the data is in the database, we call a second SP which loops through the records in the temporary table doing the necessary work.

Hope this inspires one or two of you.
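
Our second SP loops row by row, but the exists-check and ID-retrieval step can also be written set-based; a sketch with invented table and column names:

-- create any records that do not exist yet
insert into dbo.Items (ItemKey)
select distinct t.ItemKey
from #staging t
where not exists (select 1 from dbo.Items i where i.ItemKey = t.ItemKey)

-- every staged record can now be matched to its ID
select t.ItemKey, i.ItemId
from #staging t
inner join dbo.Items i on i.ItemKey = t.ItemKey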
Post #773564
Posted Friday, August 21, 2009 8:53 PM


Right there with Babe


Group: General Forum Members
Last Login: Friday, April 4, 2014 4:40 PM
Points: 751, Visits: 917
venablc, thanks for the post. I use a similar technique for certain large imports. It does not make a lot of sense for small inserts, but it is one of the best ways to deal with very large ones.

---
Timothy A Wiseman
SQL Blog: http://timothyawiseman.wordpress.com/
Post #775497