Sending multiple rows to the Database from an Application: Part I

  • Comments posted to this topic are about the item Sending multiple rows to the Database from an Application: Part I

  • Good to see an article on a tricky and quite important piece of code.

    Although, with SQL Server 2008 around, I am surprised that you did not mention the user-defined table type and the MERGE statement.

  • Thanks Daniel,

    Although, with SQL Server 2008 around, I am surprised that you did not mention the user-defined table type and the MERGE statement.

    Actually, this is based on SQL Server 2005. Yes, now (in SQL Server 2008) we have a feature where we can pass a table object to the SP.
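
    A minimal sketch of what that 2008 feature looks like (the type, procedure, and table names here are hypothetical, not the article's code):

    CREATE TYPE dbo.IdNameList AS TABLE
    (
        Id   INT           NOT NULL,
        Name NVARCHAR(100) NOT NULL
    );
    GO

    CREATE PROCEDURE dbo.InsertIdNames
        @Rows dbo.IdNameList READONLY  -- table-valued parameters must be READONLY
    AS
    BEGIN
        INSERT INTO dbo.TargetTable (Id, Name)
        SELECT Id, Name FROM @Rows;
    END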

  • The technical issues are well thought out but I hope that you won't mind if I point out a couple of key spelling mistakes.

    The word is delimiter and not delimeter, delimited and not delimeted. (The words are derived from the word 'limit' meaning boundary.)

    You were introduced as a new author so 'welcome' and you have obviously taken care with the article.

    I have only ever mastered one language so I'm full of respect for you if English is not your 'first' language. Well done.

    Martin

  • Thank you for exploring an approach that I hadn't considered. I actually advised a colleague this month that there isn't a practical way to send bulk rows to SQL Server 2005 - he had around 100 rows to put in. But I'm particularly excited about the XML approach.

    We have a 3-tier application design (rhymes with "beer"), with a desktop and a server both running Java application code, as well as some 2-tier setups where there's just a desktop program (SQL Server is the other tier). The server sits right next to the SQL Server on a fast server-room network (they could even be blades), and I assume that just submitting a series of INSERT ... VALUES (...) statements, with or without parameterization of the statement, is almost the best we can do. BCP and BULK INSERT won't really do, although I suppose we could run those from our Java server.
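
    (For what it's worth, a parameterized round trip from a driver typically reaches the server as something like the following; the table, column, and parameter names are hypothetical.)

    EXEC sp_executesql
        N'INSERT INTO dbo.TargetTable (Id, Name) VALUES (@Id, @Name);',
        N'@Id INT, @Name NVARCHAR(50)',
        @Id = 1, @Name = N'bob';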

    The great trap that I see with delimited data is where you want to use the delimiter character as data within a field. Again, XML, which is really just a sophisticated form of delimited data, comes with tools to handle that.
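
    A minimal sketch of that idea in SQL Server 2005 T-SQL (the table, element, and attribute names are made up); note that the comma and pipe inside the second value need no special treatment beyond ordinary XML rules:

    DECLARE @x XML;
    SET @x = N'<rows>
                 <r id="1" name="bob"/>
                 <r id="2" name="a,b|c"/>
               </rows>';

    INSERT INTO dbo.TargetTable (Id, Name)
    SELECT r.value('@id',   'INT'),
           r.value('@name', 'NVARCHAR(50)')
    FROM @x.nodes('/rows/r') AS t(r);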

  • The table-valued function [dbo].[TableFormDelimetedStringWithoutNumberList] will choke on lists longer than 100 elements. Here's the error:

    Msg 530, Level 16, State 1, Line 1

    The statement terminated. The maximum recursion 100 has been exhausted before statement completion.

    Using OPTION(MAXRECURSION xxx) is not allowed within a UDF, so the numbered list seems to be the only alternative for large lists. Am I missing something?
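
    (One hedged thought, untested against the article's code: if the function happens to be an inline table-valued function, its recursive CTE is expanded into the calling statement, so the hint can be attached there instead. The parameter list below is guessed.)

    SELECT *
    FROM dbo.TableFormDelimetedStringWithoutNumberList('1,2,3', ',')  -- parameters guessed
    OPTION (MAXRECURSION 0);  -- 0 = no limit; only effective when the TVF is inline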

  • Good approach given the lack of alternatives. I've used the XML approach before and it can really provide a huge performance boost - but how would you handle error cases? Say I'm whacking in 1,000 rows and the 999th insert fails for some reason - what do you think would be the best way to get information about the specific failure back to the user?

    Looking forward to part II!

  • I think the goal is to make it a set-based operation, so by SQL standard the error message should be as uninformative as possible ("String or binary data would be truncated"), whilst also throwing blame directly on the application developer! 😉

  • I have used the XML approach to do some integration between databases. I would pass a set of reference numbers as an XML parameter to an SP in a different database and then join that XML to the data in that database. Although this worked fine, once it got up over 700-1,000 records I saw a major performance problem: it was very slow. Maybe it was my implementation, but I ended up having to rewrite it to eliminate the XML approach (which is what I should have done the first time, but I wanted to try the XML approach out). Just make sure you test your implementation with more data than you ever think you will need to process.

  • I use a pretty similar script to send multiple rows in SQL Server 2005 with CHARINDEX, but I ran into an issue: CHARINDEX only accepts 8,000 characters 🙁 and I was sometimes sending more than 10 textareas of NVARCHAR(MAX) data.

    I was thinking about the XML approach.

  • One alternative that I have used is app-server-generated SQL that uses "select X, X union all" to create a table of values to be inserted, updated, or deleted.

    Example:

    insert into foo (a, b, c)
    select 1, 'bob', 'smith'
    union all
    select 3, 'John', 'Doe'
    union all
    select 93, 'Jane', 'Jones'
    union all ...

    Note you MUST BIND the values in the SELECT(s) to ensure you do not have SQL injection happening.

    This is very efficient and quick. When I first used it, we needed to do up to ~1,500 three-column inserts from a web app. This was taking 5+ minutes to complete. After the change it took less than 5 seconds (it might have been even lower; I don't remember for sure). One limit we found with SQL 2000, I believe (it might have been 7), was that 1,024 unions was the maximum it would process in a single statement. So we split the data and did an additional INSERT INTO for every 1,000 records. So do test the limit with SQL 2005/2008 before using my method.

  • I can subscribe to the method "sql" describes; it works well and is far easier to implement in many cases, as there is no tricky code involved.

    Another method is to declare a table variable (make sure you set the collation correctly on the varchar columns) and then insert into this table using separate inserts within the same batch. Finally, process the data in the table any way you like.
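
    A minimal sketch of that pattern, with hypothetical table and column names (single-row inserts, since this is 2005-era syntax):

    DECLARE @rows TABLE
    (
        Id   INT NOT NULL,
        Name NVARCHAR(50) COLLATE DATABASE_DEFAULT NOT NULL  -- set the collation explicitly
    );

    -- separate single-row inserts, all sent in the same batch
    INSERT INTO @rows (Id, Name) VALUES (1, N'bob');
    INSERT INTO @rows (Id, Name) VALUES (2, N'alice');

    -- then process the accumulated set any way you like
    INSERT INTO dbo.TargetTable (Id, Name)
    SELECT Id, Name FROM @rows;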

  • A good article and an interesting approach. Another approach when dealing with truly large amounts of data is to have the application write it out to a CSV file and then import that through a procedure that uses BULK INSERT. That is generally not the best approach for just a few rows, of course, but it works nicely when there is a very large number of them.
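
    A hedged sketch of that import step (the file path, table name, and options are assumptions, not the poster's code):

    BULK INSERT dbo.TargetTable
    FROM 'C:\import\rows.csv'
    WITH
    (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR   = '\n',
        TABLOCK  -- helps SQL Server minimally log the load under the right recovery model
    );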

    ---
    Timothy A Wiseman
    SQL Blog: http://timothyawiseman.wordpress.com/

  • I guess I just haven't come across a situation where this is necessary. If you are only inserting a few rows (OLTP), what is wrong with simply concatenating multiple INSERT statements? If you are inserting many rows, BCP is the way to go. And if you are using .NET, the SqlBulkCopy class makes bulk inserting of rows almost trivial.

  • SQL provides us with a very powerful tool to do this too: the UNION ALL statement. You can use this to generate, for example, a single statement that inserts multiple rows like this:

    insert into dbo.table (col1, col2, col3)
    select 'some', 'value', 10
    union all select 'another', 'value', 11

    and an update statement can be constructed like this:

    update tbl
    set
        col1 = t.col1,
        col2 = t.col2
    from dbo.table tbl
    inner join (
        select 'some' as col1, 'value' as col2, 10 as col3
        union all select 'another', 'value', 11
    ) t on (t.col3 = tbl.col3)

    One big advantage can be that this fires any triggers on the table only once for the entire set, not over and over for each row. Having this sort of trick available can really save your day.

    edit: sorry to duplicate sql's idea, I hadn't opened the 2nd page of results yet.



    Posting Data Etiquette - Jeff Moden
    Posting Performance Based Questions - Gail Shaw
    Hidden RBAR - Jeff Moden
    Cross Tabs and Pivots - Jeff Moden
    Catch-all queries - Gail Shaw


    If you don't have time to do it right, when will you have time to do it over?

