Sending multiple rows to the Database from an Application: Part II

• Extremely helpful article. Exactly what I needed at exactly the moment I needed it. See my post in reply above for more detail, but in summary: I am shipping data from Access .mdb files all over the country to SQL Server, dropping the XML in a holding table without even processing inserts during the session. A scheduled job follows up later and inserts as appropriate.

    Very helpful.

Very low cost in code to implement: maybe 20 lines or so, and it saved a ton of coding for sending values one row at a time via parameters on a stored procedure. Thank God I hadn't started writing that mess when I found this. Solid gold!!
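For illustration, the hand-off described here might look something like this; the proc, parameter, and connection string names are all invented:

    using System.Data;
    using System.Data.SqlClient;

    // Ship one XML chunk to a proc that does nothing but land it in a
    // holding table; a scheduled job shreds it into the real tables later.
    static void ShipChunk(string connectionString, string xmlChunk)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("dbo.usp_DumpXmlChunk", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.Add("@Chunk", SqlDbType.Xml).Value = xmlChunk;
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }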

  • Charles Wannall (10/22/2010)


I can see immediate usefulness in a project where I intend to use it today: I have large tables in Access databases spread all over the country. Ancient app, many users. Not going away. But... I must centralize the data for control and for management review and reporting. There's more involved, obviously, but this little gem of an article shows me how I can organize my transmission into 5K chunks and send them to a stored procedure in SQL Server that dumps them in a holding table, with no processing except landing in the dump table. Later, I can follow up on SQL Server with a scheduled job to process the rows into the proper tables.

The user in Access sees minimal transmission time (no waiting for inserts on the server end). The data gets centralized (yes, GUIDs are added at the transmission source and used throughout from then on).

    Fantastic article. Exactly what I needed at exactly the right moment.

The collection into XML cost about 20 lines, including the routine to convert a recordset data row into an XML node.

    I would suggest two ideas, both similar.

1. Put an IDataReader on your Jet database (Access .mdb file). As you fire-hose read the Jet data, throw the values into a strongly typed DataSet.

2. Every X rows (make it configurable and experiment), push the strongly typed DataSet's GetXml() output to a stored procedure. A rough sketch of both ideas follows.
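Something like this, say; the connection strings, source table, column names, and the SendXml helper are all invented for illustration:

    using System.Data;
    using System.Data.OleDb;

    static void PushInBatches(string mdbConnStr, string sqlConnStr, int batchSize)
    {
        // DataSet/table names control the tags GetXml() emits. Note that
        // GetXml() produces element-centric XML, which matters later in
        // this thread (see the element-centric performance caveat below).
        var ds = new DataSet("CutParms");
        var table = ds.Tables.Add("Parm");
        table.Columns.Add("DataYear", typeof(int));
        table.Columns.Add("DataType", typeof(string));

        using (var jetConn = new OleDbConnection(mdbConnStr))
        using (var jetCmd = new OleDbCommand("SELECT DataYear, DataType FROM tblSource", jetConn))
        {
            jetConn.Open();
            using (IDataReader reader = jetCmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    table.Rows.Add(reader["DataYear"], reader["DataType"]);
                    if (table.Rows.Count >= batchSize)
                    {
                        // SendXml: hypothetical helper that passes the XML
                        // to a stored procedure, as in the earlier sketch.
                        SendXml(sqlConnStr, ds.GetXml());
                        table.Clear();
                    }
                }
            }
        }
        if (table.Rows.Count > 0)
            SendXml(sqlConnStr, ds.GetXml()); // final partial batch
    }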

    Deviation A

1. If you care about bandwidth, zip the DataSet XML and send it to a service (see the sketch after this list), or check out binary methods for serializing DataSets.

2. You could have a service that unzips (or takes the raw XML string) and processes the rows in bulk.
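A sketch of the zip step, using nothing but framework classes (the service-side contract is whatever you define):

    using System.IO;
    using System.IO.Compression;
    using System.Text;

    // Compress the DataSet XML before shipping it, to save bandwidth.
    static byte[] ZipXml(string xml)
    {
        byte[] raw = Encoding.UTF8.GetBytes(xml);
        using (var buffer = new MemoryStream())
        {
            using (var gzip = new GZipStream(buffer, CompressionMode.Compress))
            {
                gzip.Write(raw, 0, raw.Length);
            } // closing the GZipStream flushes the compressed bytes
            return buffer.ToArray(); // ToArray still works after the close
        }
    }

The receiving service reverses it with a GZipStream in CompressionMode.Decompress before handing the XML to the bulk routine.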

    OR

Put your data into simple POCO objects and add the [Serializable] attribute to the class(es).

Binary-serialize the objects and ship them to the service.
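Roughly like so; the class and field names are invented, and note BinaryFormatter, the usual tool in this era, is marked obsolete (and unsafe for untrusted input) in modern .NET:

    using System;
    using System.IO;
    using System.Runtime.Serialization.Formatters.Binary;

    // A POCO row marked [Serializable] for binary transport.
    [Serializable]
    public class ParmRow
    {
        public int DataYear;
        public string DataType;
    }

    // Binary-serialize an array of rows for shipping to the service.
    static byte[] Serialize(ParmRow[] rows)
    {
        using (var ms = new MemoryStream())
        {
            new BinaryFormatter().Serialize(ms, rows);
            return ms.ToArray();
        }
    }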

    ...

I have an older blog entry that describes the first approach:

    http://sholliday.spaces.live.com/Blog/cns!A68482B9628A842A!527.entry

    ...

• In response to the listed con for the XML approach, "Difficult to create XML data at the application layer":

Using the .NET XmlTextWriter class from C# in ASP.NET, I found creating my data in XML format to send to the database fairly simple. XmlTextWriter did all the formatting for me. I could then check the resulting string and tweak the XmlTextWriter code if need be.

    (http://msdn.microsoft.com/en-us/library/system.xml.xmltextwriter.aspx)

    I have listed below a sample of the commands I used:

using System.Xml;
using System.IO;

StringWriter writer = new StringWriter();
XmlTextWriter xtWriter = new XmlTextWriter(writer);

xtWriter.WriteStartElement("XML");
xtWriter.WriteStartElement("CutParms");

// Begin loop: open a fresh Parm element each pass, so each row
// gets its own element to hang attributes on
xtWriter.WriteStartElement("Parm");
xtWriter.WriteAttributeString("DataYear", ddEventYear.SelectedText);
xtWriter.WriteAttributeString("DataType", ddCertificateType.SelectedValue);
xtWriter.WriteAttributeString("CutType", ddCutTypes.SelectedValue);
xtWriter.WriteAttributeString("RevisedStateList", sJurisdictionList);
xtWriter.WriteEndElement(); // Parm
// End loop

xtWriter.WriteEndElement(); // CutParms
xtWriter.WriteEndElement(); // XML
xtWriter.Close();

string sXML = writer.ToString(); // string to send to the database
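For reference, the loop above emits one attribute-centric Parm element per pass, nested as <XML><CutParms><Parm ... /></CutParms></XML>; attribute-centric XML also happens to shred cheaply on the server (see the element-centric caveat further down). On .NET 2.0 and later, XmlWriter.Create(writer) is the recommended factory, though XmlTextWriter still works.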

  • bruce lee-206043 (8/25/2009)


There is another choice:

    Update SQL Server Data by Using XML Updategrams

An updategram is a data structure that you can use to express a change in the data. INSERT, UPDATE, and DELETE commands are represented in an updategram by the difference between the image of the data before the change and the image of the data after it.

http://support.microsoft.com/kb/316018

Have you ever tried using it? I did, and it performed badly on even a simple XML format (one-level elements with no more than 10 elements per row, closely resembling the underlying table's layout). For more complicated XML formats it is seriously slow (taking up to minutes to update a single record). We found that this is mostly due to the very complex (and huge) T-SQL statements it generates. The idea of being able to select, insert, delete, and update all through the same 'meta model' defined in a single XSD, hiding the actual database layout from the application, is very nice. But the implementation is far from usable as it stands.
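For anyone who hasn't seen one, a minimal updategram looks roughly like this (table and column names invented): the before block locates the row, the after block states the new values, and an empty before or after turns the same structure into an INSERT or DELETE.

    <ROOT xmlns:updg="urn:schemas-microsoft-com:xml-updategram">
      <updg:sync>
        <updg:before>
          <Customer CustomerID="1" City="Dallas" />
        </updg:before>
        <updg:after>
          <Customer CustomerID="1" City="Austin" />
        </updg:after>
      </updg:sync>
    </ROOT>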




  • Hmmmmm...OK. I gotta learn some more stuff.

    I will investigate this and follow up with whatever I can master, as improvements, when I understand it well enough to apply it.

    Awfully nice of you to send it along. It all sounds like improvements worth making.

    Thanks ever so much.

    Rick

  • I see references to OPENXML in the postings.

Don't use OPENXML! Use XQuery. See the following SQLServerCentral article:

    http://www.sqlservercentral.com/blogs/michael_coles/archive/2008/01/20/stop-using-openxml-please.aspx
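To make the contrast concrete, here is a rough sketch of the XQuery route from the application side, reusing the Parm XML and sXML string from the snippet earlier in the thread; the staging table, column types, and connectionString are assumptions:

    using System.Data;
    using System.Data.SqlClient;

    // Shred the XML with .nodes()/.value() instead of OPENXML.
    // dbo.CutParmStaging and the column types are invented for illustration.
    const string shredSql = @"
    INSERT INTO dbo.CutParmStaging (DataYear, DataType, CutType, RevisedStateList)
    SELECT  p.x.value('@DataYear',         'int'),
            p.x.value('@DataType',         'varchar(20)'),
            p.x.value('@CutType',          'varchar(20)'),
            p.x.value('@RevisedStateList', 'varchar(max)')
    FROM    @xml.nodes('/XML/CutParms/Parm') AS p(x);";

    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(shredSql, conn))
    {
        cmd.Parameters.Add("@xml", SqlDbType.Xml).Value = sXML;
        conn.Open();
        cmd.ExecuteNonQuery();
    }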



  • Mauve (10/28/2010)


    I see references to OPENXML in the postings.

Don't use OPENXML! Use XQuery. See the following SQLServerCentral article:

    http://www.sqlservercentral.com/blogs/michael_coles/archive/2008/01/20/stop-using-openxml-please.aspx

True, but BE WARY of element-based XML, depending on your SQL Server version.

    http://connect.microsoft.com/SQLServer/feedback/details/250407/insert-from-nodes-with-element-based-xml-has-poor-performance-on-sp2-with-x64

Note: more versions are affected than the URL suggests.
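Until you're on a fixed build, the commonly cited mitigation is to name text() in the value() path when shredding element-centric XML, along these lines (table and column names assumed, as before):

    // Element-centric shred: specifying text() in the path sidesteps the
    // slow plan described in the Connect item.
    const string sql = @"
    SELECT p.x.value('(DataYear/text())[1]', 'int')         AS DataYear,
           p.x.value('(DataType/text())[1]', 'varchar(20)') AS DataType
    FROM   @xml.nodes('/CutParms/Parm') AS p(x);";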

  • sholliday (10/28/2010)


    Mauve (10/28/2010)


    I see references to OPENXML in the postings.

Don't use OPENXML! Use XQuery. See the following SQLServerCentral article:

    http://www.sqlservercentral.com/blogs/michael_coles/archive/2008/01/20/stop-using-openxml-please.aspx

True, but BE WARY of element-based XML, depending on your SQL Server version.

    http://connect.microsoft.com/SQLServer/feedback/details/250407/insert-from-nodes-with-element-based-xml-has-poor-performance-on-sp2-with-x64

Note: more versions are affected than the URL suggests.

    It's a defect. It will be, or has been, fixed.

    I'm also on SQL Server 2008 R2 (x64).

We're successfully using the XML approach as a data transport mechanism between our .NET SaaS web application and the stored procedures in the database. It works quite well. The XML is a hybrid (attribute and element) but mostly attribute-centric.



