
Sending multiple rows to the Database from an Application: Part II
Posted Friday, October 22, 2010 7:00 AM
Forum Newbie

We take a different approach not mentioned in these articles, and it seems to work very well for our always-connected client/server apps.

We do a lot of large/wide data set processing in stored procedures, and passing huge text or XML files from the UI did not seem to be the best approach.

Rather than using XML or table functions, we use temp tables. From within the application we create/append data to a temp table with a unique identifier for that "batch". Then we simply call a stored procedure, passing the batch_id, and the multi-row data set is available for use by the proc for whatever processing needs to be performed.

The temp table is SPID-specific, so there is no risk of data collision between users, nor between multiple processes for the same user when multi-threading (thanks to the batch_id). Once the proc is done with the data (which is almost always the case), it simply purges that batch from the temp table. If the temp table becomes empty, it is dropped altogether, so the overhead on the server is very low.

Yes, the initial loading of the temp tables is performed row-by-row behind the scenes, but that is achieved with a single line of code from the app.
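
A minimal sketch of that pattern in ADO.NET, with invented names for the temp table, its columns, and the proc (a real loader would take its rows and types from the application's data):

using System;
using System.Data;
using System.Data.SqlClient;

class TempTableBatchDemo
{
    static void Main()
    {
        // One open connection for the whole exchange: a local #temp table is
        // visible only to this session (SPID), which is what isolates users.
        using (var conn = new SqlConnection("Server=.;Database=MyDb;Integrated Security=true"))
        {
            conn.Open();
            Guid batchId = Guid.NewGuid(); // isolates threads that share a SPID

            // Create the session's temp table on first use.
            using (var create = new SqlCommand(
                @"IF OBJECT_ID('tempdb..#batch_rows') IS NULL
                      CREATE TABLE #batch_rows
                          (batch_id uniqueidentifier, col1 int, col2 varchar(50));", conn))
                create.ExecuteNonQuery();

            // Row-by-row load behind the scenes: one parameterized INSERT per row.
            using (var insert = new SqlCommand(
                "INSERT INTO #batch_rows (batch_id, col1, col2) VALUES (@b, @c1, @c2);", conn))
            {
                insert.Parameters.Add("@b", SqlDbType.UniqueIdentifier).Value = batchId;
                SqlParameter p1 = insert.Parameters.Add("@c1", SqlDbType.Int);
                SqlParameter p2 = insert.Parameters.Add("@c2", SqlDbType.VarChar, 50);
                for (int i = 0; i < 100; i++)
                {
                    p1.Value = i;
                    p2.Value = "row " + i;
                    insert.ExecuteNonQuery();
                }
            }

            // Same session, so the proc sees #batch_rows; it processes the
            // batch, purges those rows, and drops the table once it is empty.
            using (var proc = new SqlCommand("dbo.usp_ProcessBatch", conn))
            {
                proc.CommandType = CommandType.StoredProcedure;
                proc.Parameters.Add("@batch_id", SqlDbType.UniqueIdentifier).Value = batchId;
                proc.ExecuteNonQuery();
            }
        }
    }
}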
Post #1009151
Posted Friday, October 22, 2010 7:21 AM
Grasshopper

I am also using this same method in several places. It is great when you want all the records to succeed or none, and I would rather control the transaction rollback in a procedure than leave it to the UI.

This is a great article. Thanks for posting.
Post #1009166
Posted Friday, October 22, 2010 7:28 AM
SSC Rookie

Disregarding that sending multiple elements at once is questionable design, your "con" statements regarding XML show a lack of comfort with XML.

Using OPENXML or XQuery is just as obvious and natural as plain T-SQL once one spends an equivalent amount of time using them.

Creating XML is superbly easy with .NET languages, which have built-in methods such as DataSet.WriteXml.

Your article shows a lack of time spent working with XML.
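
A concrete sketch of both halves (table, column, and parameter names invented here): the client serializes a DataSet with GetXml(), and the server shreds the xml parameter with nodes()/value() - OPENXML would work similarly:

using System;
using System.Data;
using System.Data.SqlClient;

class DataSetXmlDemo
{
    static void Main()
    {
        // Client side: DataSet.GetXml() serializes every row with no hand-rolled XML.
        var ds = new DataSet("Orders");
        DataTable t = ds.Tables.Add("Order");
        t.Columns.Add("Id", typeof(int));
        t.Columns.Add("Amount", typeof(decimal));
        t.Rows.Add(1, 19.95m);
        t.Rows.Add(2, 5.00m);
        string xml = ds.GetXml(); // <Orders><Order><Id>1</Id>...</Orders>

        // Server side: shred the xml parameter with XQuery (nodes/value).
        const string sql = @"
            SELECT o.n.value('(Id)[1]',     'int')           AS Id,
                   o.n.value('(Amount)[1]', 'decimal(10,2)') AS Amount
            FROM @ordersXml.nodes('/Orders/Order') AS o(n);";

        using (var conn = new SqlConnection("Server=.;Database=MyDb;Integrated Security=true"))
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.Add("@ordersXml", SqlDbType.Xml).Value = xml;
            conn.Open();
            using (SqlDataReader rdr = cmd.ExecuteReader())
                while (rdr.Read())
                    Console.WriteLine("{0}  {1}", rdr.GetInt32(0), rdr.GetDecimal(1));
        }
    }
}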
Post #1009177
Posted Friday, October 22, 2010 7:37 AM
Old Hand

In fact, that's how I worked: I built the XML in .NET, validated it against a schema collection, and passed it to a stored procedure for insert / update.
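
One client-side way to do that validation step, sketched with an invented inline XSD (the server-side equivalent is a typed xml parameter bound to an XML SCHEMA COLLECTION):

using System;
using System.IO;
using System.Xml;
using System.Xml.Schema;

class ValidateBeforeSend
{
    static void Main()
    {
        // Invented document and schema, just to show the mechanics.
        const string xml = "<Orders><Order Id='1'/><Order Id='2'/></Orders>";
        const string xsd = @"
            <xs:schema xmlns:xs='http://www.w3.org/2001/XMLSchema'>
              <xs:element name='Orders'>
                <xs:complexType>
                  <xs:sequence>
                    <xs:element name='Order' maxOccurs='unbounded'>
                      <xs:complexType>
                        <xs:attribute name='Id' type='xs:int' use='required'/>
                      </xs:complexType>
                    </xs:element>
                  </xs:sequence>
                </xs:complexType>
              </xs:element>
            </xs:schema>";

        var settings = new XmlReaderSettings();
        settings.Schemas.Add(XmlSchema.Read(new StringReader(xsd), null));
        settings.ValidationType = ValidationType.Schema;

        // A full read pass performs the validation; invalid input throws here,
        // so a bad batch never reaches the stored procedure.
        using (XmlReader reader = XmlReader.Create(new StringReader(xml), settings))
            while (reader.Read()) { }

        Console.WriteLine("Valid; safe to pass to the stored procedure.");
    }
}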
Post #1009185
Posted Friday, October 22, 2010 10:18 AM
SSC Journeyman

I can see immediate usefulness in a project where I intend to use it today: I have large tables in Access databases spread all over the country. Ancient app, many users. Not going away. But...must centralize the data for control and for management review and reporting. There's more involved, obviously, but this little gem of an article shows me how I can organize my transmission into 5K chunks and send them to a stored procedure in SQL Server that dumps them in a holding table - no processing except landing in the dump table. Later, I can follow up on SQL Server with a scheduled job to process rows into the proper tables.

The user in Access sees minimal transmission time (no waiting for inserts on the server end). The data gets centralized (yes, GUIDs are added at the transmission source and used throughout from then on).

Fantastic article. Exactly what I needed at exactly the right moment.

The collection into XML costs about 20 lines, including the routine to convert a recordset data row into an XML node.
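
A sketch of the chunking side of that plan (proc, table, and column names invented; XML-escaping of values is glossed over):

using System.Data;
using System.Data.SqlClient;
using System.Text;

class ChunkedShipper
{
    const int ChunkSize = 5000;

    // Walks the source rows, wraps each as an XML element, and ships every
    // 5,000 rows to a proc that only lands the payload in a holding table.
    static void Ship(DataTable source, SqlConnection conn)
    {
        var sb = new StringBuilder();
        int inChunk = 0;

        foreach (DataRow row in source.Rows)
        {
            // Real code should XML-escape the values (e.g. SecurityElement.Escape).
            sb.AppendFormat("<r id=\"{0}\" v=\"{1}\"/>", row["Id"], row["Value"]);
            if (++inChunk == ChunkSize)
            {
                SendChunk(sb, conn);
                inChunk = 0;
            }
        }
        if (inChunk > 0) SendChunk(sb, conn); // final partial chunk
    }

    static void SendChunk(StringBuilder sb, SqlConnection conn)
    {
        // No shredding here: the scheduled job processes the holding table later.
        using (var cmd = new SqlCommand("dbo.usp_DumpChunk", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.Add("@payload", SqlDbType.Xml).Value = "<rows>" + sb + "</rows>";
            cmd.ExecuteNonQuery();
        }
        sb.Length = 0; // reset for the next chunk
    }
}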

Post #1009334
Posted Friday, October 22, 2010 10:21 AM
SSC Journeyman

Extremely helpful article. Exactly what I needed at exactly the moment I needed it. See my reply above for more detail, but in summary: I am shipping data from all over the country from Access .mdbs to SQL Server, dropping XML in a holding table without even processing inserts during the session. A scheduled job follows up later and inserts as appropriate.

Very helpful.

Very low cost in code to implement - maybe 20 lines or so - and it saved a ton of coding for sending values a row at a time via parameters on a stored procedure. Thank God I hadn't started writing that mess when I found this. Solid gold!!
Post #1009337
Posted Friday, October 22, 2010 10:41 AM


SSC-Enthusiastic

Charles Wannall (10/22/2010)
I can see immediate usefulness in a project where I intend to use it today: I have large tables in Access databases spread all over the country. Ancient app, many users. Not going away. But...must centralize the data for control and for management review and reporting. There's more involved, obviously, but this little gem of an article shows me how I can organize my transmission into 5K chunks and send them to a stored procedure in SQL Server that dumps them in a holding table - no processing except landing in the dump table. Later, I can follow up on SQL Server with a scheduled job to process rows into the proper tables.

The user in Access sees minimal transmission time (no waiting for inserts on the server end). The data gets centralized (yes, GUIDs are added at the transmission source and used throughout from then on).

Fantastic article. Exactly what I needed at exactly the right moment.

The collection into XML costs about 20 lines, including the routine to convert a recordset data row into an XML node.




I would suggest two ideas, both similar.

1. Put an IDataReader on your Jet database (Access .mdb file). As you fire-hose read the Jet data, throw the values into a strongly typed DataSet.
2. Every X number of rows (make it configurable and experiment), push the dataset's GetXml() to a stored procedure. (See the sketch at the end of this post.)

Deviation A:
1. If you care about bandwidth, zip the dataset XML and send it to a service. Or check out binary methods for serializing datasets.
2. You could have a service which unzips (or uses the raw XML string) and processes the rows in bulk.

OR

Put your data into simple POCO objects, add the [Serializable] attribute to the class(es), binary-serialize, and ship to the service.

...
I have an older blog entry which describes the first one:
http://sholliday.spaces.live.com/Blog/cns!A68482B9628A842A!527.entry

...
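
A sketch of suggestion 1; the connection strings, query, and proc name are all placeholders:

using System.Data;
using System.Data.OleDb;
using System.Data.SqlClient;

class FireHoseForwarder
{
    static void Main()
    {
        const int batchSize = 5000; // make it configurable and experiment

        // Buffer table matching the shape of the source rows.
        var ds = new DataSet("Batch");
        DataTable t = ds.Tables.Add("Row");
        t.Columns.Add("Id", typeof(int));
        t.Columns.Add("Name", typeof(string));

        using (var jet = new OleDbConnection(
            @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\legacy.mdb"))
        using (var sql = new SqlConnection("Server=.;Database=MyDb;Integrated Security=true"))
        {
            jet.Open();
            sql.Open();

            using (var cmd = new OleDbCommand("SELECT Id, Name FROM BigTable", jet))
            using (IDataReader rdr = cmd.ExecuteReader()) // forward-only fire-hose
            {
                while (rdr.Read())
                {
                    t.Rows.Add(rdr.GetInt32(0), rdr.GetString(1));
                    if (t.Rows.Count == batchSize)
                        PushBatch(ds, t, sql);
                }
                if (t.Rows.Count > 0)
                    PushBatch(ds, t, sql); // remainder
            }
        }
    }

    static void PushBatch(DataSet ds, DataTable t, SqlConnection sql)
    {
        using (var proc = new SqlCommand("dbo.usp_ImportBatch", sql))
        {
            proc.CommandType = CommandType.StoredProcedure;
            proc.Parameters.Add("@xml", SqlDbType.Xml).Value = ds.GetXml();
            proc.ExecuteNonQuery();
        }
        t.Rows.Clear(); // start the next batch
    }
}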




Post #1009353
Posted Friday, October 22, 2010 12:24 PM
Forum Newbie

Response to the "con" for the XML item, "Difficult to create XML data at the application layer":

Using the C# XmlTextWriter class, I found creating my data in XML format to send to the database fairly simple. XmlTextWriter did all the formatting for me. I could then inspect the generated string and tweak the XmlTextWriter code if need be.

(http://msdn.microsoft.com/en-us/library/system.xml.xmltextwriter.aspx)

I have listed below a sample of the commands I used:

using System.Xml;
using System.IO;

StringWriter writer = new StringWriter();
XmlTextWriter xtWriter = new XmlTextWriter(writer);

xtWriter.WriteStartElement("XML");
xtWriter.WriteStartElement("CutParms");

// Begin loop: write one <Parm> element per row
xtWriter.WriteStartElement("Parm");

// Values come from the page's dropdown controls
xtWriter.WriteAttributeString("DataYear", ddEventYear.SelectedText);
xtWriter.WriteAttributeString("DataType", ddCertificateType.SelectedValue);
xtWriter.WriteAttributeString("CutType", ddCutTypes.SelectedValue);
xtWriter.WriteAttributeString("RevisedStateList", sJurisdictionList);

xtWriter.WriteEndElement(); // </Parm>
// End loop

xtWriter.WriteEndElement(); // </CutParms>
xtWriter.WriteEndElement(); // </XML>
xtWriter.Close();

string sXML = writer.ToString(); // string to send to the database
Post #1009426
Posted Saturday, October 23, 2010 5:45 AM
SSC-Addicted

bruce lee-206043 (8/25/2009)
There is another choice:

Update SQL Server Data by Using XML Updategrams

An updategram is a data structure that you can use to express a change in the data. INSERT, UPDATE, and DELETE commands are represented in an updategram by the difference between the image of the data before and the image of the data after a change.

http://support.microsoft.com/kb/316018

Have you ever tried using it? I did, and it performed badly on even a simple XML format (one level of elements, no more than 10 elements per row, closely resembling the underlying table's layout). For more complicated XML formats it is seriously slow (taking up to minutes to update a single record). We found that this is mostly due to the very complex (and huge) T-SQL statements it generates. The idea of being able to select, insert, delete and update all through the same 'meta model' defined in a single xsd, hiding the actual database layout from the application, is very nice. But the implementation is far from usable as it stands right now.
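
For readers who have not seen one, this is roughly the shape of an updategram per the KB article (table and column names invented here). The engine turns the difference between the before and after images into an UPDATE; an empty before block means INSERT, an empty after block means DELETE:

// Sent through the SQLXML components rather than plain ADO.NET.
string updategram = @"
<ROOT xmlns:updg='urn:schemas-microsoft-com:xml-updategram'>
  <updg:sync>
    <updg:before>
      <Customers CustomerID='1' ContactName='Old Name'/>
    </updg:before>
    <updg:after>
      <Customers CustomerID='1' ContactName='New Name'/>
    </updg:after>
  </updg:sync>
</ROOT>";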




Posting Data Etiquette - Jeff Moden
Posting Performance Based Questions - Gail Shaw
Hidden RBAR - Jeff Moden
Cross Tabs and Pivots - Jeff Moden
Catch-all queries - Gail Shaw


If you don't have time to do it right, when will you have time to do it over?
Post #1009599
Posted Saturday, October 23, 2010 10:08 PM
SSC Journeyman

Hmmmmm...OK. I gotta learn some more stuff.

I will investigate this and follow up with whatever I can master, as improvements, when I understand it well enough to apply it.

Awfully nice of you to send it along. It all sounds like improvements worth making.

Thanks ever so much.

Rick
Post #1009662