Fastest Way to consume large XML files
Posted Monday, July 28, 2014 12:47 AM
SSC-Enthusiastic


Group: General Forum Members
Last Login: Monday, July 28, 2014 8:16 AM
Points: 111, Visits: 181
Hi everyone

I want to pick your brain on something.

I need to create a script that will import large XML files (500 MB to 7 GB) on a daily basis and store the data in a relational database structure.

What is the best and fastest way of importing such files? I have played around with smaller files and found the following:

1. SSIS XML Data Source: it doesn't seem to like the complex element types and rejects the file.
2. Bulk file import, storing the file in an XML variable and using XQuery to parse it (a sketch of this approach follows the list): this works, but it can't take a file of more than 2 GB, so I can't use this method.
3. C# + XML serialization: this also works, but seems terribly slow. I open the DB connection once, so it doesn't open and close for each call, but it still takes a long time.
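
For illustration, here is a minimal sketch of what I mean by approach 2 (the table, file path and element names are made up; the xml data type's 2 GB cap is exactly what stops it working on the big feeds):

DECLARE @x xml;

-- load the whole file into one XML value (this is where the 2 GB limit bites)
SELECT @x = CAST(BulkColumn AS xml)
FROM OPENROWSET(BULK 'C:\feeds\daily.xml', SINGLE_BLOB) AS f;

-- shred it with XQuery into the relational target (illustrative names)
INSERT INTO dbo.Customer (CustomerId, CustomerName)
SELECT c.value('(Id)[1]',   'int'),
       c.value('(Name)[1]', 'varchar(100)')
FROM @x.nodes('/Feed/Customer') AS t(c);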

Are there any other suggestions on how to import large XML quickly into a relational table structure?

Thank you
Post #1596696
Posted Monday, July 28, 2014 4:48 AM


Ten Centuries


Group: General Forum Members
Last Login: Yesterday @ 8:09 AM
Points: 1,191, Visits: 9,888
C# XML serialisation should be quickest if implemented correctly.

How have you implemented it? It sounds like you might be doing individual inserts for each row via a database connection embedded in the C# code. Try a different way: implement it as an SSIS script component source in a data flow and add rows to the output buffer within the read loop of the XmlReader. That way, you're streaming the data into a bulk-load destination, which is much more efficient.

You could also initialise a bulk load from within .NET to keep it as a discrete application, but doing it via SSIS is much easier.
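
If you do keep it as a stand-alone .NET application, the idea is to stream with an XmlReader and hand batches to SqlBulkCopy rather than issuing single inserts. A rough sketch (the connection string, file path, element, column and table names are all invented for the example):

using System.Data;
using System.Data.SqlClient;
using System.Xml;
using System.Xml.Linq;

class XmlBulkLoader
{
    static void Main()
    {
        // illustrative values only
        const string connectionString = "Data Source=.;Initial Catalog=Staging;Integrated Security=SSPI";
        const string xmlPath = @"C:\feeds\daily.xml";
        const int batchSize = 10000;

        var batch = new DataTable();
        batch.Columns.Add("CustomerId", typeof(int));
        batch.Columns.Add("CustomerName", typeof(string));

        using (var bulk = new SqlBulkCopy(connectionString))
        using (var reader = XmlReader.Create(xmlPath))
        {
            bulk.DestinationTableName = "dbo.Customer";   // hypothetical target table

            reader.MoveToContent();
            while (!reader.EOF)
            {
                if (reader.NodeType == XmlNodeType.Element && reader.Name == "Customer")
                {
                    // materialise one record at a time, never the whole document
                    var e = (XElement)XNode.ReadFrom(reader);
                    batch.Rows.Add((int)e.Element("Id"), (string)e.Element("Name"));

                    if (batch.Rows.Count >= batchSize)
                    {
                        bulk.WriteToServer(batch);   // one bulk insert per batch
                        batch.Clear();
                    }
                }
                else
                {
                    reader.Read();
                }
            }

            if (batch.Rows.Count > 0)
                bulk.WriteToServer(batch);   // flush the final partial batch
        }
    }
}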
Post #1596753
Posted Monday, July 28, 2014 4:56 AM
SSC-Enthusiastic


Group: General Forum Members
Last Login: Monday, July 28, 2014 8:16 AM
Points: 111, Visits: 181
Thanks Howard

I am doing single-record inserts at the moment; I'll combine it with an SSIS data flow and see how it goes.

Great advice, thanks!
Post #1596755
Posted Monday, July 28, 2014 4:59 AM
SSC-Enthusiastic


Group: General Forum Members
Last Login: Monday, July 28, 2014 8:16 AM
Points: 111, Visits: 181
How would I set up the C# code as a data flow source, though?
Post #1596758
Posted Monday, July 28, 2014 5:13 AM


Ten Centuries


Group: General Forum Members
Last Login: Yesterday @ 8:09 AM
Points: 1,191, Visits: 9,888
There's a basic tutorial here:

http://sql31.blogspot.co.uk/2013/03/how-to-use-script-component-as-data.html

Basically, most of your code goes into CreateNewOutputRows() and you just call Output0Buffer.AddRow() as you iterate through the XmlReader...
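
To give a rough idea of the shape it takes (the file path, element names and output columns below are invented, and Output 0 would need matching columns defined; the script also needs using directives for System.Xml and System.Xml.Linq):

public override void CreateNewOutputRows()
{
    // in a real package the path would come from a read-only package variable
    using (XmlReader reader = XmlReader.Create(@"C:\feeds\daily.xml"))
    {
        reader.MoveToContent();
        while (!reader.EOF)
        {
            if (reader.NodeType == XmlNodeType.Element && reader.Name == "Customer")
            {
                // materialise just this one element, never the whole document
                XElement row = (XElement)XNode.ReadFrom(reader);

                Output0Buffer.AddRow();
                Output0Buffer.CustomerId = (int)row.Element("Id");
                Output0Buffer.CustomerName = (string)row.Element("Name");
            }
            else
            {
                reader.Read();
            }
        }
    }
}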
Post #1596768
Posted Monday, July 28, 2014 5:29 AM
SSC-Enthusiastic


Group: General Forum Members
Last Login: Monday, July 28, 2014 8:16 AM
Points: 111, Visits: 181
Just a last question: I can see how this will work when the XML gets transformed into one table.

If the record consists of 80-odd tables, I will have to run the same processing script for each block of data and output each element to its corresponding table.

Would you advise perhaps using XML serialization, exporting each element to a flat file with all the PKs/FKs included, and then using a flat-file source to import each one?

I am just worried about the overhead of running the XML process for each element, or will this be a non-issue?
Post #1596779
Posted Monday, July 28, 2014 5:36 AM


Ten Centuries


Group: General Forum Members
Last Login: Yesterday @ 8:09 AM
Points: 1,191, Visits: 9,888
SSIS script component sources support multiple output buffers. 80 tables sounds pretty extreme, but you could, in theory, add 80 output buffers to a data flow feeding 80 destinations with a single pass through the actual file!

In practice, you're probably working at the extremes, so you should try both methods (and also spend time tuning the buffer size parameters). I would avoid making 80 passes of the file for obvious reasons.
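
As a very rough sketch of the multiple-output idea, assuming two outputs renamed to Order and OrderLine (which the component then exposes as OrderBuffer and OrderLineBuffer); the element and column names are invented:

public override void CreateNewOutputRows()
{
    using (XmlReader reader = XmlReader.Create(@"C:\feeds\daily.xml"))   // illustrative path
    {
        reader.MoveToContent();
        while (!reader.EOF)
        {
            if (reader.NodeType == XmlNodeType.Element && reader.Name == "Order")
            {
                XElement order = (XElement)XNode.ReadFrom(reader);
                int orderId = (int)order.Element("Id");

                // the parent element goes to one output...
                OrderBuffer.AddRow();
                OrderBuffer.OrderId = orderId;

                // ...and its children go to a second output in the same pass
                foreach (XElement line in order.Elements("Line"))
                {
                    OrderLineBuffer.AddRow();
                    OrderLineBuffer.OrderId = orderId;
                    OrderLineBuffer.Sku = (string)line.Element("Sku");
                }
            }
            else
            {
                reader.Read();
            }
        }
    }
}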
Post #1596782
Posted Monday, July 28, 2014 5:50 AM
SSC-Enthusiastic


Group: General Forum Members
Last Login: Monday, July 28, 2014 8:16 AM
Points: 111, Visits: 181
It's an international standard for our trade, so we can't make changes to the structure.

The suggestion you made sounds near perfect, though. I'll implement it and play around with the buffer settings etc.

Thanks again.
Post #1596789
Posted Monday, July 28, 2014 6:08 AM
SSCrazy


Group: General Forum Members
Last Login: Today @ 2:08 PM
Points: 2,253, Visits: 6,172
Quick thought: 80 outputs is quite a lot, so it might be worth looking into XQuery in T-SQL for shredding the XML.


I have done a few of these huge XML imports in the past; one of the fastest methods I've used is to bulk load the entire file, line by line, into a staging table and reconstruct the XML from there using FOR XML. It sounds daunting but is actually pretty quick. If I recall correctly, a 4-core server with 8 GB RAM running SQL Server 2008 R2 averaged around 5-6 GB of XML an hour.
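
Very roughly, the pattern looks like this (not my exact code; the table, tag and column names are invented, it assumes each record's opening tag starts on its own line, and the running-total windowing needs SQL Server 2012 or later):

-- staging table for the raw lines; the identity preserves file order
CREATE TABLE dbo.XmlStage
(
    LineId  int IDENTITY(1,1) PRIMARY KEY,
    LineTxt nvarchar(max) NULL
);
GO
-- BULK INSERT can't skip the identity column without a format file,
-- so load through a view that exposes only the text column
CREATE VIEW dbo.XmlStageLoad AS SELECT LineTxt FROM dbo.XmlStage;
GO
BULK INSERT dbo.XmlStageLoad
FROM 'C:\feeds\daily.xml'
WITH (ROWTERMINATOR = '\n');
GO
WITH Lines AS
(
    -- drop the XML declaration and the root element's own tags, leaving
    -- a stream of <Record>...</Record> fragments
    SELECT LineId, LineTxt
    FROM dbo.XmlStage
    WHERE LineTxt NOT LIKE '<?xml%'
      AND LineTxt NOT LIKE '<Feed%'
      AND LineTxt NOT LIKE '</Feed%'
),
Grouped AS
(
    -- running total of opening tags = one group number per record
    -- (use a correlated COUNT instead on pre-2012 versions)
    SELECT LineId, LineTxt,
           SUM(CASE WHEN LineTxt LIKE '%<Record%' THEN 1 ELSE 0 END)
               OVER (ORDER BY LineId) AS RecordNo
    FROM Lines
)
SELECT RecordNo,
       CAST((SELECT g2.LineTxt + ''
             FROM Grouped g2
             WHERE g2.RecordNo = g.RecordNo
             ORDER BY g2.LineId
             FOR XML PATH(''), TYPE).value('.', 'nvarchar(max)') AS xml) AS RecordXml
INTO dbo.XmlRecords
FROM Grouped g
GROUP BY RecordNo;
GO
-- each RecordXml is well under the 2 GB cap, so it can be shredded into the
-- target tables with the usual .nodes()/.value() calls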
Post #1596794
Posted Monday, July 28, 2014 6:12 AM


Ten Centuries


Group: General Forum Members
Last Login: Yesterday @ 8:09 AM
Points: 1,191, Visits: 9,888
I think you'd still have a 2GB max size that you can manipulate through XQuery, wouldn't you?
Post #1596796