Processing of XML with OPENROWSET gives out of memory exception

  • I have an XML file of about 25 MB. When I read it with OPENROWSET it gives me 'System.OutOfMemoryException'. The machine I'm running on has 16 GB of RAM and memory is definitely not exceeded. When I run smaller XML files it works fine.

    I've read that older versions of SQL Server had this problem and it was caused by the parser having a limited amount of memory. Is this still the case? Is there a way to change this?

  • Change the result set from Grid to Text.

  • If the issue is with the result set being returned to the screen, there is an option to increase this size in SSMS:

    Options --> Query Results --> SQL Server --> Results to Grid --> XML Data: [set value here]

  • This is inside a stored procedure. Results are not returned to the screen. I'm currently running the procedure in query analyzer to test it, but it's going to be called as part of a web service process.

  • Try delivering it to an XML or VARCHAR(MAX) variable first (SELECT @xmlVar = Stuff FROM OPENQUERY...) before shipping it along to the web service.

    If I'm right, you can get that because the cache doesn't reserve enough room for the plan in memory, since it doesn't expect anything that size. The variable should let the compiler know to use more room.

    Please be aware I'm not SURE of this. I'm guessing, and I don't have SQL 2012 to goof off with here to confirm, but it should be simple enough to try it.
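
    A minimal sketch of that idea, assuming the file is loaded with OPENROWSET ... BULK as in the original post (the file path and variable name here are placeholders, not from the thread):

    ```sql
    DECLARE @xmlVar XML;

    -- Land the blob in a variable first instead of returning it directly
    SELECT @xmlVar = CAST(x.BulkColumn AS XML)
    FROM OPENROWSET(
        BULK 'C:\TEMP\YourFile.xml',   -- placeholder path
        SINGLE_BLOB) AS x;

    -- @xmlVar can now be shredded or handed to the web service layer
    SELECT @xmlVar;
    ```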


    - Craig Farrell

    Never stop learning, even if it hurts. Ego bruises are practically mandatory as you learn unless you've never risked enough to make a mistake.


    Twitter: @AnyWayDBA

  • One thing I have noticed recently is that T-SQL is not the best tool to use when it comes to XML. C# is far more efficient at searching and returning the data you need.

    I am not sure what your skill set is, but if there is any way you could pass this XML file to a WCF service and then use the XDocument class to return the data you want in a list, that might work out better.

    I recently had an issue with a SQL job that was using XQuery to return results from an XML file to a temp table, which we would then use later on in our process. Everything was fine and dandy until a 15 MB file came along and it took 2 hours to scan this XML file to return the results.

    I then created a C# service that did the same thing and was able to get the data I needed from the XML file in under 1 minute!

    C# is my tool of choice when it comes to large XML files.

  • N_Muller (6/20/2014)


    I have an XML file of about 25 MB. When I read it with OPENROWSET it gives me 'System.OutOfMemoryException'. The machine I'm running on has 16 GB of RAM and memory is definitely not exceeded. When I run smaller XML files it works fine.

    I've read that older versions of SQL Server had this problem and it was caused by the parser having a limited amount of memory. Is this still the case? Is there a way to change this?

    This is possibly a problem with the XML data; 25 MB is not a large set, as the limit for a single XML blob is 2 GB.

    😎

    Here is a simple way of checking this. First, produce a reasonably sized XML set (ca. 350 MB) in C:\TEMP:

    bcp "SELECT sm.* FROM sys.all_sql_modules sm cross join (select n from (values (1),(1),(1),(1),(1),(1),(1),(1),(1),(1),(1),(1),(1),(1),(1),(1),(1),(1),(1),(1),(1)) AS X(N)) AS X FOR XML PATH('SQL_OBJ'), TYPE, ROOT('ObjList')" queryout C:\TEMP\sys_all_sql_modules_large.xml -w -SServerName -T

    Create the destination table in tempdb:

    USE tempdb;
    GO
    CREATE TABLE dbo.TBL_XML_DOC
    (
     XML_DOC_ID   INT IDENTITY(1,1) PRIMARY KEY CLUSTERED NOT NULL
    ,XML_DOCUMENT XML NOT NULL
    );

    Import the data

    INSERT INTO dbo.TBL_XML_DOC(XML_DOCUMENT)
    SELECT x.BulkColumn
    FROM OPENROWSET(
        BULK 'C:\TEMP\sys_all_sql_modules_large.xml',
        SINGLE_BLOB) AS x;

    Check the import by shredding with the nodes function

    SELECT
     TX.XML_DOC_ID
    ,SQ.LOBJ.query('.')
    FROM dbo.TBL_XML_DOC TX
    OUTER APPLY TX.XML_DOCUMENT.nodes('ObjList/SQL_OBJ') AS SQ(LOBJ);
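
    If the import above fails on real data, a quick way to confirm whether the file itself is malformed is to attempt the cast inside TRY...CATCH; the parser's error message typically reports the line and character where parsing stopped. (This check is my sketch, not from the thread, and the file path is a placeholder.)

    ```sql
    BEGIN TRY
        DECLARE @x XML;
        -- The CAST fails on malformed or mis-encoded XML
        SELECT @x = CAST(src.BulkColumn AS XML)
        FROM OPENROWSET(
            BULK 'C:\TEMP\YourFile.xml',   -- placeholder path
            SINGLE_BLOB) AS src;
        PRINT 'XML parsed without errors';
    END TRY
    BEGIN CATCH
        -- e.g. an "XML parsing: line N, character N, ..." message
        PRINT ERROR_MESSAGE();
    END CATCH;
    ```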

  • brad.mason5 (6/20/2014)


    One thing I have noticed recently is that T-SQL is not the best tool to use when it comes to XML. C# is far more efficient at searching and returning the data you need.

    I am not sure what your skill set is, but if there is any way you could pass in this XML file to a WCF Service and then use XDocument class to return the data you want in a list that might work out better.

    I recently had an issue with a SQL job that was using XQuery to return results from an XML file to a temp table, which we would then use later on in our process. Everything was fine and dandy until a 15 MB file came along and it took 2 hours to scan this XML file to return the results.

    I then created a C# service that did the same thing and was able to get the data I needed from the XML file in under 1 minute!

    C# is my tool of choice when it comes to large XML files.

    I find that XQuery in T-SQL normally outperforms C# and other methods if done correctly. It is, however, easy to get wrong. The import part of the example in my previous post, which is 350 MB, runs for about 10 seconds on my laptop, including the shredding. If, for example, one tries to parse it on the way in (see the code below), it takes in excess of an hour!

    😎

    Wrong way, do not use!

    INSERT INTO dbo.TBL_XML_DOC(XML_DOCUMENT)
    SELECT
     SQ.LOBJ.query('.')
    FROM
    (
        SELECT CAST(BulkColumn AS XML) AS XML_DATA
        FROM OPENROWSET(
            BULK 'C:\TEMP\sys_all_sql_modules_large.xml',
            SINGLE_BLOB) AS X
    ) AS X
    OUTER APPLY X.XML_DATA.nodes('ObjList/SQL_OBJ') AS SQ(LOBJ);
