Out of memory error while scripting structure and data

  • I am using the Tasks -> Generate Scripts utility to generate a complete (data and schema) script that would allow me to recreate a database on an older SQL Server or using a different collating sequence. Everything works well if my database is small enough (the resulting script is about 5 GB), but I get an out-of-memory error whenever I try with larger databases. I have over 100 GB of free disk space and 16 GB of RAM, and I see memory usage grow toward this limit as the script executes. I increased the maximum memory used by my server without success. Script generation always crashes at the exact same step.

    Does anyone know how to fix this issue?
    Will adding more RAM fix the issue (until, of course, I encounter an even larger database)?
    Is there another application I could use to achieve the same result?

  • cmartel 20772 - Friday, January 18, 2019 3:05 PM


    Without any details of the actual error or where it's coming from, I would suspect it's not the server but rather SSMS running out of memory.
    You can break up how you script things out - for example, script just the schema and then script the data. If it's a lot of data, you can do that incrementally as well: select individual tables and script just those, then do another set, and so on. It's hard to cover every possible way, but the point is you can split up the tasks and what is scripted by changing the options when you generate the scripts.
    You can try to limit the resources SSMS is using by making sure it has just started up (or restart it) before you generate the scripts. Disable any third-party add-ins being used.

    Sue

  • cmartel 20772 - Friday, January 18, 2019 3:05 PM


    The problem is that all of the data is generated in a very RBAR fashion using a full INSERT/VALUES statement for every row.  My recommendation is to generate everything without the data and use BCP to generate the data.

    Better yet, if you want to make a copy of the database, why not just use one of the full backups to restore?  It'll blow the doors off of just about every other method and you won't have to worry about the precedence order in tables due to keys/foreign keys, etc, ad infinitum.
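    The BCP approach can be sketched as one native-format export and import command per table, driven from a small script. This is only an illustration of the idea: the server, database, and table names below are placeholders, while the bcp switches themselves (-n native format, -S server, -T trusted connection, -E keep identity values) are standard bcp options.

```python
# Sketch: build one bcp command per table instead of one giant script.
# Server, database, and table names here are placeholders, not from the thread.

def bcp_out_cmd(database, table, out_dir, server):
    """bcp export command for one table: native format (-n),
    trusted connection (-T)."""
    datafile = f"{out_dir}/{table.replace('.', '_')}.dat"
    return ["bcp", f"{database}.{table}", "out", datafile,
            "-n", "-S", server, "-T"]

def bcp_in_cmd(database, table, out_dir, server):
    """Matching import command; -E preserves identity column values."""
    datafile = f"{out_dir}/{table.replace('.', '_')}.dat"
    return ["bcp", f"{database}.{table}", "in", datafile,
            "-n", "-E", "-S", server, "-T"]

if __name__ == "__main__":
    for table in ("dbo.Customers", "dbo.Orders"):
        print(" ".join(bcp_out_cmd("MyDb", table, "/tmp/bcp", "MYSERVER")))
```

    Each command exports or loads a single table, so no step ever has to hold the whole database's data in memory the way the scripting wizard does.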

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Jeff Moden - Friday, January 18, 2019 5:47 PM


    Except that this is for lower versions so backup/restore isn't an option for the poster.

    Sue

  • Sue_H - Friday, January 18, 2019 7:26 PM


    Crud... missed that in the original post.  Thanks, Sue.

    The BCP option would still work here and would be a ton faster.  You only have to worry about the correct order of things.  Of course, the script and data generator in SQL Server isn't totally faithful there either.

    --Jeff Moden


  • Sue_H - Friday, January 18, 2019 4:04 PM


    Thanks Sue,

    The error is: "Exception of type 'System.OutOfMemoryException' was thrown. at System.Text.StringBuilder.ToString()". This script generation utility is written in .NET, and it seems that the developer accumulates the whole script in a StringBuilder rather than periodically flushing it to disk with a StreamWriter. StringBuilders have a maximum capacity of 2 GB, which is far less than the available memory of my computer (I increased it to 24 GB and got the exact same error at the exact same place).
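    The failure mode described here - growing one in-memory buffer instead of streaming to disk - is easy to illustrate. A hypothetical sketch (with tiny sizes; the real wizard hits the StringBuilder capacity limit):

```python
# Illustration only: the two patterns the poster contrasts.
import os
import tempfile

def script_in_memory(rows):
    """Accumulate the whole script in RAM (the StringBuilder pattern);
    memory use grows with the number of rows."""
    buf = []
    for r in rows:
        buf.append(f"INSERT INTO t VALUES ({r});\n")
    return "".join(buf)  # the entire script exists in memory at once

def script_streamed(rows, path):
    """Write each statement straight to disk (the StreamWriter pattern);
    memory use stays flat no matter how many rows there are."""
    with open(path, "w") as f:
        for r in rows:
            f.write(f"INSERT INTO t VALUES ({r});\n")

path = os.path.join(tempfile.gettempdir(), "demo_script.sql")
script_streamed(range(3), path)
with open(path) as f:
    print(f.read())
```

    With the streamed version, script size is bounded by disk space rather than by process memory, which is exactly the property the wizard lacks.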

    Generating several smaller scripts is not trivial. I tried and got a "circular reference" error. The scripting engine is well designed (but badly implemented!) and ensures that no dependencies are ignored. If your database objects are heavily related to each other, then you basically have to include everything.

    I will therefore use the utility to script the schema only and I will write my own application to transfer the data in the right order.
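    For the "right order" part, one common approach (not necessarily what the poster built) is a topological sort of the foreign-key graph, so every parent table is loaded before the tables that reference it. A sketch using Kahn's algorithm, with a made-up set of tables and FK pairs:

```python
from collections import defaultdict, deque

def load_order(tables, fkeys):
    """Return the tables ordered so that every referenced (parent) table
    comes before the tables that reference it.
    fkeys is a list of (child, parent) pairs; the pairs below are made up."""
    indeg = {t: 0 for t in tables}
    children = defaultdict(list)
    for child, parent in fkeys:
        indeg[child] += 1
        children[parent].append(child)
    queue = deque(t for t in tables if indeg[t] == 0)
    order = []
    while queue:
        t = queue.popleft()
        order.append(t)
        for c in children[t]:
            indeg[c] -= 1
            if indeg[c] == 0:
                queue.append(c)
    if len(order) != len(tables):
        # same trap the scripting wizard reports
        raise ValueError("circular foreign-key reference")
    return order

tables = ["Orders", "Customers", "OrderLines"]
fkeys = [("Orders", "Customers"), ("OrderLines", "Orders")]
print(load_order(tables, fkeys))  # Customers first, OrderLines last
```

    A genuine cycle in the FK graph surfaces here as an error too, which matches the "circular reference" failure the wizard gives; in that case some constraints have to be disabled or dropped during the load.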

  • cmartel 20772 - Monday, January 21, 2019 2:24 PM


    Yeah, that error is the one you get with SSMS and problems building larger scripts. Unfortunately, it's not unusual to get these errors with larger scripts.
    Writing your own app for the data is a good idea. Additionally, as Jeff mentioned earlier, BCP would be a fast option for the data.

    Sue
