• v.bartosh - Tuesday, December 19, 2017 2:15 AM

    We've got a system which generates an SQL script to be executed against a test database, and we were using SQLCMD to run that script. Everything was fine while the script was relatively small, but now we've run into gigabyte sizes and more, and script execution has started failing with an out-of-memory error. The script itself is just a long series of 'exec storedprocedure' calls, split into batches by a 'GO' statement after every 1,000 EXECs (effectively every 21k lines of code, about 1 MB per batch).
    I suppose that SQLCMD tries to read the whole file into memory and then does some splitting/conversion on it, consuming more and more memory. Is there any way to restrict it to reading and running just one batch at a time?

    I suspect that you're generating a script that uses a RBAR stored procedure to insert data into tables and that you're generating an EXEC for every row you're trying to insert.
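
    If so, I'm picturing a generated script shaped something like this (the procedure name and parameters here are made up for the example):

        EXEC dbo.InsertTestRow @Col1 = 1, @Col2 = 'abc';
        EXEC dbo.InsertTestRow @Col1 = 2, @Col2 = 'def';
        -- ... 998 more EXECs ...
        GO
        EXEC dbo.InsertTestRow @Col1 = 1001, @Col2 = 'ghi';
        -- ... and so on for millions of rows ...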

    If that's what you're doing, try the following, instead.  Whatever "system" you're using to generate the SQLCMD stuff, change it to generate a TAB-delimited file and use BULK INSERT to load the data file into a table.  Then, if you insist on continuing to use the RBAR stored procedure to input the data, just loop through the rows of the table to do it, as in the sketch below.
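
    Here's a rough sketch of that approach.  The file path, staging table, and column names are all assumptions for the example, and the proc name matches the hypothetical one above:

        -- Load the whole data file in one shot instead of millions of EXECs.
        CREATE TABLE dbo.Staging (Col1 INT, Col2 VARCHAR(100));

        BULK INSERT dbo.Staging
        FROM 'C:\Temp\TestData.txt'
        WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n', TABLOCK);

        -- If you must keep the RBAR proc, loop through the staged rows.
        DECLARE @Col1 INT, @Col2 VARCHAR(100);
        DECLARE cur CURSOR LOCAL FAST_FORWARD FOR
            SELECT Col1, Col2 FROM dbo.Staging;
        OPEN cur;
        FETCH NEXT FROM cur INTO @Col1, @Col2;
        WHILE @@FETCH_STATUS = 0
        BEGIN
            EXEC dbo.InsertTestRow @Col1 = @Col1, @Col2 = @Col2;
            FETCH NEXT FROM cur INTO @Col1, @Col2;
        END;
        CLOSE cur;
        DEALLOCATE cur;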

    A much better thing to do would be to still create the TAB-delimited file and BULK INSERT it into a table and then rewrite the RBAR stored procedure to do things in bulk, as well.
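
    Done that way, the whole load collapses to a single set-based statement (target table and columns again hypothetical):

        -- One INSERT...SELECT replaces every one of those per-row proc calls.
        INSERT INTO dbo.TargetTable (Col1, Col2)
        SELECT Col1, Col2
          FROM dbo.Staging;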

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)