• Jeff Moden - Tuesday, December 19, 2017 9:32 PM

    I suspect that you're generating a script that uses a RBAR stored procedure to insert data into tables and that you're generating an EXEC for every row you're trying to insert.

    If that's what you're doing, try the following instead. Whatever "system" you're using to generate the SQLCMD stuff, change it to generate a TAB-delimited file and use BULK INSERT to load the data file into a table. Then, if you insist on continuing to use the RBAR stored procedure to input the data, just loop through the rows of that table to do it.
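
    A minimal sketch of that idea is below. The staging table, file path, column names, and the RBAR procedure name are all placeholders, not something from the original post; adjust them to match the actual data.

        -- Staging table matching the layout of the TAB-delimited export
        -- (placeholder columns for illustration only).
        CREATE TABLE dbo.Staging
        (
            Col1 varchar(100) NOT NULL,
            Col2 varchar(100) NOT NULL
        );

        -- Load the whole file in one bulk operation instead of one EXEC per row.
        BULK INSERT dbo.Staging
        FROM 'C:\Data\export.txt'
        WITH
        (
            FIELDTERMINATOR = '\t',  -- TAB between columns
            ROWTERMINATOR   = '\n',  -- one row per line
            TABLOCK
        );

        -- If the RBAR procedure must stay, loop over the staged rows
        -- (dbo.MyRbarProc is a placeholder for the existing procedure).
        DECLARE @c1 varchar(100), @c2 varchar(100);
        DECLARE cur CURSOR LOCAL FAST_FORWARD FOR
            SELECT Col1, Col2 FROM dbo.Staging;
        OPEN cur;
        FETCH NEXT FROM cur INTO @c1, @c2;
        WHILE @@FETCH_STATUS = 0
        BEGIN
            EXEC dbo.MyRbarProc @p1 = @c1, @p2 = @c2;
            FETCH NEXT FROM cur INTO @c1, @c2;
        END;
        CLOSE cur;
        DEALLOCATE cur;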

    A much better approach would be to still create the TAB-delimited file and BULK INSERT it into a table, and then rewrite the RBAR stored procedure to work in bulk as well.
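
    In the fully set-based version, the row-by-row procedure calls collapse into a single statement. This is only a sketch under the same placeholder names as above; the real rewrite depends on what the procedure actually does per row.

        -- Replace per-row EXECs with one set-based insert from staging
        -- into the target table (dbo.TargetTable is a placeholder).
        INSERT INTO dbo.TargetTable (Col1, Col2)
        SELECT Col1, Col2
        FROM dbo.Staging;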

    You are most surely right, but in this particular case we cannot do that, at least not easily. My question was not about speed optimization, but about whether the script could be run at all.
    But after looking into the script, we found that there were unescaped quotes in some string values, causing SQLCMD to parse the script incorrectly, hence the 'out of memory' error. Why it reported that rather than a syntax error, I do not know.
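
    To illustrate the kind of problem this causes (a hypothetical example, not the actual offending line): an embedded single quote in a T-SQL string literal must be doubled, otherwise the literal terminates early and everything after it is parsed as code.

        -- Broken:  INSERT INTO dbo.Target (Name) VALUES ('O'Brien');
        -- Fixed: the embedded quote is doubled inside the literal.
        INSERT INTO dbo.Target (Name) VALUES ('O''Brien');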

    After we fixed those quotes, the script ran just fine, with SQLCMD consuming at most 15 MB of memory.