• Eric M Russell (3/13/2014)


    pretea (3/13/2014)


    Hi!

    I have multiple (individual) INSERT statements to insert around 2 million rows (using sqlcmd, because the insert script is around 8 thousand lines).

    It took about an hour to insert when I tried it on my machine, which is a physical server with 4 GB RAM.

    When I try to run the same on my test server, which is a virtual machine, it is still running; we have bumped the memory up from 2 GB to 4 GB RAM.

    Any suggestions?

    Performance on a workstation PC, much less a VM running on your PC, will bear little resemblance to performance on a typical production server. That said, the first option should be to get a delimited text version of the records and then BCP it into the database, as others have suggested. You can use the 'Export Data' wizard to copy records from the original source database to a text file.
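
    For example, a minimal sketch of that round trip with bcp might look like the following (the server, database, table, and file names here are just placeholders for whatever you actually have, and the switches depend on your environment):

        # export the source table to a comma-delimited text file (hypothetical names)
        bcp SourceDB.dbo.BigTable out C:\temp\BigTable.txt -S SourceServer -T -c -t ","

        # bulk load the file into the target table, committing every 10,000 rows
        bcp TargetDB.dbo.BigTable in C:\temp\BigTable.txt -S TargetServer -T -c -t "," -b 10000

    The -b switch is what keeps the load from becoming one giant transaction: each batch is committed and logged separately.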

    The problem with a script is that all 2 million inserts are probably getting batched into one huge transaction. If all you have to work with is this single script, then use a tool like GSplit to chop it into separate, smaller batches. I'm guessing 20 .sql scripts with 100,000 inserts per batch should take it down to a manageable level. Also, you can use a PowerShell script, or SQLCMD and a batch file, to execute the scripts sequentially (a rough sketch follows the links below).

    http://www.gdgsoft.com/gsplit/

    http://technet.microsoft.com/en-us/library/ms170572.aspx
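
    Something along these lines would run the split scripts in order; this is only a sketch, and the folder path, server, and database names are placeholders:

        # run each split .sql file in name order against the target database
        Get-ChildItem 'C:\temp\splits\*.sql' | Sort-Object Name | ForEach-Object {
            sqlcmd -S TargetServer -d TargetDB -E -b -i $_.FullName
            if ($LASTEXITCODE -ne 0) { throw "Batch failed: $($_.Name)" }
        }

    The -b switch makes sqlcmd return a non-zero exit code on error, so the loop stops at the first failed batch instead of silently continuing.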

    BCP will actually do all of that.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)