Insert Statement for 2 million rows
Author
Message
Posted Thursday, March 13, 2014 10:11 AM
pretea
SSC-Enthusiastic

Group: General Forum Members
Last Login: Tuesday, August 16, 2016 3:50 PM
Points: 171, Visits: 559
Hi!

I have multiple (individual) INSERT statements to insert around 2 million rows (I'm using sqlcmd because the insert script is around 8 thousand lines).

It took about an hour to insert when I tried it on my machine, which is a physical server with 4 GB of RAM.

When I run the same script on my test server, which is a virtual machine, it is still running; we have bumped the memory up from 2 to 4 GB of RAM.

Any suggestions?
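
For reference, I'm running the script roughly like this (the server, database, and file names here are placeholders):

sqlcmd -S TESTSERVER -d TargetDB -E -i insert_script.sql -o insert_log.txt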
Post #1550803
Posted Thursday, March 13, 2014 10:20 AM


Jeff Moden
SSC-Forever

Group: General Forum Members
Last Login: Yesterday @ 7:52 PM
Points: 41,198, Visits: 38,504
pretea (3/13/2014)
Hi!

I have multiple (individual) INSERT statements to insert around 2 million rows (I'm using sqlcmd because the insert script is around 8 thousand lines).

It took about an hour to insert when I tried it on my machine, which is a physical server with 4 GB of RAM.

When I run the same script on my test server, which is a virtual machine, it is still running; we have bumped the memory up from 2 to 4 GB of RAM.

Any suggestions?


Why is the insert statement around 8 thousand lines?


--Jeff Moden
"RBAR is pronounced "ree-bar" and is a "Modenism" for "Row-By-Agonizing-Row".

First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column."

Helpful Links:
How to post code problems
How to post performance problems
Post #1550809
Posted Thursday, March 13, 2014 10:30 AM
pretea
SSC-Enthusiastic

Group: General Forum Members
Last Login: Tuesday, August 16, 2016 3:50 PM
Points: 171, Visits: 559
That's just how the vendor sent the script.

The other thing I can try is to ask them to load the data into a text file, and then we can use BCP.
Post #1550814
Posted Thursday, March 13, 2014 11:06 AM


Jeff Moden
SSC-Forever

Group: General Forum Members
Last Login: Yesterday @ 7:52 PM
Points: 41,198, Visits: 38,504
pretea (3/13/2014)
That's just how the vendor sent the script.

The other thing I can try is to ask them to load the data into a text file, and then we can use BCP.


Heh... I can only imagine what that script must look like.

Yes, I agree. Load the text file into a staging table using BCP, validate the data, and then insert it into the final table.
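
Something along these lines, where the server, table, and file names are all just placeholders (the text file is assumed to be tab delimited, which is bcp's default with the -c switch):

rem Bulk load the vendor's text file into a staging table.
bcp TargetDB.dbo.VendorStaging in "C:\data\vendor_data.txt" -S TESTSERVER -T -c

-- Then validate in staging and move only the good rows to the final table.
INSERT INTO dbo.FinalTable (Col1, Col2)
SELECT Col1, Col2
FROM dbo.VendorStaging
WHERE Col1 IS NOT NULL; -- substitute the real validation rules here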


--Jeff Moden
"RBAR is pronounced "ree-bar" and is a "Modenism" for "Row-By-Agonizing-Row".

First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column."

Helpful Links:
How to post code problems
How to post performance problems
Post #1550833
Posted Thursday, March 13, 2014 11:29 AM


Eric M Russell
Hall of Fame

Group: General Forum Members
Last Login: Yesterday @ 2:42 PM
Points: 3,717, Visits: 8,556
pretea (3/13/2014)
Hi!

I have multiple (individual) INSERT statements to insert around 2 million rows (I'm using sqlcmd because the insert script is around 8 thousand lines).

It took about an hour to insert when I tried it on my machine, which is a physical server with 4 GB of RAM.

When I run the same script on my test server, which is a virtual machine, it is still running; we have bumped the memory up from 2 to 4 GB of RAM.

Any suggestions?


Performance on a workstation PC, much less a VM running on your PC, will bear little resemblance to performance on a typical production server. That said, the first option should be to get a delimited text version of the records and then BCP them into the database, as others have suggested. You can use the 'Export Data' wizard to copy the records from the original source database to a text file.
The problem with a script is that all 2 million inserts are probably getting batched into one huge transaction. If all you have to work with is this single script, then use a tool like GSplit to chop it into separate smaller batches. I'm guessing 20 .sql scripts with 100,000 inserts per batch should take it down to a manageable level. You can then use a PowerShell script, or SQLCMD and a batch file, to execute the scripts sequentially.
http://www.gdgsoft.com/gsplit/
http://technet.microsoft.com/en-us/library/ms170572.aspx
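
For example, a simple batch file like this could run the split scripts in order (the server, database, and file names are hypothetical; it assumes the pieces are named insert_part_1.sql through insert_part_20.sql):

@echo off
rem Run the 20 split scripts in sequence and stop on the first failure.
rem sqlcmd's -b switch makes it return a nonzero exit code when a batch errors.
for /L %%i in (1,1,20) do (
    sqlcmd -S TESTSERVER -d TargetDB -E -b -i insert_part_%%i.sql
    if errorlevel 1 exit /b 1
)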



"If you break a few eggs, then make an omelet."
Post #1550848
Posted Thursday, March 13, 2014 12:02 PM


Jeff Moden
SSC-Forever

Group: General Forum Members
Last Login: Yesterday @ 7:52 PM
Points: 41,198, Visits: 38,504
Eric M Russell (3/13/2014)
pretea (3/13/2014)
Hi!

I have multiple (individual) INSERT statements to insert around 2 million rows (I'm using sqlcmd because the insert script is around 8 thousand lines).

It took about an hour to insert when I tried it on my machine, which is a physical server with 4 GB of RAM.

When I run the same script on my test server, which is a virtual machine, it is still running; we have bumped the memory up from 2 to 4 GB of RAM.

Any suggestions?


Performance on a workstation PC, much less a VM running on your PC, will bear little resemblance to performance on a typical production server. That said, the first option should be to get a delimited text version of the records and then BCP them into the database, as others have suggested. You can use the 'Export Data' wizard to copy the records from the original source database to a text file.
The problem with a script is that all 2 million inserts are probably getting batched into one huge transaction. If all you have to work with is this single script, then use a tool like GSplit to chop it into separate smaller batches. I'm guessing 20 .sql scripts with 100,000 inserts per batch should take it down to a manageable level. You can then use a PowerShell script, or SQLCMD and a batch file, to execute the scripts sequentially.
http://www.gdgsoft.com/gsplit/
http://technet.microsoft.com/en-us/library/ms170572.aspx


BCP will actually do all of that.
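
For example, bcp's -b switch commits the load in batches of n rows, so there's no need to split anything by hand (same placeholder names as above):

rem Commit every 100,000 rows instead of loading 2 million in one transaction.
bcp TargetDB.dbo.VendorStaging in "C:\data\vendor_data.txt" -S TESTSERVER -T -c -b 100000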


--Jeff Moden
"RBAR is pronounced "ree-bar" and is a "Modenism" for "Row-By-Agonizing-Row".

First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column."

Helpful Links:
How to post code problems
How to post performance problems
Post #1550868
Posted Thursday, March 13, 2014 12:19 PM


Eric M Russell
Hall of Fame

Group: General Forum Members
Last Login: Yesterday @ 2:42 PM
Points: 3,717, Visits: 8,556
Jeff Moden (3/13/2014)
Eric M Russell (3/13/2014)
pretea (3/13/2014)
Hi!

I have multiple (individual) INSERT statements to insert around 2 million rows (I'm using sqlcmd because the insert script is around 8 thousand lines).

It took about an hour to insert when I tried it on my machine, which is a physical server with 4 GB of RAM.

When I run the same script on my test server, which is a virtual machine, it is still running; we have bumped the memory up from 2 to 4 GB of RAM.

Any suggestions?


Performance on a workstation PC, much less a VM running on your PC, will bear little resemblance to performance on a typical production server. That said, the first option should be to get a delimited text version of the records and then BCP them into the database, as others have suggested. You can use the 'Export Data' wizard to copy the records from the original source database to a text file.
The problem with a script is that all 2 million inserts are probably getting batched into one huge transaction. If all you have to work with is this single script, then use a tool like GSplit to chop it into separate smaller batches. I'm guessing 20 .sql scripts with 100,000 inserts per batch should take it down to a manageable level. You can then use a PowerShell script, or SQLCMD and a batch file, to execute the scripts sequentially.
http://www.gdgsoft.com/gsplit/
http://technet.microsoft.com/en-us/library/ms170572.aspx


BCP will actually do all of that.

Yes, BCP is the best option, unless all that's available is the insert script. We cannot BCP a .sql insert script.



"If you break a few eggs, then make an omelet."
Post #1550874
Posted Thursday, March 13, 2014 12:54 PM
pretea
SSC-Enthusiastic

Group: General Forum Members
Last Login: Tuesday, August 16, 2016 3:50 PM
Points: 171, Visits: 559
Yes, let's see. I have asked them to send a text version of the records so that we can use BCP; if not, I'll try smaller batches.

Thanks for all the suggestions.
Post #1550888