June 23, 2010 at 8:01 am
Hi,
I am looking at copying a vast amount of data between SQL 2000 servers (SP4).
The source table has 615 million rows (don't ask, I didn't create it!) and takes a long time to return any more than 1,000 rows. I have a script (below) that returns a result set of 17 million rows, which has to be copied into a table on another SQL Server. I'm not sure which copy method to use, as I've never copied that many rows before. Which would perform best: bcp, DTS, or BULK INSERT?
DECLARE @SD smalldatetime
DECLARE @ED smalldatetime
SET @SD = '20090301' -- do a month at a time (batch it)
SET @ED = '20090401'

SELECT * -- this will be changed to select all named cols
FROM transactionitems WITH (NOLOCK)
WHERE businessdate >= @SD AND businessdate < @ED
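The "month at a time" comment can be turned into a loop that walks the date windows. A minimal sketch, assuming the same source table and a year's worth of data; the destination table name and column list are placeholders:

```sql
DECLARE @SD smalldatetime
DECLARE @ED smalldatetime
SET @SD = '20090101'

WHILE @SD < '20100101'  -- one month per batch across the year
BEGIN
    SET @ED = DATEADD(month, 1, @SD)

    -- each window feeds whichever copy method you pick (bcp out, DTS, etc.)
    SELECT businessdate /* , other named cols */
    FROM transactionitems WITH (NOLOCK)
    WHERE businessdate >= @SD AND businessdate < @ED

    SET @SD = @ED
END
```

Smaller windows keep each extract (and the destination's transaction log) to a manageable size.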
The data to copy does not contain any BLOB or TEXT/IMAGE columns.
Any ideas?
Thanks in advance
qh
June 24, 2010 at 7:11 am
Found this out in the end.
The DTS data copy with transformations bombed out after an hour, so I went with a bcp file out from the source db (which created a 4 GB .dat file), then a BULK INSERT back into the destination db. 18 million rows copied in 4 minutes!
Well done me! 😉
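For anyone following along, a bcp-out / BULK INSERT round trip of this kind generally takes the following shape. The file path, table names, server name, and option values here are illustrative, not the poster's actual settings:

```sql
-- 1) Export from the source server (run from a command prompt):
--    bcp "SELECT col1, col2, businessdate FROM db.dbo.transactionitems WHERE businessdate >= '20090301' AND businessdate < '20090401'" queryout D:\export\titems_200903.dat -n -S SourceServer -T
--    (-n = native format, faster than character format when both ends are SQL Server;
--     -T = trusted connection)

-- 2) Import into the destination table:
BULK INSERT dbo.transactionitems_copy
FROM 'D:\export\titems_200903.dat'
WITH (
    DATAFILETYPE = 'native',  -- matches the -n bcp export
    BATCHSIZE    = 100000,    -- commit in chunks to keep the log manageable
    TABLOCK                   -- allows minimally logged loads into an empty heap
);
```

With the destination database in simple or bulk-logged recovery and no indexes on the target during the load, BULK INSERT with TABLOCK can be minimally logged, which is consistent with the timing reported above.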
fc
June 24, 2010 at 7:43 am
quackhandle1975 (6/24/2010)
Found this out in the end. DTS data copy with transformations bombed out after an hour so went with BCP file out from source db - which created a 4GB .dat file - then BULK INSERT back into destination db - 18m rows copied in 4 minutes!
Well done me! 😉
fc
Thanks for the feedback on this.
--Jeff Moden
Change is inevitable... Change for the better is not.
November 20, 2010 at 6:09 pm
Could you please post an example of your BULK INSERT statement? Also, did you have any clustered indexes?
Thanks.