August 6, 2009 at 4:08 am
Guru, I want to move 99,00,000 (9.9 million) records from one table to another.
I tried the scenarios below, but they are impacting performance.
1.
INSERT INTO table1
SELECT * FROM table2
2.
Row-by-row processing using a cursor.
Need your suggestions.
Abhijit - http://abhijitmore.wordpress.com
August 6, 2009 at 6:18 am
During a maintenance window I would...
1- Set database to simple recovery model
2- Load table as SELECT INTO
3- Set database back to original recovery model
4- Take a backup
Another alternative would be to bcp out / bcp in.
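The steps above can be sketched roughly as follows (database name, table names, and the backup path are placeholders, not anything from the original post):

```sql
-- 1- Switch to the simple recovery model (MyDb is a placeholder name)
ALTER DATABASE MyDb SET RECOVERY SIMPLE;

-- 2- SELECT INTO creates the destination table and is minimally logged
--    under the simple (or bulk-logged) recovery model
SELECT *
INTO   dbo.table1
FROM   dbo.table2;

-- 3- Restore the original recovery model (assumed to be FULL here)
ALTER DATABASE MyDb SET RECOVERY FULL;

-- 4- Take a backup afterwards (path is a placeholder)
BACKUP DATABASE MyDb TO DISK = N'X:\Backups\MyDb.bak';
```

Note that SELECT INTO creates a brand-new table, so this only applies when the destination does not already exist.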
_____________________________________
Pablo (Paul) Berzukov
Author of Understanding Database Administration available at Amazon and other bookstores.
Disclaimer: Advice is provided to the best of my knowledge but no implicit or explicit warranties are provided. Since the advisor explicitly encourages testing any and all suggestions on a test non-production environment, the advisor should not be held liable or responsible for any actions taken based on the given advice.
August 6, 2009 at 6:31 am
In addition to Paul's suggestions: you can also create the table without any constraints or primary key, which will improve performance. You can create the primary key and constraints later.
In either case, "BCP" or "SELECT INTO" with the database put into simple recovery will work well, followed by a full backup.
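A minimal sketch of that load-first, constrain-later approach (the column names and constraint name are made up for illustration):

```sql
-- Create the destination bare: no primary key or constraints yet,
-- so the insert does not have to maintain an index as it loads.
CREATE TABLE dbo.table1
(
    id   int         NOT NULL,
    col1 varchar(50) NULL
);

INSERT INTO dbo.table1 (id, col1)
SELECT id, col1
FROM   dbo.table2;

-- Build the primary key (and any other constraints) in one pass
-- after the data is in place.
ALTER TABLE dbo.table1
    ADD CONSTRAINT PK_table1 PRIMARY KEY CLUSTERED (id);
```

Building the key once over the loaded data is usually cheaper than maintaining it row by row during a large insert.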
---------------------------------------------------
"There are only 10 types of people in the world:
Those who understand binary, and those who don't."
August 6, 2009 at 7:02 am
Also consider copying the records in smaller batches.
http://sql-server-performance.com/Community/forums/p/11750/11750.aspx
declare @minID      int
declare @maxID      int
declare @batchStart int
declare @batchSize  int

set @batchSize = 25000

select @minID = min(id), @maxID = max(id)
from   sourceTable

set @batchStart = @minID

while @batchStart <= @maxID
begin
    insert into DestinationTable (....)
    select ...
    from   sourceTable s
    where  s.id between @batchStart and @batchStart + @batchSize - 1

    set @batchStart = @batchStart + @batchSize
end
Adding a WAITFOR inside the loop may improve overall server performance.
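For example, a short delay can be placed just before the END of the loop, after each batch commits (the two-second duration below is an arbitrary choice, not a recommendation):

```sql
-- Pause briefly between batches so concurrent workloads and log
-- backups get a chance to run; tune the delay to your environment.
WAITFOR DELAY '00:00:02';
```

This spreads the load over a longer wall-clock time, trading total duration for less contention.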
August 6, 2009 at 1:22 pm
SELECT INTO will not work on an existing table. 99 Lacs is 9.9 million rows. No matter how you slice it, that will take some time, especially if SIMPLE recovery is not set, if the clustered index doesn't support the order of inserts (not to mention all the page splits that will occur), if there are a large number of indexes, and/or if there are triggers on the target table.
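One way to take the index-maintenance cost out of the insert itself, when the target already exists, is to disable the nonclustered indexes first and rebuild them once afterwards (the index name here is hypothetical):

```sql
-- Disable nonclustered indexes on the target before the big insert
-- so they are not maintained row by row during the load.
ALTER INDEX IX_table1_col1 ON dbo.table1 DISABLE;

-- ... perform the bulk insert here ...

-- Rebuild in a single pass; REBUILD also re-enables disabled indexes.
ALTER INDEX ALL ON dbo.table1 REBUILD;
```

Be careful not to disable the clustered index, since that makes the table itself inaccessible until it is rebuilt.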
--Jeff Moden
Change is inevitable... Change for the better is not.