July 24, 2010 at 8:03 am
Dear Friends,
I've written a stored procedure that first populates a temporary table with a complex join query (some 1,464,441 rows). It then reads each row and, on the basis of some condition, inserts a new record into a table. The total number of rows inserted is around 300,000.
The problem is that it takes far too long to execute: it inserts around 2,000 rows in 40 minutes. Is there a way to optimize the procedure code so that it takes less time?
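Roughly, the logic looks like this (a simplified sketch only; the table and column names below are placeholders, not the actual code):

-- Sketch of the current row-by-row approach (placeholder names)
DECLARE @RowId INT, @ItemAsSoldId INT, @Col1 INT, @Col2 VARCHAR(50);

-- #Staging is populated first by the complex join (~1.46 million rows)
SELECT @RowId = MIN(RowId) FROM #Staging;

WHILE @RowId IS NOT NULL
BEGIN
    -- Fetch the current row's values and the condition column
    SELECT @ItemAsSoldId = ItemAsSoldId, @Col1 = Col1, @Col2 = Col2
    FROM #Staging
    WHERE RowId = @RowId;

    -- Insert one row at a time when the condition is met
    IF @ItemAsSoldId IS NULL
        INSERT INTO dbo.TargetTable (Col1, Col2) VALUES (@Col1, @Col2);

    -- Move to the next row
    SELECT @RowId = MIN(RowId) FROM #Staging WHERE RowId > @RowId;
END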
I've attached the query that I used to create the stored procedure for your reference.
I would appreciate any suggestion that can dramatically reduce the time taken to insert so many records into a table while looping through them on the basis of a condition.
Thank you!
July 24, 2010 at 8:50 am
Your best bet for optimising that is to get rid of the while loop. I just took a brief look, but I couldn't see anything that needed row-by-row checking.
You should be able to change that INSERT ... VALUES into an INSERT ... SELECT ... WHERE ITEMASSOLDID IS NULL
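Something along these lines (an untested sketch; I'm guessing at the staging table and column names since I don't have your schema):

-- One set-based insert replaces the entire loop
-- (placeholder names; adjust to match your tables)
INSERT INTO dbo.TargetTable (Col1, Col2)
SELECT s.Col1, s.Col2
FROM #Staging AS s
WHERE s.ItemAsSoldId IS NULL;   -- the condition previously checked per row

SQL Server can then handle all the qualifying rows in a single statement instead of 300,000 individual inserts, which is where the time saving comes from.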
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
July 26, 2010 at 1:18 am
Hi Gail,
Thank you for the idea. I moved the conditions I was checking for each row into the SELECT query and inserted the resulting rows directly using INSERT ... SELECT. It worked fine, and all the required rows were inserted in approximately 4 minutes.
Best Regards,
Nitin
July 26, 2010 at 2:11 am
Excellent. Is that good enough or do you need it optimising further?
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability