Improving DELETE performance
Posted Tuesday, November 19, 2013 10:58 AM
SSC Journeyman


Group: General Forum Members
Last Login: Friday, August 1, 2014 7:15 AM
Points: 82, Visits: 246
Hi,
in my stored procedure I need to use a DELETE statement and I cannot use TRUNCATE TABLE.
The source table contains several tens of millions of rows and I need to delete some tens of millions of them.
The recovery model of the database is SIMPLE.

The DELETE statement is similar to this:

delete from work.dbo.WRK_FACT
where month >= @I and year = @year

Any ideas to improve the DELETE performance?

Thanks
Post #1515684
Posted Tuesday, November 19, 2013 11:07 AM


Hall of Fame


Group: General Forum Members
Last Login: Yesterday @ 7:03 PM
Points: 3,486, Visits: 7,531
Use a DELETE in batches.
  DECLARE @RowsDeleted int = 1;   -- named so it cannot collide with your @I variable

  WHILE @RowsDeleted > 0
  BEGIN
      -- delete up to one million matching rows per pass
      DELETE TOP (1000000) work.dbo.WRK_FACT
      WHERE month >= @I AND year = @year;

      SET @RowsDeleted = @@ROWCOUNT;
  END

Or, if you have a trigger on the table that might alter the @@ROWCOUNT value, use this instead:
  WHILE EXISTS (
      SELECT TOP 1 1
      FROM work.dbo.WRK_FACT
      WHERE month >= @I AND year = @year
  )
  BEGIN
      DELETE TOP (1000000) work.dbo.WRK_FACT
      WHERE month >= @I AND year = @year;
  END
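
Since your database is in SIMPLE recovery, the log space used by each committed batch becomes reusable between iterations, which is what keeps the log small during a batched delete. As a rough variation (the DECLAREd values below are placeholders, only there to make the sketch self-contained), you can issue an explicit CHECKPOINT after each batch to encourage that reuse:

  DECLARE @I int = 7, @year int = 2013;   -- placeholder values for your parameters
  DECLARE @RowsDeleted int = 1;

  WHILE @RowsDeleted > 0
  BEGIN
      DELETE TOP (1000000) work.dbo.WRK_FACT
      WHERE month >= @I AND year = @year;

      SET @RowsDeleted = @@ROWCOUNT;

      -- in SIMPLE recovery, a CHECKPOINT after each committed batch
      -- lets the log space it used be reused by the next batch
      CHECKPOINT;
  END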




Luis C.
Are you seriously taking the advice and code from someone from the internet without testing it? Do you at least understand it? Or can it easily kill your server?

Forum Etiquette: How to post data/code on a forum to get the best help
Post #1515687
Posted Tuesday, November 19, 2013 11:16 AM
Old Hand


Group: General Forum Members
Last Login: Yesterday @ 1:37 PM
Points: 346, Visits: 1,366
This was discussed just recently.

http://www.sqlservercentral.com/Forums/Topic1511574-3412-1.aspx#bm1511578

If you are deleting more than you are keeping, you can populate a duplicate empty table with the rows you want to keep, drop the old table, and rename the new one, along these lines:
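
Here is a rough sketch of that copy-and-swap approach, using the table from this thread (the _keep suffix and the parameter values are placeholders, and note that SELECT ... INTO copies only the data, so any indexes and constraints would have to be recreated on the new table before the rename):

  -- run in the work database
  DECLARE @I int = 7, @year int = 2013;   -- placeholder values for your parameters

  -- copy only the rows that would survive the DELETE
  SELECT *
  INTO dbo.WRK_FACT_keep
  FROM dbo.WRK_FACT
  WHERE NOT (month >= @I AND year = @year);

  -- swap the tables
  DROP TABLE dbo.WRK_FACT;
  EXEC sp_rename 'dbo.WRK_FACT_keep', 'WRK_FACT';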

I know you said you can't use TRUNCATE, but you may wish to reconsider your reasons for not using it, since it is the absolute fastest option available. You could populate a temp table with the rows you want to keep, truncate the table, then move the saved rows from the temp table back to the original table, for example:
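
An equally rough sketch of that route (again with placeholder parameter values; if WRK_FACT has an identity or computed column, the final INSERT would need an explicit column list):

  DECLARE @I int = 7, @year int = 2013;   -- placeholder values for your parameters

  -- save the rows that must survive
  SELECT *
  INTO #keep
  FROM work.dbo.WRK_FACT
  WHERE NOT (month >= @I AND year = @year);

  -- deallocate everything in one minimally logged operation
  TRUNCATE TABLE work.dbo.WRK_FACT;

  -- put the saved rows back
  INSERT INTO work.dbo.WRK_FACT
  SELECT * FROM #keep;

  DROP TABLE #keep;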


__________________________________________________________________________________________________________
How to Post to get the most: http://www.sqlservercentral.com/articles/Best+Practices/61537/
Post #1515694
Posted Tuesday, November 19, 2013 2:39 PM
SSC Journeyman


Group: General Forum Members
Last Login: Friday, August 1, 2014 7:15 AM
Points: 82, Visits: 246
Hi LinksUp,
the batch approach for deleting rows is very interesting; I want to avoid managing a cursor for such an operation.

In my scenario, I need to populate a working table that serves as the data source for an SSAS solution, and I want to avoid partitioning that working table.

In order to SELECT ... INTO, drop the old table, and rename the new one, ALTER permissions are necessary.

My working table could contain 100 million records: 70 million rows might be deleted and 30 million kept. In that case, is filling a temp table a good solution? However, I might instead need to delete 40 million records and keep 60 million.

Ultimately, the batch approach seems the better solution, doesn't it?

Thanks
Post #1515793
Posted Tuesday, November 19, 2013 2:42 PM
Old Hand


Group: General Forum Members
Last Login: Yesterday @ 1:37 PM
Points: 346, Visits: 1,366
pmscorca (11/19/2013)

Ultimately, the batch approach seems the better solution, doesn't it?


Test, test, and test!!

There are a number of techniques to populate a test table with millions of rows in a very short time. Then you can implement the different scenarios to see which is the fastest, uses the fewest resources, has the best execution plan, and so on.
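
For example, a stacked CTE that cross joins a small row source can generate millions of numbered rows in seconds; the test table name and columns below are invented purely for illustration:

  -- build ~10 million numbered test rows without a loop
  ;WITH
  L0   AS (SELECT 1 AS c FROM (VALUES (1),(1),(1),(1),(1),(1),(1),(1),(1),(1)) AS t(c)),  -- 10 rows
  L1   AS (SELECT 1 AS c FROM L0 AS a CROSS JOIN L0 AS b),   -- 100 rows
  L2   AS (SELECT 1 AS c FROM L1 AS a CROSS JOIN L1 AS b),   -- 10,000 rows
  L3   AS (SELECT 1 AS c FROM L2 AS a CROSS JOIN L2 AS b),   -- 100,000,000 rows (capped below)
  Nums AS (SELECT TOP (10000000) ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS n FROM L3)
  SELECT n                        AS id,
         (n % 12) + 1             AS [month],
         2010 + (n % 5)           AS [year],
         CAST(n % 1000 AS money)  AS amount
  INTO dbo.WRK_FACT_test          -- hypothetical test table
  FROM Nums;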


__________________________________________________________________________________________________________
How to Post to get the most: http://www.sqlservercentral.com/articles/Best+Practices/61537/
Post #1515795
Posted Tuesday, November 19, 2013 2:56 PM


Hall of Fame


Group: General Forum Members
Last Login: Yesterday @ 7:03 PM
Points: 3,486, Visits: 7,531
LinksUp (11/19/2013)
pmscorca (11/19/2013)

Ultimately, the batch approach seems the better solution, doesn't it?


Test, test, and test!!

There are a number of techniques to populate a test table with millions of rows in a very short time. Then you can implement the different scenarios to see which is the fastest, uses the fewest resources, has the best execution plan, and so on.

+1000
When dealing with performance improvements, the only sure way to choose is by testing, and by testing again after any change, on SQL Server.
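
For what it's worth, one simple way to run such a comparison is to wrap each candidate statement in time and I/O statistics against an identical test copy and compare the numbers reported in the Messages tab (placeholder parameter values shown just to make the fragment runnable):

  DECLARE @I int = 7, @year int = 2013;   -- placeholder values

  SET STATISTICS IO ON;
  SET STATISTICS TIME ON;

  -- candidate under test: swap in the batched loop, the copy-and-swap, etc.
  DELETE TOP (1000000) work.dbo.WRK_FACT
  WHERE month >= @I AND year = @year;

  SET STATISTICS TIME OFF;
  SET STATISTICS IO OFF;

  -- compare elapsed time, CPU time, and logical reads across the candidates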



Luis C.
Are you seriously taking the advice and code from someone from the internet without testing it? Do you at least understand it? Or can it easily kill your server?

Forum Etiquette: How to post data/code on a forum to get the best help
Post #1515802
Posted Tuesday, November 19, 2013 2:56 PM
SSC Journeyman


Group: General Forum Members
Last Login: Friday, August 1, 2014 7:15 AM
Points: 82, Visits: 246
OK.
However, I can also point to this article: http://social.technet.microsoft.com/wiki/contents/articles/20651.delete-huge-amount-of-data-from-a-table.aspx
Post #1515804