
Cursor to Delete data
Posted Wednesday, November 06, 2013 4:22 AM


SSCrazy

Tej_7342 (11/5/2013)
There are 2.2 billion records in the table and we want to only keep 2 weeks data, as of now we are lagging behind so we want to delete data in smaller chunks first so that the logs don't get messed up.
You should look at a batch approach for this type of deletion.

see http://www.sqlservercentral.com/articles/Top/63301/
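A minimal sketch of the batch approach, assuming hypothetical table and column names (`dbo.BigTable`, `CreatedDate`) and a batch size you would tune to your own log capacity:

```sql
-- Hypothetical names; tune the TOP (n) batch size to your log capacity.
-- An index on CreatedDate is assumed, otherwise each batch scans the table.
DECLARE @rows INT = 1;

WHILE @rows > 0
BEGIN
    DELETE TOP (10000)
    FROM dbo.BigTable
    WHERE CreatedDate < DATEADD(DAY, -14, GETDATE());

    SET @rows = @@ROWCOUNT;

    -- In SIMPLE recovery, log space is reused between batches;
    -- in FULL recovery, keep taking log backups while this runs.
END
```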


-------Bhuvnesh----------
I work only to learn Sql Server...though my company pays me for getting their stuff done
Post #1511794
Posted Wednesday, November 06, 2013 5:21 AM
SSC-Enthusiastic

Bhuvnesh (11/6/2013)
Tej_7342 (11/5/2013)
There are 2.2 billion records in the table and we want to only keep 2 weeks data, as of now we are lagging behind so we want to delete data in smaller chunks first so that the logs don't get messed up.
You should look at a batch approach for this type of deletion.

see http://www.sqlservercentral.com/articles/Top/63301/


And indexing
Post #1511820
Posted Wednesday, November 06, 2013 12:45 PM
Forum Newbie

If I had to guess, the problem is that a single DELETE is too large to fit in the transaction log. DELETE is the wrong tool for the job. A cursor loop reduces the size of each transaction, so they would not overflow the log, but it is still naive.

The correct solution here is to partition the table, then SWITCH out the old partition with all of its data.
http://technet.microsoft.com/en-us/magazine/2007.03.partitioning.aspx
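The SWITCH idea might look like the following sketch; the table names and partition number are placeholders, and it assumes the table is already partitioned by date with a staging table of identical structure on the same filegroup:

```sql
-- Sketch only: dbo.BigTable is assumed partitioned by date, and
-- dbo.BigTable_Staging must match its structure and filegroup exactly.
ALTER TABLE dbo.BigTable
SWITCH PARTITION 1 TO dbo.BigTable_Staging;  -- metadata-only operation

TRUNCATE TABLE dbo.BigTable_Staging;  -- minimally logged, unlike DELETE
```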

Post #1511980
Posted Wednesday, November 06, 2013 12:51 PM


SSCrazy

steven.ensslen (11/6/2013)
If I had to guess, the problem is that a single DELETE is too large to fit in the transaction log. DELETE is the wrong tool for the job. A cursor loop reduces the size of each transaction, so they would not overflow the log, but it is still naive.

The correct solution here is to partition the table, then SWITCH out the old partition with all of its data.
http://technet.microsoft.com/en-us/magazine/2007.03.partitioning.aspx


It depends, as the OP is not interested in keeping the old data.



Luis C.
I am a great believer in luck, and I find the harder I work the more I have of it. Stephen Leacock

Forum Etiquette: How to post data/code on a forum to get the best help
Post #1511983
Posted Wednesday, November 06, 2013 12:58 PM
Forum Newbie

Luis Cazares (11/6/2013)
steven.ensslen (11/6/2013)
If I had to guess, the problem is that a single DELETE is too large to fit in the transaction log. DELETE is the wrong tool for the job. A cursor loop reduces the size of each transaction, so they would not overflow the log, but it is still naive.

The correct solution here is to partition the table, then SWITCH out the old partition with all of its data.
http://technet.microsoft.com/en-us/magazine/2007.03.partitioning.aspx


It depends, as the OP is not interested in keeping the old data.


I disagree. If we want to get rid of the data, then SWITCHing out the partition and DROPping or TRUNCATEing the destination table will be thousands of times faster, and far less work for the database, than a DELETE. The only reason to use DELETE would be some sort of transactional replication, such as log shipping for disaster recovery.
Post #1511984
Posted Wednesday, November 06, 2013 12:59 PM
Forum Newbie

Yes, a single delete would fill up the logs.
Right now the older data is deleted, but going forward we would like to put a practice in place that deletes data once it is older than 2 weeks.

Maybe put it in a cursor and delete one day at a time. Is there any other way to accomplish this?

Thanks in advance
Post #1511985
Posted Wednesday, November 06, 2013 2:49 PM


SSChampion

By far the simplest approach here is to use the technique I described. I guess, since everyone is completely ignoring it, that either I have lost my mind or nobody even read it.

_______________________________________________________________

Need help? Help us help you.

Read the article at http://www.sqlservercentral.com/articles/Best+Practices/61537/ for best practices on asking questions.

Need to split a string? Try Jeff Moden's splitter.

Cross Tabs and Pivots, Part 1 – Converting Rows to Columns
Cross Tabs and Pivots, Part 2 - Dynamic Cross Tabs
Understanding and Using APPLY (Part 1)
Understanding and Using APPLY (Part 2)
Post #1512014
Posted Wednesday, November 06, 2013 3:08 PM
Forum Newbie

Sean Lange (11/6/2013)
By far the simplest approach here is to use the technique I described. I guess since everyone is completely ignoring it that either I have lost my mind or nobody even read it.


Sean, I like your solution. It is simple and it will usually work. But it won't scale to handle this problem (billions of rows) on their hardware. They need a more complex solution that does less IO and doesn't put pressure on their transaction log.
Post #1512020
Posted Thursday, November 07, 2013 2:06 AM


SSCrazy

steven.ensslen (11/6/2013)
They need a more complex solution that does less IO and doesn't put pressure on their transaction log.
If you handle the records in batches, it will take care of the IO as well as the transaction log management. That's the reason I recommended the batch deletion approach.


-------Bhuvnesh----------
I work only to learn Sql Server...though my company pays me for getting their stuff done
Post #1512143
Posted Thursday, November 07, 2013 7:46 AM
SSC-Enthusiastic

I would suggest you export the records that you want to keep into another table and then drop the original table. If you need the table renamed back to the original name, that is easy to do.
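A sketch of how that might look, using hypothetical names; note that SELECT INTO does not copy indexes, constraints, or permissions, so those must be recreated afterwards:

```sql
-- Hypothetical names; works best when the 2 weeks you keep
-- are a small fraction of the 2.2 billion rows.
SELECT *
INTO dbo.BigTable_Keep
FROM dbo.BigTable
WHERE CreatedDate >= DATEADD(DAY, -14, GETDATE());

DROP TABLE dbo.BigTable;

EXEC sp_rename 'dbo.BigTable_Keep', 'BigTable';
-- Recreate indexes, constraints, and permissions on the new table.
```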
Post #1512276