Cursor to Delete data

  • Hey Guys,

    I have to delete data older than 2 weeks from a table. How can I use a cursor to do it?

    I will have to place the query in a SQL job and run it once a week. Please help me out.

  • Tej_7342 (11/5/2013)


    Hey Guys,

    I have to delete data older than 2 weeks from a table. How can I use a cursor to do it?

    I will have to place the query in a SQL job and run it once a week. Please help me out.

    Why do you think you need a cursor for a delete? Cursors are horrible for performance and there is absolutely no need for a cursor for this type of thing.

    _______________________________________________________________

    Need help? Help us help you.

    Read the article at http://www.sqlservercentral.com/articles/Best+Practices/61537/ for best practices on asking questions.

    Need to split a string? Try Jeff Moden's splitter http://www.sqlservercentral.com/articles/Tally+Table/72993/.

    Cross Tabs and Pivots, Part 1 – Converting Rows to Columns - http://www.sqlservercentral.com/articles/T-SQL/63681/
    Cross Tabs and Pivots, Part 2 - Dynamic Cross Tabs - http://www.sqlservercentral.com/articles/Crosstab/65048/
    Understanding and Using APPLY (Part 1) - http://www.sqlservercentral.com/articles/APPLY/69953/
    Understanding and Using APPLY (Part 2) - http://www.sqlservercentral.com/articles/APPLY/69954/

  • Tej_7342 (11/5/2013)


    I have to delete data older than 2 weeks from a table. How can I use a cursor to do it?

    Why does it have to be a cursor?

    Could you do something like this:

    delete from Table1 where Table1.DateTimeColumn < dateadd(ww, -2, getDate())

    This would need to be tweaked to account for the time component. Other than that, this would run MUCH faster than any cursor.

    __________________________________________________________________________________________________________
    How to Post to get the most: http://www.sqlservercentral.com/articles/Best+Practices/61537/
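
    Regarding the time-component tweak mentioned above: one common pattern (a sketch only, reusing the same hypothetical Table1/DateTimeColumn names) is to cast the cutoff to a date, so the comparison falls on a midnight boundary rather than the time the job happens to run:

    -- Casting the cutoff to date strips the time portion, so every row
    -- older than midnight fourteen days ago is removed deterministically.
    DELETE FROM Table1
    WHERE Table1.DateTimeColumn < CAST(DATEADD(ww, -2, GETDATE()) AS date);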

  • The log file fills up if I use the query. (The database is in Simple Recovery.) Any ideas?

  • Something like this might work, but you should test to find the right batch size for your server.

    DECLARE @i int = 1;

    WHILE @i > 0
    BEGIN
        DELETE TOP (1000000) FROM MyTable
        WHERE MyTable.DateTimeColumn < DATEADD(ww, -2, GETDATE());

        SET @i = @@ROWCOUNT;
    END

    Or, if you have a trigger on the table that might alter the @@ROWCOUNT value:

    WHILE EXISTS (
        SELECT TOP 1 1
        FROM MyTable
        WHERE MyTable.DateTimeColumn < DATEADD(ww, -2, GETDATE())
    )
    BEGIN
        DELETE TOP (1000000) FROM MyTable
        WHERE MyTable.DateTimeColumn < DATEADD(ww, -2, GETDATE());
    END

    Luis C.
    General Disclaimer:
    Are you seriously taking the advice and code from someone from the internet without testing it? Do you at least understand it? Or can it easily kill your server?

    How to post data/code on a forum to get the best help: Option 1 / Option 2
  • Please don't send private messages; you limit the help you can get.

    Tej_7342 (11/5/2013)


    It looks good, but can this be modified to be generic instead of hard-coded? Also, the problem is that it fills up the logs when we run it.

    What do you mean by generic? What's wrong with the code? You said you needed to delete data from a table which is older than 2 weeks. As we can't see your data, we're just guessing.
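
    One way to keep the log from growing under the Simple recovery model is to issue a CHECKPOINT between batches, so the log space used by each committed batch can be reused by the next one. A sketch only, reusing the hypothetical MyTable/DateTimeColumn names from the batched DELETE above:

    DECLARE @rows int = 1;

    WHILE @rows > 0
    BEGIN
        DELETE TOP (100000) FROM MyTable
        WHERE MyTable.DateTimeColumn < DATEADD(ww, -2, GETDATE());

        SET @rows = @@ROWCOUNT;

        CHECKPOINT; -- in Simple recovery, lets the log truncate between batches
    END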

  • There are 2.2 billion records in the table and we want to keep only 2 weeks of data. As of now we are lagging behind, so we want to delete data in smaller chunks first so that the logs don't get messed up.

  • Tej_7342 (11/5/2013)


    There are 2.2 billion records in the table and we want to keep only 2 weeks of data. As of now we are lagging behind, so we want to delete data in smaller chunks first so that the logs don't get messed up.

    So, what's the problem with my code?

  • Tej_7342 (11/5/2013)


    There are 2.2 billion records in the table and we want to keep only 2 weeks of data. As of now we are lagging behind, so we want to delete data in smaller chunks first so that the logs don't get messed up.

    Given that, I might suggest a different approach entirely. Instead of trying to delete 2.18 billion rows out of 2.2 billion, it is probably easier to insert the rows you want to keep into a new table, then drop the current table and rename the new one.

    Something like this:

    SELECT *
    INTO MyCopy
    FROM MyCurrentData
    WHERE SomeDate > DATEADD(week, -2, GETDATE());

    DROP TABLE MyCurrentData;

    EXEC sp_rename 'MyCopy', 'MyCurrentData';

    Make sure you script out all the indexes/constraints beforehand, because SELECT INTO will NOT bring over any indexes or constraints.

    As with anything, make sure you try this on a test system first. 😛


  • Tej_7342 (11/5/2013)


    The log file fills up if I use the query. (The database is in Simple Recovery.) Any ideas?

    Why not consider running the job daily then instead of weekly? I'm assuming that the data are inserted on a daily basis.

    Edit: Talking about this query

    delete from Table1 where Table1.DateTimeColumn < dateadd(ww, -2, getDate())

  • Tej_7342 (11/5/2013)


    There are 2.2 billion records in the table and we want to only keep 2 weeks data, as of now we are lagging behind so we want to delete data in smaller chunks first so that the logs don't get messed up.

    You should look at a batched approach for this type of deletion.

    see http://www.sqlservercentral.com/articles/Top/63301/

    -------Bhuvnesh----------
    I work only to learn Sql Server...though my company pays me for getting their stuff done;-)

  • Bhuvnesh (11/6/2013)


    Tej_7342 (11/5/2013)


    There are 2.2 billion records in the table and we want to keep only 2 weeks of data. As of now we are lagging behind, so we want to delete data in smaller chunks first so that the logs don't get messed up.

    You should look at a batched approach for this type of deletion.

    see http://www.sqlservercentral.com/articles/Top/63301/

    And indexing 😉

  • If I had to guess, the problem is that a single DELETE is too large to fit in the transaction log. DELETE is the wrong tool for the job. A cursor loop reduces the size of each transaction, so they would not overflow the log, but it is still naive.

    The correct solution here is to partition the table, then SWITCH out the old partition with all of its data.

    http://technet.microsoft.com/en-us/magazine/2007.03.partitioning.aspx
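
    A minimal sketch of the SWITCH approach, assuming a hypothetical MyTable already partitioned by date and an empty staging table MyTable_Old with identical structure on the same filegroup:

    -- Switching a partition out is a metadata-only operation, so it
    -- completes almost instantly and barely touches the transaction log.
    ALTER TABLE MyTable
        SWITCH PARTITION 1 TO MyTable_Old;

    -- The old rows can then be discarded without touching MyTable.
    TRUNCATE TABLE MyTable_Old;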

  • steven.ensslen (11/6/2013)


    If I had to guess, the problem is that a single DELETE is too large to fit in the transaction log. DELETE is the wrong tool for the job. A cursor loop reduces the size of each transaction, so they would not overflow the log, but it is still naive.

    The correct solution here is to partition the table, then SWITCH out the old partition with all of its data.

    http://technet.microsoft.com/en-us/magazine/2007.03.partitioning.aspx

    It depends, as the OP is not interested in keeping old data.

  • Luis Cazares (11/6/2013)


    steven.ensslen (11/6/2013)


    If I had to guess, the problem is that a single DELETE is too large to fit in the transaction log. DELETE is the wrong tool for the job. A cursor loop reduces the size of each transaction, so they would not overflow the log, but it is still naive.

    The correct solution here is to partition the table, then SWITCH out the old partition with all of its data.

    http://technet.microsoft.com/en-us/magazine/2007.03.partitioning.aspx

    It depends, as the OP is not interested in keeping old data.

    I disagree. If we want to get rid of the data, SWITCHing out the partition and then DROPping or TRUNCATEing the destination table will be thousands of times faster, and far less work for the database, than a DELETE. The only reason for a DELETE would be some sort of transactional replication, like log shipping for disaster recovery.
