Deleting batches of rows with TOP

  • Comments posted to this topic are about the item Deleting batches of rows with TOP

    Ted Pin

  • I'm probably missing something here:

    Why does that example do anything different to:

    DELETE FROM tab1 WHERE col1 = 1;

    ???

  • I certainly wouldn't call this an article. I found the "aside" most amusing - it is almost as long as the so-called article.

    If you are hesitating about writing an article (as mentioned in the aside), then I'd recommend you come up with something more than a paragraph: something where the purpose of your article and your intention are clearly stated, where you instruct or enlighten the reader, and finally where you draw some sort of conclusion.

    If you can't do any of the above, then perhaps hesitate some more until you can.

    David McKinney.

  • Hi david.gerard,

    The idea is that it can be much better to delete in batches when handling a much larger data set than the one in the example. Imagine you have to delete 5 million entries from a 20 million entry table in a production system. You would get some (necessary) blocking, which could seriously slow down your production system (see the sketch after this post).

    As a note to Ted Pin: why is the variable @cnt a decimal and not an integer? You cannot get 2.5 rows back from your COUNT(*) query.

    Also, would this not be even better using a numbers table à la Jeff Moden? http://www.sqlservercentral.com/articles/TSQL/62867/

    Regards

    GermanDBA

    Regards,

    WilliamD
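
    To make the batching idea above concrete, here is a minimal sketch of a batched delete. The table name (dbo.BigTable), filter column (col1), batch size and delay are invented for illustration; the short WAITFOR between batches is just one way of giving other sessions a chance to get their locks in.

    -- Delete a large number of rows in small batches to limit blocking
    -- and transaction log growth. All names and values are hypothetical.
    DECLARE @batchsize INT
    SET @batchsize = 5000

    WHILE 1 = 1
    BEGIN
        DELETE TOP (@batchsize)
        FROM dbo.BigTable
        WHERE col1 = 1

        -- Stop once there is nothing left to delete
        IF @@ROWCOUNT = 0
            BREAK

        -- Brief pause so concurrent sessions can acquire their locks
        WAITFOR DELAY '00:00:01'
    END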

  • There's no need to get all 'Comic Book Guy' about this. The example seems a bit muddled, but I thought the aside was actually quite a well-made point.

    The one thing that is likely to put people off posting articles is when people get snippy about them. And if you don't appreciate having your time wasted by 'unenlightening articles', why reply in the first place?

  • I see. (I thought there would probably be a reason in there somewhere). Thanks.

  • I am wondering: why not use SET ROWCOUNT?

    Normally I use ROWCOUNT because it is really easy to handle. For example:

    -- Batch size; 5000 is just an example value
    DECLARE @n INT
    SET @n = 5000

    SET ROWCOUNT @n

    SELECT 1  -- prime @@ROWCOUNT so the loop is entered

    WHILE @@ROWCOUNT > 0
    BEGIN
        DELETE FROM .... WHERE x = y
    END

    SET ROWCOUNT 0  -- reset so later statements are not limited

    Should do the same without too much calculation and too many variables :hehe:

  • You're right... my comments are quite severe, and probably over the top. So I apologise if I've offended.

    But I do think that in an article the presentation is as important as the content, in the sense that the content will never be considered by a large section of the readership when it is poorly presented.

    I also know from experience that writing articles is hard work and takes time. It is, however, a very rewarding experience, and I'd encourage others to try their hand. To maximise the chances of their articles being well received, I'd recommend spending some time on the "packaging", which can show their technical contributions in the best light.

    (But I do think the editor should be a little more proactive in this regard.)

    A final point: it is clear that the author has good English, grammar and spelling, and can put together a sentence or paragraph, apparently without too much trouble. I just wish he'd written a couple more paragraphs, to put his script in context.

  • I like this forum!

  • Is this one practically useful? I have my doubts. Any comments on this?

  • I thought the same... "Why would I ever need to delete 5,000,000 rows from a huge database table?" But then I thought "you never know!" It's not the sort of thing you would do every day (I hope); more likely, you'd need to do it in response to some disaster or other. And knowing a way to do so effectively, without making the situation worse, might save your skin one day.

  • I like the following a bit better, as you don't have to find out how many rows you're going to affect before affecting them. YMMV.

    DECLARE @rowcount INT
    SET @rowcount = 1

    DECLARE @batchsize INT
    SET @batchsize = 5000

    WHILE (@rowcount > 0)
    BEGIN
        -- Table and column names here are just placeholders
        DELETE TOP (@batchsize) FROM tab1 WHERE foo = 1

        -- @@ROWCOUNT drops to 0 once nothing is left to delete
        SET @rowcount = @@ROWCOUNT
    END

  • Anirban Paul (7/3/2008)

    Is this one practically useful? I have my doubts. Any comments on this?

    I actually did use this about a month ago... we had a new policy implemented that allowed us to delete a couple of years' worth of data out of several audit tables. The largest delete was approx. 2.5 million rows. Agreed, probably not an everyday use, but it does get us less experienced DBAs thinking beyond just getting something to work.

  • On top of these one-off deletions, there will undoubtedly be cases when such a delete needs to be done regularly. Off the top of my head, I would say a logging table may well need this type of delete.

    The original programmer of the TOP option in T-SQL had a reason back then that was good enough to allow this behaviour 😛

    EDIT: I find the TOP option good when doing updates on production tables which can otherwise cause blocking (see the sketch after this post). It's not always nice, but this is real life and not a Walt Disney film!

    Regards,

    WilliamD
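
    For the update case mentioned above, a minimal sketch of the same batching pattern; the table (dbo.Orders), columns, values and batch size are made up for illustration.

    -- Update rows in small batches so long-held locks don't block the
    -- production workload. All names and values are hypothetical.
    DECLARE @batchsize INT, @rows INT
    SET @batchsize = 5000
    SET @rows = 1

    WHILE @rows > 0
    BEGIN
        -- The WHERE clause excludes rows already updated,
        -- so the loop ends once everything is done
        UPDATE TOP (@batchsize) dbo.Orders
        SET StatusCode = 2
        WHERE StatusCode = 1

        SET @rows = @@ROWCOUNT
    END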

  • cjeremy74 (7/3/2008)

    I actually did use this about a month ago... we had a new policy implemented that allowed us to delete a couple of years' worth of data out of several audit tables. The largest delete was approx. 2.5 million rows. Agreed, probably not an everyday use, but it does get us less experienced DBAs thinking beyond just getting something to work.

    Same with me: we have a logging table where all the errors are stored. If there is a small bug in the system, there can be approx. 3,000 rows per second. So even if you fix the bug within a few minutes, there are still tons of rows in the table, and inserts keep coming in while you try to delete the rows all at once.

    So it is useful. But as I wrote a few posts earlier, I do prefer ROWCOUNT; it doesn't matter how you solve a problem as long as you solve it with as little work for the server as possible.
