create temp table to delete

  • Hello guys, I need your expertise. I have a heavily transactional table of 10 million rows. I need to create a script that will delete around 1 million rows according to some date criteria (this will be in a stored procedure). Now, in your opinion, would it be better to first create a temp table holding the rows to be deleted, and then use an inner join in my transaction to delete the 1 million rows?

  • It'll depend on a number of factors.

    For example, if you delete all 1 million at once, that will create a rather huge transaction, which might take a long time to commit. If that blocks other people from accessing the table for a while, is that a bad thing?

    Also, will that transaction cause your transaction log to grow too much?

    Generally, when I have to delete millions of rows at a time, unless it's ranges that are supported by the clustered index, I delete a few thousand at a time and loop through that, to keep transaction time down a bit.
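    That looping approach can be sketched roughly like this (the table, column, and date values are made up for illustration; adjust to your own schema):

    ```sql
    -- Batched-delete sketch: delete a few thousand rows per transaction
    -- and loop until no qualifying rows remain.
    DECLARE @BatchSize int = 5000;

    WHILE 1 = 1
    BEGIN
        DELETE TOP (@BatchSize)
        FROM dbo.BigTable                    -- hypothetical table name
        WHERE CreatedDate < '2009-01-01';    -- hypothetical date criteria

        -- Fewer rows than the batch size means we're done.
        IF @@ROWCOUNT < @BatchSize
            BREAK;
    END;
    ```

    Each pass commits its own small transaction, so locks are held briefly and the log can be reused between batches. DELETE TOP (n) requires SQL Server 2005 or later.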

    - Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
    Property of The Thread

    "Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon

  • I do what GSquared does as well. Delete them in batches, 1,000-10,000 at a time.

  • Thanks for your input guys. So if I do it in batches, let's say 10,000 at a time, would it also help to create a temp table and then do an inner join? Or wouldn't it be necessary?

  • Nah. If the temp table is created/populated with a select from the main table, you'll just be processing the data twice instead of once.


  • thank you!

  • DBA (3/23/2009)

    Thanks for your input guys. So if I do it in batches, let's say 10,000 at a time, would it also help to create a temp table and then do an inner join? Or wouldn't it be necessary?

    It also depends ;-).

    If you also have to delete related data, and perhaps have more complex criteria for the deletion, a temp table may make sense.

    1. Select ids into temp table WHERE some quite complex criteria

    2. Use a joined delete to purge all the related data

    3. Use a joined delete to purge the start table
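    The three steps above might look something like this (all table and column names here are hypothetical placeholders):

    ```sql
    -- 1. Capture the ids once, applying the complex criteria.
    SELECT t.Id
    INTO #ToDelete
    FROM dbo.BigTable t
    WHERE t.CreatedDate < '2009-01-01'
      AND t.Status = 'Closed';              -- stand-in for the complex criteria

    -- Index the temp table so the joined deletes can seek on it.
    CREATE CLUSTERED INDEX IX_ToDelete ON #ToDelete (Id);

    -- 2. Purge the related (child) data first.
    DELETE c
    FROM dbo.ChildTable c
    INNER JOIN #ToDelete d ON d.Id = c.ParentId;

    -- 3. Purge the start table.
    DELETE t
    FROM dbo.BigTable t
    INNER JOIN #ToDelete d ON d.Id = t.Id;

    DROP TABLE #ToDelete;
    ```

    This way the expensive criteria are evaluated once, and both deletes become simple joins against the captured id list.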

    Greets

    Flo

  • If you can post a bit more information, we can likely help you to determine if a temp table makes sense. Show some DDL and code.

  • Table partitioning comes in handy for these cases too. Keep it in mind in case you have Enterprise Edition (EE) 😉
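    For example, if the big table were already partitioned on the date column, an entire old partition could be switched out as a metadata-only operation instead of being deleted row by row. A sketch (hypothetical names; the staging table must have an identical structure and live on the same filegroup as the source partition):

    ```sql
    -- Switch the oldest partition out of the main table; this moves
    -- no data, only metadata, so it is nearly instantaneous.
    ALTER TABLE dbo.BigTable
        SWITCH PARTITION 1 TO dbo.BigTable_Staging;

    -- Then purge the staging table cheaply.
    TRUNCATE TABLE dbo.BigTable_Staging;
    ```

    This only fits when the rows to remove align with partition boundaries, but when it does, it avoids the large delete entirely.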


    * Noel

  • Also, you can add a log backup (full recovery) or a checkpoint (simple recovery) on each iteration, to keep the transaction log from growing too much.
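    Folded into the batched loop, that suggestion looks roughly like this (names, dates, and the backup path are all hypothetical; use the branch that matches your database's recovery model):

    ```sql
    DECLARE @BatchSize int = 10000;

    WHILE 1 = 1
    BEGIN
        DELETE TOP (@BatchSize)
        FROM dbo.BigTable
        WHERE CreatedDate < '2009-01-01';

        IF @@ROWCOUNT = 0
            BREAK;

        -- SIMPLE recovery: a checkpoint lets the log space be reused.
        CHECKPOINT;

        -- FULL recovery: back up the log instead, e.g.
        -- BACKUP LOG MyDb TO DISK = N'X:\Backups\MyDb_log.trn';
    END;
    ```

    Either way, the point is to release or capture log space between batches so the log file never has to hold the whole million-row delete at once.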

    Jeffrey Williams
    “We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”

    ― Charles R. Swindoll

    How to post questions to get better answers faster
    Managing Transaction Logs
