Delete

  • I have two tables, ABC and XYZ. We have some logic that inserts millions of records from ABC into XYZ, and after the insert (done over a @MinID and @MaxID range), all of the records that were inserted into XYZ should be deleted from ABC.

    For example: @MinID = 1 and @MaxID = 100000

    Note: not the whole table or all of the data; only records 1-100000 should be deleted, and ABC may have 500,000,000 records.

    Here is the query I wrote for the delete:

    DELETE FROM ABC
    WHERE ORD_ID IN (SELECT ORD_ID
                     FROM XYZ
                     WHERE ID BETWEEN @MinID AND @MAXID)

    This worked well, but my lead said he needs a better query to improve the performance. Can anyone please help me write a query with better performance? I can tell you the scenario he explained to me.

    This is not the exact query; I'm just explaining the general idea:

    DELETE FROM ABC a
    WHERE a.O_ID BETWEEN @MinID AND @MAXID
    AND EXISTS (SELECT Order_ID FROM XYZ x WHERE x.O_ID = a.ID)

    ...something along those lines.

    Can you please help me write the query so it performs better?

    Thanks

    Regards

  • Delete performance is dependent on two factors: how quickly you can identify the rows to be deleted, and how quickly you can delete those rows. Make sure you're testing this on a non-Production system.

    The first one is easy - just change the "DELETE FROM..." to "SELECT * FROM...". If this returns pretty quickly, then your bottleneck won't be in the selecting phase.
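    For example, using the query from the first post (column names taken from that post):

    SELECT COUNT(*)          -- or SELECT * to eyeball the rows
    FROM ABC
    WHERE ORD_ID IN (SELECT ORD_ID
                     FROM XYZ
                     WHERE ID BETWEEN @MinID AND @MAXID)

    If that comes back quickly, the time is going into the delete itself, not into finding the rows.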

    The deleting phase does a bit more work. Each row must be marked as deleted, and the previous version of the row must be written to the transaction log in case of a rollback. The problem with deleting many rows at once is the amount of locking that will occur during the delete. This may lock the entire table and block other users.

    Instead of deleting 100,000 rows at once, consider deleting smaller batches. Larger batches increase the chance of the entire table being locked, so a small batch of 5,000-10,000 rows might show much better performance. Transactions are shorter, so users are blocked for smaller amounts of time, and the transaction log can be cleared (backed up in Full recovery, automatically truncated in Simple) after each batch.
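    As a rough sketch (it reuses the query from the first post; adjust the column names and batch size to suit), a batched delete could look like this:

    DECLARE @BatchSize int = 5000;
    DECLARE @Rows int = 1;

    WHILE @Rows > 0
    BEGIN
        -- delete at most @BatchSize of the matching rows per pass
        DELETE TOP (@BatchSize)
        FROM ABC
        WHERE ORD_ID IN (SELECT ORD_ID
                         FROM XYZ
                         WHERE ID BETWEEN @MinID AND @MAXID);

        SET @Rows = @@ROWCOUNT;   -- stop once a pass deletes nothing

        -- in Full recovery, taking a log backup here keeps the transaction log from growing
    END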

    Note that delete triggers, or any foreign key relationships that need to be checked or cascaded, could also cause performance issues if not well indexed.
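    Purely as a hypothetical illustration (the thread doesn't mention any referencing tables), if some table OrderDetails had a foreign key pointing at ABC's ORD_ID, an index on the referencing column would let each foreign key check during the delete use a seek instead of a scan:

    -- OrderDetails is a made-up table name; this only applies if such a FK actually exists
    CREATE NONCLUSTERED INDEX IX_OrderDetails_ORD_ID
        ON OrderDetails (ORD_ID);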

    The most important thing is to test though. If you let it run for 4 hours, it may take another 4 hours to rollback!

  • Some good points posted already, but I just wanted to add one more

    Sometimes it's a good idea to drop all indexes (except the one you're filtering on to find the rows, of course), delete, and then recreate them. This saves the engine from having to update every single index every time it deletes a row.
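    A minimal sketch of that pattern (the index name and key column here are purely illustrative, not from this thread):

    -- drop a nonclustered index that isn't needed to find the rows being deleted
    DROP INDEX IX_ABC_SomeColumn ON ABC;

    -- ... run the (batched) delete here ...

    -- recreate it once the delete has finished
    CREATE NONCLUSTERED INDEX IX_ABC_SomeColumn ON ABC (SomeColumn);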

  • I'd agree with both points above. Use batches, and if you can test, try dropping indexes.

  • Thanks for the quick responses.

    Using EXISTS:

    DELETE FROM ABC
    FROM ABC a
    WHERE EXISTS (SELECT ID FROM XYZ x WHERE x.ID = a.ID)
    AND Order_ID BETWEEN @MinID AND @MAXID

    Using INNER JOIN:

    DELETE FROM ABC
    FROM ABC a
    INNER JOIN Staging..XYZ x
        ON a.ID = x.ID
    WHERE x.ID BETWEEN @MinID AND @MAXID

    This is what I went with, and he said it was perfect.

    Thanks Again
