fastest way to delete from huge tables

  • - make sure you have the proper index (on date_id), so the query can find the records quickly.

    - delete in batches of 100,000-500,000 records (e.g. DELETE TOP (100000) FROM ...); a sketch of such a loop follows this list.

    - if the amount of data to be deleted is much larger than the data that will remain, you could:

      1. copy the data that will remain into a temporary table

      2. truncate the original table

      3. move the data from the temporary table back into the original table
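
    A minimal sketch of the index and the batch-delete loop (placeholder names in braces; the batch size and filter are yours to tune):

    -- index so the delete can find its rows quickly (the index name is illustrative)
    CREATE INDEX IX_date_id ON {original_table} (date_id);

    DECLARE @rows INT = 1;

    -- keep deleting until a batch removes nothing
    WHILE @rows > 0
    BEGIN
        DELETE TOP (100000)
        FROM {original_table}
        WHERE {filter on data to delete here};

        SET @rows = @@ROWCOUNT;
    END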

    ** Don't mistake the ‘stupidity of the crowd’ for the ‘wisdom of the group’! **
  • Many thanks. The table is quite big and I'm not sure how to go about moving the data into a temp table and then back to the original table. However, I will try the batch delete and see how it goes.

    Any advice on how to go about such things in the future?

  • rajsin7786 (6/23/2014)


    Many thanks. The table is quite big and I'm not sure how to go about moving the data into a temp table and then back to the original table.

    -- 1. copy the rows you want to keep into a temp table
    SELECT *
    INTO #temp_table
    FROM {original_table}
    WHERE {filter on remaining data here}

    -- 2. empty the original table (TRUNCATE is fast and minimally logged)
    TRUNCATE TABLE {original_table}

    -- 3. move the kept rows back
    INSERT INTO {original_table}
    SELECT *
    FROM #temp_table
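
    Note that TRUNCATE TABLE will fail if the table is referenced by a foreign key, and it reseeds any identity column, so it's worth checking for both before taking this route.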

    rajsin7786 (6/23/2014)

    However, I will try the batch delete and see how it goes. Any advice on how to go about such things in the future?

    Sure: :w00t: prevent the duplicate records in the first place :w00t:

    You can do it on the front end (in the application/ETL/...) by adding additional checks, but you can also add a constraint on the table to require unique records and thus stop duplicates at the source.
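
    A minimal sketch of such a constraint (the table and column names here are made up; use whichever columns define a duplicate in your data):

    -- reject duplicate rows outright
    ALTER TABLE dbo.my_table
    ADD CONSTRAINT UQ_my_table_key UNIQUE (customer_id, date_id);

    -- or: silently discard duplicates on insert instead of raising an error
    CREATE UNIQUE INDEX UX_my_table_key
    ON dbo.my_table (customer_id, date_id)
    WITH (IGNORE_DUP_KEY = ON);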

    ** Don't mistake the ‘stupidity of the crowd’ for the ‘wisdom of the group’! **
