Deleting a large number of records from tables

  • Hi

    We're working on archiving a huge dataset, moving it from one database to another (somewhere around 90 tables). Basically, we're moving everything that is more than 1 year old to an archive database.
    Row counts range from 100 million to 400 million per table.

    Please advise: what is the ideal way of removing/deleting this (archived) data from the existing database?

    We're using SSIS to move data between these DBs, and as these tables are used frequently by different processes, we would like to keep downtime to a minimum.

    Thanks in advance!

    ____________________________________________________________

    AP
  • Another thing to consider is archiving using partitioning.
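    For illustration, a rough sketch of what switching out a partition looks like (object names are made up; this assumes dbo.Orders is already partitioned on a date column and dbo.Orders_Archive has an identical structure on the same filegroup):

    ```sql
    -- Hypothetical names; assumes dbo.Orders is partitioned on OrderDate and
    -- dbo.Orders_Archive matches its structure on the same filegroup.
    ALTER TABLE dbo.Orders
    SWITCH PARTITION 1 TO dbo.Orders_Archive;
    -- The switch is a metadata-only operation, so it is nearly instant
    -- no matter how many rows are in the partition.
    ```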
    For deleting large amounts of data, you typically want to do the deletes in batches. You may also want to look at how much of the data in each table is being deleted. If you are deleting most of the records, it can often make more sense to move the records you want to keep into another table, truncate the old one, rename the new one, and rebuild any existing constraints. Or you could truncate the old table and reload the data you moved out. It depends on the tables, the relationships, the amount of data, business considerations, etc. Archiving this way can be a pain, which is why partitioning is so often used for archive data.
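    To give an idea of the batching mentioned above, something along these lines would be typical (table and column names are made up, and this assumes the old rows have already been copied to the archive):

    ```sql
    -- Minimal sketch of a batched delete against a busy table.
    DECLARE @BatchSize int = 10000;
    DECLARE @Cutoff datetime = DATEADD(YEAR, -1, GETDATE());
    DECLARE @Rows int = 1;

    WHILE @Rows > 0
    BEGIN
        DELETE TOP (@BatchSize)
        FROM dbo.Orders
        WHERE OrderDate < @Cutoff;

        SET @Rows = @@ROWCOUNT;

        -- A short pause between batches reduces blocking for other
        -- processes and lets log backups reclaim transaction log space.
        WAITFOR DELAY '00:00:01';
    END;
    ```

    Small batches keep each transaction short, so locks are held only briefly and the transaction log doesn't balloon the way a single huge DELETE would.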

    Sue

  • As Sue indicated, it isn't necessarily the number of rows but the percentage of data being deleted from each table. If you are deleting 10% of the data, then moving and deleting that data makes sense. If you are deleting 90% of the data, then moving out the data you are keeping makes sense, especially if there is a lot of data.
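    For the 90% case, the swap could look roughly like this (names are made up; this assumes no foreign keys point at the table and you can take a short outage for the swap):

    ```sql
    -- Copy the rows to keep; note that SELECT INTO copies data only,
    -- not indexes, constraints, or triggers.
    SELECT *
    INTO dbo.Orders_Keep
    FROM dbo.Orders
    WHERE OrderDate >= DATEADD(YEAR, -1, GETDATE());

    BEGIN TRAN;
        DROP TABLE dbo.Orders;
        EXEC sp_rename 'dbo.Orders_Keep', 'Orders';
        -- Recreate indexes, constraints, triggers, and permissions here.
    COMMIT;
    ```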
