IMPORT / EXPORT DATABASE TABLES

  • I have four transaction tables, each about 60 GB, and I want to delete every row with a date/time stamp of 12/31/2010 or earlier. Before I delete, I want to make a backup of those rows. I thought I would import each table into a test database, then detach that database and file it away in case something comes up. Will this lock the source database tables? Could I have your thoughts on a better way? I have no money to spend on tools.

  • twdavis-893252 (3/5/2012)


    I have four transaction tables, each about 60 GB, and I want to delete every row with a date/time stamp of 12/31/2010 or earlier. Before I delete, I want to make a backup of those rows. I thought I would import each table into a test database, then detach that database and file it away in case something comes up. Will this lock the source database tables? Could I have your thoughts on a better way? I have no money to spend on tools.

    Deleting a huge number of rows will most likely escalate to a table lock, yes.

    You could use SET ROWCOUNT to limit the number of rows per batch and so limit locking, but you still have to be careful of your log size growing from the huge number of deletes.

    Any reason you cannot use four simple SELECT ... INTO statements and then batched DELETE statements?

    --backup the data by creating tables on the fly
    --(if SomeField carries a time portion, use SomeField < '2011-01-01' to catch all of 12/31/2010)

    SELECT * INTO MyBackupDatabase.dbo.TABLE1 FROM TABLE1 WHERE SomeField<='2010-12-31 00:00:00.000'

    SELECT * INTO MyBackupDatabase.dbo.TABLE2 FROM TABLE2 WHERE SomeField<='2010-12-31 00:00:00.000'

    SELECT * INTO MyBackupDatabase.dbo.TABLE3 FROM TABLE3 WHERE SomeField<='2010-12-31 00:00:00.000'

    SELECT * INTO MyBackupDatabase.dbo.TABLE4 FROM TABLE4 WHERE SomeField<='2010-12-31 00:00:00.000'

    --delete in batches

    SET ROWCOUNT 1000 --caps each DELETE below at 1000 rows

    WHILE EXISTS(SELECT * FROM TABLE1 WHERE SomeField<='2010-12-31 00:00:00.000')

    BEGIN

    --deleting 1000 rows at a time.

    DELETE FROM TABLE1 WHERE SomeField<='2010-12-31 00:00:00.000'

    END --WHILE

    WHILE EXISTS(SELECT * FROM TABLE2 WHERE SomeField<='2010-12-31 00:00:00.000')

    BEGIN

    --deleting 1000 rows at a time.

    DELETE FROM TABLE2 WHERE SomeField<='2010-12-31 00:00:00.000'

    END --WHILE

    WHILE EXISTS(SELECT * FROM TABLE3 WHERE SomeField<='2010-12-31 00:00:00.000')

    BEGIN

    --deleting 1000 rows at a time.

    DELETE FROM TABLE3 WHERE SomeField<='2010-12-31 00:00:00.000'

    END --WHILE

    WHILE EXISTS(SELECT * FROM TABLE4 WHERE SomeField<='2010-12-31 00:00:00.000')

    BEGIN

    --deleting 1000 rows at a time.

    DELETE FROM TABLE4 WHERE SomeField<='2010-12-31 00:00:00.000'

    END --WHILE

    SET ROWCOUNT 0 --reset so later statements affect all rows again
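
    Side note: SET ROWCOUNT affecting DELETE is deprecated, so on 2005 and later you can also put the batch size into the DELETE itself with TOP. A sketch for the first table, using the same hypothetical names as above:

    WHILE 1 = 1

    BEGIN

    DELETE TOP (1000) FROM TABLE1 WHERE SomeField<='2010-12-31 00:00:00.000'

    IF @@ROWCOUNT = 0 BREAK --stop once nothing is left to delete

    END --WHILE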

    Lowell


    --Help us help you! If you post a question, make sure you include a CREATE TABLE... statement and an INSERT INTO... statement that give the volunteers here representative data. With your description of the problem, we can provide a tested, verifiable solution to your question! Asking the question the right way gets you a tested answer the fastest way possible!

  • Assuming none of the rows you plan to delete are being actively updated, you could back up and restore the database under another name, put the restored copy in SIMPLE recovery mode, and delete everything in it except the rows you want to archive. You then take a backup of the remaining data and keep that. This isolates the archive process from your live database and ensures there is no contention for resources during the first phase (a sketch follows).
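
    Roughly, with hypothetical database, path, and logical file names (check your real logical names with RESTORE FILELISTONLY):

    BACKUP DATABASE MyDatabase TO DISK = 'D:\Backup\MyDatabase.bak'

    RESTORE DATABASE MyDatabase_Archive
    FROM DISK = 'D:\Backup\MyDatabase.bak'
    WITH MOVE 'MyDatabase' TO 'D:\Data\MyDatabase_Archive.mdf',
    MOVE 'MyDatabase_log' TO 'D:\Data\MyDatabase_Archive_log.ldf'

    ALTER DATABASE MyDatabase_Archive SET RECOVERY SIMPLE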

    Far more likely to cause blocking is the delete phase. On SQL 2005 Standard Edition your options are fairly limited. With Enterprise Edition you could use table partitioning and minimally logged bulk operations to partition the tables on the date column and then just switch out and drop the older partition(s); a rough sketch follows.
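
    Only as a sketch with made-up names; the table would have to be rebuilt on the partition scheme first, and the empty staging table must match its structure and filegroup exactly:

    --Enterprise only: this boundary puts pre-2011 rows in partition 1
    CREATE PARTITION FUNCTION pfArchive (datetime)
    AS RANGE RIGHT FOR VALUES ('2011-01-01')

    CREATE PARTITION SCHEME psArchive
    AS PARTITION pfArchive ALL TO ([PRIMARY])

    --after rebuilding dbo.TABLE1 on psArchive(SomeField):
    ALTER TABLE dbo.TABLE1 SWITCH PARTITION 1 TO dbo.TABLE1_staging --metadata-only
    DROP TABLE dbo.TABLE1_staging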

    I've found setting the row count and doing deletes as described before fairly efficient; however, for large tables I recommend adding a WAITFOR delay and experimenting with its length. For large deletes I've seen the CHECKPOINT being blocked by the delete process, even when done in 1000-row loops, and the log file then doesn't clear even when the database is in SIMPLE mode. Adding the delay gives the checkpoint time to run (see the sketch below).
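
    Something like this, reusing the hypothetical TABLE1 and date column from the earlier post (the explicit CHECKPOINT is optional; tune the delay for your system):

    WHILE 1 = 1
    BEGIN
    DELETE TOP (1000) FROM TABLE1 WHERE SomeField<='2010-12-31 00:00:00.000'
    IF @@ROWCOUNT = 0 BREAK
    CHECKPOINT --help the log clear in SIMPLE recovery
    WAITFOR DELAY '00:00:02' --give the checkpoint room to run
    END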

    Have you considered, for each table you want to archive, creating a new table, copying the rows you want to keep into it, and then renaming the tables? You would need to work around things like IDENTITY columns (see the sketch below).
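
    A minimal sketch with hypothetical names; SELECT ... INTO carries the IDENTITY property across but not indexes or constraints, so those need recreating afterwards:

    --copy only the rows you want to keep
    SELECT * INTO dbo.TABLE1_new FROM dbo.TABLE1 WHERE SomeField > '2010-12-31 00:00:00.000'

    --swap the tables during a short maintenance window
    BEGIN TRANSACTION
    EXEC sp_rename 'dbo.TABLE1', 'TABLE1_old'
    EXEC sp_rename 'dbo.TABLE1_new', 'TABLE1'
    COMMIT

    --recreate indexes/constraints here, then verify the identity seed
    DBCC CHECKIDENT ('dbo.TABLE1')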

    Cheers

    Leo

    Leo
    Nothing in life is ever so complicated that with a little work it can't be made more complicated.

  • I had a large number of deletes and did something similar to the previous posts. I used a loop to delete in smaller groups, with a WAITFOR. I also added an outer counter to run a t-log backup after every x delete statements. Here's a chopped-up version.

    declare @DeleteCount int
    declare @DeleteTimeID int
    DECLARE @exitcode int
    DECLARE @sqlerrorcode int

    set @DeleteTimeID = 40000
    set @DeleteCount = 1

    DeleteMore:

    WAITFOR DELAY '00:00:05' -- 5 second delay to prevent blocking

    set @DeleteCount = @DeleteCount + 1

    if @DeleteCount = 50 -- backup t-log every x# delete loops
    begin
    set @DeleteCount = 0
    >>>> Backup statement here
    end

    delete top(2000) from Production..Call_Detail
    where time_id < @DeleteTimeID

    if @@ROWCOUNT > 0 goto DeleteMore

  • Thanks, guys. I did the SELECT ... INTO, which worked much better than the export, and I deleted 1000 rows at a time with a delay in between. There was maybe a little locking, but no one noticed. Thank you very much.
