Deleting 170 Million rows - Database Maintenance

  • Howzit guys.

    I need some opinions from the SQL lovers out there.

    I implemented an archive solution for our home loans division on SQL 7. About 450 000 closed cases, amounting to some 170 million rows, had to be sent to archive. Then I had to delete round about 170 million rows from various tables on the live system, if they existed in archive of course (roughly along the lines of the sketch at the end of this post).

    Now my question:

    Space usage has not changed much in the database, so it means I must reorg, update the usage, rebuild the indexes, etc.

    In what order would you perform these tasks so it is most beneficial, I get all the free space back to the operating system, and fragmentation of the database is minimized as well?
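
    The deletes ran in batches, roughly like this minimal sketch (table and key names are hypothetical; SQL 7 has no TOP clause on DELETE, so SET ROWCOUNT does the chunking and keeps each transaction small):

        -- Cases is the live table, Archive..Cases the archive copy (made-up names)
        DECLARE @batch int
        SET ROWCOUNT 10000           -- cap each DELETE at 10 000 rows
        SET @batch = 1

        WHILE @batch > 0
        BEGIN
            DELETE FROM Cases
            WHERE EXISTS (SELECT 1
                          FROM Archive..Cases a
                          WHERE a.CaseID = Cases.CaseID)
            SET @batch = @@ROWCOUNT  -- drops to 0 once nothing is left
        END

        SET ROWCOUNT 0               -- always reset, or later statements stay capped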

  • So after the delete operation, did you run DBCC UPDATEUSAGE for that db?

    With such mass operations, statistics often get out of date.

    If the stats still look wrong after the updateusage, you'll need a reindex.

    Remember, a reindex will only optimize your data pages if you have a clustered index defined on the table.

    A reindex only makes space available to your db engine, not the OS.

    If you want to create free space at disk level, you'll need DBCC SHRINKDATABASE or DBCC SHRINKFILE. Check BOL.
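
    In short, the sequence would look something like this (database, table, and file names are made up; check the exact syntax in BOL):

        -- hypothetical names: database HomeLoans, data file HomeLoans_Data
        DBCC UPDATEUSAGE ('HomeLoans')           -- correct the space counts in sysindexes

        DBCC DBREINDEX ('dbo.Cases', '', 90)     -- rebuild all indexes on one table at
                                                 -- fill factor 90; repeat per table

        DBCC SHRINKFILE (HomeLoans_Data, 20000)  -- hand space back to the OS;
                                                 -- target size is in MB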

    Johan


  • If possible I try to avoid SHRINKFILE on live systems, because there is an overhead in growing the file again as data comes back in.

  • I believe after deleting the rows you should execute DBCC DBREINDEX and then run DBCC UPDATEUSAGE.


    Kindest Regards,

    Sureshkumar Ramakrishnan

  • The best you can do is to run a T-SQL script that does a periodic update of the statistics and a reindex; a minimal sketch follows.

    Also use DBCC SHRINKFILE if you need the disk space back.
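
    Something along these lines, using the SQL 7 system tables (everything here is generic, so adjust to taste):

        -- walk all user tables, refresh stats and rebuild indexes
        DECLARE @tbl sysname

        DECLARE tbls CURSOR FOR
            SELECT name FROM sysobjects WHERE type = 'U'

        OPEN tbls
        FETCH NEXT FROM tbls INTO @tbl
        WHILE @@FETCH_STATUS = 0
        BEGIN
            EXEC ('UPDATE STATISTICS ' + @tbl)           -- refresh optimizer statistics
            EXEC ('DBCC DBREINDEX (''' + @tbl + ''')')   -- rebuild every index on the table
            FETCH NEXT FROM tbls INTO @tbl
        END
        CLOSE tbls
        DEALLOCATE tbls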


  • One question: how big are the tables after the deletes? With archiving that many rows, I imagine that your tables are very large. Your reindex could take hours, so plan accordingly.
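
    sp_spaceused gives a quick read on that; run DBCC UPDATEUSAGE first so the figures can be trusted (names here are just examples):

        DBCC UPDATEUSAGE ('HomeLoans')   -- hypothetical db name; fixes stale space counts
        EXEC sp_spaceused 'Cases'        -- rows, reserved, data, index_size, unused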

  • Thanks for all the feedback guys ....

    There is one thing though: I will not reindex the indexes. I am going to drop and recreate them and start off fresh, along the lines of the sketch below. I do not have the space to reindex and do not want to tempt fate.
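
    Per table, something like this (index and column names invented; the point is to drop the nonclustered indexes before the clustered one, and recreate the clustered one first, so the nonclustered indexes are only built once):

        DROP INDEX Cases.IX_Cases_ClientID    -- nonclustered first
        DROP INDEX Cases.CIX_Cases_CaseID     -- then the clustered index (if it backs a
                                              -- PK constraint, use ALTER TABLE ... DROP CONSTRAINT)

        CREATE CLUSTERED INDEX CIX_Cases_CaseID ON Cases (CaseID)
        CREATE INDEX IX_Cases_ClientID ON Cases (ClientID)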

    Cool bananas.

    Like I said, thanks for the input.

  • When dropping and recreating the indexes, start with the smallest table and work up to the largest, creating additional space as you go.
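
    To get that order, sysindexes has row and page counts per table (indid 0 is a heap, indid 1 is the clustered index):

        SELECT o.name, i.rowcnt, i.dpages         -- rows and data pages per table
        FROM sysobjects o
        JOIN sysindexes i ON i.id = o.id AND i.indid IN (0, 1)
        WHERE o.type = 'U'
        ORDER BY i.dpages                         -- smallest first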

  • If it is log space you are concerned with (I'm assuming you have tons of available page space if you haven't shrunk the file yet after deleting all that data) and you have some offline time available, you could switch to bulk logging, rebuild your indexes, and switch back to full logging. I'm not sure what is truly involved in switching logging modes (I've never had cause to investigate), but I thought it worth mentioning in case you could put it to use.
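
    On SQL 7, which predates the SQL 2000 recovery models, the closest equivalent is the 'select into/bulkcopy' database option. A rough sketch, with a made-up database name and backup path; it is worth verifying in BOL whether your index builds actually get minimal logging under this option:

        EXEC sp_dboption 'HomeLoans', 'select into/bulkcopy', 'true'

        -- ... rebuild or drop-and-recreate the big indexes here ...

        EXEC sp_dboption 'HomeLoans', 'select into/bulkcopy', 'false'

        -- flipping the option breaks the log backup chain, so take a full
        -- backup before relying on log backups again
        BACKUP DATABASE HomeLoans TO DISK = 'D:\Backup\HomeLoans.bak'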
