Disk defragmentation before rebuilding indexes

  • Reviewing our databases, we find that 31% of our indexes have fragmentation above 30% and should be rebuilt. Yesterday, I also found that disk fragmentation on our SQL Server 2005 system was at 64%. Apparently there are good reasons for our queries to be failing.

    Last night, I stopped SQL Server and ran the Windows defrag tool. Of the 273 GB volume, 24% was free, and the tool ran for three hours. When it completed, it reported that not all files could be defragmented and that fragmentation was now at 55%.

    I brought SQL Server back up. All but one web application worked properly, so I cycled the IIS web server and all was fine again. In the end it doesn't seem that much was accomplished, but I'm thinking I need to defrag the physical disk before trying to rebuild the indexes.

    Questions:

    1. Why couldn't the tool defrag more of the files? Were they still locked by SQL Server even though the service was stopped?

    2. Should I run the same defrag tool repeatedly to make further progress? (I once heard that is how the Windows defrag tool works.)

    3. I've heard of people using Contig from Sysinternals to defrag large individual database files. Has anyone used it, and would you recommend it?

    4. Should defragmenting the physical disk be a weekly, or maybe monthly, scheduled task to prevent this situation?
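    For reference, the 30%-fragmentation check and rebuild described above can be sketched in T-SQL against sys.dm_db_index_physical_stats (available from SQL Server 2005 onwards); the table and index names in the rebuild statement are placeholders, and the 1,000-page floor is just a common rule of thumb for skipping tiny indexes:

    ```sql
    -- Find indexes in the current database with fragmentation above 30%
    -- (candidates for a rebuild rather than a reorganise).
    SELECT  OBJECT_NAME(ips.object_id)  AS table_name,
            i.name                      AS index_name,
            ips.avg_fragmentation_in_percent,
            ips.page_count
    FROM    sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
    JOIN    sys.indexes AS i
            ON i.object_id = ips.object_id
           AND i.index_id  = ips.index_id
    WHERE   ips.avg_fragmentation_in_percent > 30
            AND ips.page_count > 1000   -- ignore tiny indexes
    ORDER BY ips.avg_fragmentation_in_percent DESC;

    -- Rebuild one of the offenders (placeholder names):
    ALTER INDEX IX_Example ON dbo.ExampleTable REBUILD;
    ```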

  • I have had better results when I have run the defrag more than once, though I don't know why that is. What I also try to do is delete (IF I CAN!) the fragmented files that cannot be defragged.

    There are also third-party defrag tools that you can use.

    I would sort out the SQL Server fragmentation first, and maybe look at truncating your log (LDF) and data (MDF) files before defragmenting the drive physically. There is an argument about whether or not to truncate your LDF and MDF files, so research this yourself first to see what suits you.

  • In 12 years of being a DBA, I can't remember ever once defragging a hard drive with SQL databases on it.

    I know at a previous job they used a tool other than the one that comes with Windows to defrag our file and print servers (about 2 TB of storage each), though I can't remember its name.

  • I have run the normal Windows defrag tool on our non-production servers while SQL Server was running. The disks defragged OK and SQL Server was not harmed, which is the result you should expect. Even though it should be safe, I would not want to defrag disks in production while SQL Server is running.

    Running the defrag tool multiple times can give improved results. In order to defrag a file, the tool needs a contiguous chunk of free disk space large enough to hold the entire file. Multiple runs can move already-defragmented files from all over the disk towards its start, leaving larger chunks of contiguous free space in which to defrag the larger files.

    Minimising your log file size before a defrag will speed up the process, as there is less data to move around. Remember to reset the log size after the defrag. (OK, this will give you a slightly fragmented log file, but the alternative may be that almost nothing gets defragged due to lack of contiguous disk space...)
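    The shrink-then-regrow step above might look something like this in T-SQL; the database name, logical log file name, and target sizes are placeholders for illustration only:

    ```sql
    USE ExampleDB;   -- placeholder database name
    GO
    -- Before the disk defrag: shrink the log file (target size in MB)
    -- so there is less data to move around.
    DBCC SHRINKFILE (ExampleDB_log, 100);
    GO
    -- After the defrag completes: grow the log back to its working size
    -- in a single step, so it occupies as few fragments as possible.
    ALTER DATABASE ExampleDB
    MODIFY FILE (NAME = ExampleDB_log, SIZE = 10GB);
    GO
    ```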

    There must be third-party tools that can defragment faster than the default Windows tool. Also, if your disk is formatted with a 64 KB allocation unit size to optimise SQL Server performance, the standard Windows tool will not work. Budget restrictions where I work mean we have not tried anything other than the Windows tool.

    Original author: https://github.com/SQL-FineBuild/Common/wiki/ (1-click install and best practice configuration of SQL Server 2019, 2017, 2016, 2014, 2012, 2008 R2, 2008 and 2005).


  • The MDFs and LDFs should not be in use when the SQL Server service is stopped. I agree that you should defrag the physical files in addition to the indexes (it's no good if your indexes are logically contiguous but physically scattered all over the disk).

    I recently evaluated and decided to use Diskeeper. The basic server version isn't very expensive; I think I got 10 server licenses for around $2,000. It works well with SQL Server in that it defragments the MDFs and LDFs while SQL is online. I have it continuously defragging during non-production hours.

    One more thing: do you have your log files on a disk separate from your data files? Log files don't get as fragmented when they are on a separate physical disk, because they are written contiguously.


  • First, I'd like to agree with Andrew: defragging the data files will improve performance in only a very few cases. I once had to deal with a database file that consisted of more than 2,000 fragments. After defragging, query performance improved by between 10 and 20%. But that case was extreme, and in most cases the performance improvement will be negligible.

    If you would still like to defrag your files, I suggest using contig.exe from Sysinternals (now owned by Microsoft). What I liked best about this tool, besides the fact that it is free, is the ability to analyse and defrag individual files instead of the whole volume.

    Markus Bohse
