• OK, does this make sense? I was just reading a wonderful article by Gail Shaw (thanks for the article)...

    http://www.sqlservercentral.com/articles/Indexing/68439/

    This is probably going to sound nice and stupid, but is the following a good assumption?

    Since I ran the first two scans (in LIMITED and SAMPLED modes), the scan covers the index but doesn't go down to the leaf level of the index. The table in question has a page count of 132955 for IN_ROW_DATA and 1445895 for LOB_DATA.

    The LIMITED mode is the fastest and scans the smallest number of pages. It scans all pages for a heap, but only the parent-level pages for an index, which are the pages above the leaf-level.

    The SAMPLED mode returns statistics based on a 1 percent sample of all the pages in the index or heap. If the index or heap has fewer than 10,000 pages, DETAILED mode is used instead of SAMPLED.

    As a result, it shows very low fragmentation, and if I were running a script with logic like (fragmentation > xxx) it would pass over this index.
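
    For reference, the LIMITED run was a call roughly along these lines, run from the database in question (dbo.MyTable is just a placeholder for the real table name):

    -- LIMITED mode: for an index it reads only the parent-level pages,
    -- i.e. the pages above the leaf level
    SELECT object_id,
           index_id,
           index_level,
           avg_fragmentation_in_percent,
           page_count
    FROM sys.dm_db_index_physical_stats(
             DB_ID(),                     -- current database
             OBJECT_ID(N'dbo.MyTable'),   -- placeholder table name
             NULL,                        -- all indexes
             NULL,                        -- all partitions
             'LIMITED');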

    When running DETAILED, it shows all the levels of the index (0, 1, 2, 3).

    The first and second levels of this particular index appear to have very high levels of fragmentation (80% plus)...
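
    The DETAILED run was the same call with the mode switched, which is where the per-level numbers come from (again, the table name is a placeholder):

    -- DETAILED mode: reads all pages, so fragmentation is reported
    -- per index_level (0 = leaf, higher numbers = intermediate/root)
    SELECT object_id,
           index_id,
           index_level,
           avg_fragmentation_in_percent,
           page_count
    FROM sys.dm_db_index_physical_stats(
             DB_ID(),
             OBJECT_ID(N'dbo.MyTable'),
             NULL, NULL,
             'DETAILED')
    ORDER BY index_id, index_level;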

    In this example, would you query your output table to do something like:

    return all indexes and their maximum fragmentation level, and then evaluate just that single value? i.e.

    SELECT object_id,
           MAX(avg_fragmentation_in_percent) AS max_fragmentation
    FROM MyTable
    GROUP BY object_id
    HAVING MAX(avg_fragmentation_in_percent) > 30;

    Something like this would hopefully return only one row for the above table, and it would be:

    object_id      fragmentation%
    829506284      86.2069
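
    (For completeness, MyTable above is just the working table I capture the DMV output into, roughly like this, with placeholder names:)

    -- Capture the DETAILED output into a working table for later querying
    -- (MyTable is a placeholder name; NULL object_id covers all tables)
    SELECT object_id,
           index_id,
           index_level,
           avg_fragmentation_in_percent,
           page_count
    INTO MyTable
    FROM sys.dm_db_index_physical_stats(
             DB_ID(),
             NULL, NULL, NULL,
             'DETAILED');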

    A result like that would indicate that the index(es) on the table above require a rebuild.
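
    (By "rebuild" I mean something along these lines once the offending index is identified; the index and table names are placeholders:)

    -- Rebuild the heavily fragmented index (placeholder names)
    ALTER INDEX IX_MyIndex ON dbo.MyTable REBUILD;
    -- REORGANIZE would be the lighter-weight option for more moderate fragmentation:
    -- ALTER INDEX IX_MyIndex ON dbo.MyTable REORGANIZE;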

    Am I off base?