• Nadrek (7/19/2011)


    I would suggest including more warnings about the missing index DMVs' flaws: suggesting duplicate or overlapping indexes, suggesting indexes that won't actually help, and so on.
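
    For instance, before creating anything the DMVs suggest, I'd at least check for existing indexes with the same leading key column, which is where most duplicates show up. A rough sketch (dbo.SomeTable is just a placeholder):

        -- List indexes on one table that share a leading key column; a quick
        -- way to spot duplicate or overlapping indexes (dbo.SomeTable is a placeholder)
        SELECT i.name AS index_name, c.name AS leading_column
        FROM sys.indexes AS i
        JOIN sys.index_columns AS ic
            ON ic.object_id = i.object_id AND ic.index_id = i.index_id
        JOIN sys.columns AS c
            ON c.object_id = ic.object_id AND c.column_id = ic.column_id
        WHERE ic.key_ordinal = 1
          AND i.object_id = OBJECT_ID('dbo.SomeTable')
        ORDER BY c.name, i.name;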

    I may have missed it, but finding queries by aggregate cost is also critical; a 100-cost (reads, CPU, etc.) query executed 50,000 times is likely more important than a 5,000-cost query executed twice.
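
    For instance, a sketch along these lines against sys.dm_exec_query_stats ranks cached statements by total reads (or CPU) instead of per-execution cost; the TOP (25) cutoff is arbitrary:

        -- Rank cached statements by aggregate cost rather than per-execution cost
        SELECT TOP (25)
            qs.execution_count,
            qs.total_logical_reads,
            qs.total_worker_time,                 -- total CPU time, in microseconds
            qs.total_logical_reads / qs.execution_count AS avg_logical_reads,
            SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
                ((CASE qs.statement_end_offset
                      WHEN -1 THEN DATALENGTH(st.text)
                      ELSE qs.statement_end_offset
                  END - qs.statement_start_offset) / 2) + 1) AS statement_text
        FROM sys.dm_exec_query_stats AS qs
        CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
        ORDER BY qs.total_logical_reads DESC;     -- or total_worker_time for CPU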

    Suggesting the Database Tuning Advisor "identifies an optimal set of indexes that takes the requirements of all queries into account" is also... optimistic, to say the least. The DTA often makes bad suggestions and misses good suggestions.

    The trace information was better, though I'd suggest that the ".Net SqlClient Data Provider" is only there if the developer fails to set "Application Name=My Apps Name" in the connection string.
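
    You can see the difference without even running a trace; with "Application Name=My Apps Name" in the connection string, that name is what shows up below (and in Profiler) instead of the generic provider name:

        -- Shows the application name each connection reported; it defaults to
        -- ".Net SqlClient Data Provider" when the connection string doesn't set one
        SELECT session_id, login_name, program_name
        FROM sys.dm_exec_sessions
        WHERE is_user_process = 1;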

    Depends on the system.

    I just changed the filtering I use for reporting missing indexes after running the DMVs in the morning. I used to report anything with more than a few hundred or 1,000 seeks, until one of our BI servers had its CPU spike for hours every day. It turned out that a few queries that only run once or twice a day caused it, so now I filter by cost instead (roughly the query below).
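
    Something like this is what I mean; it weights each suggestion by estimated cost, impact, and how often it would have been used, rather than by raw seek counts. The formula is just a common rule of thumb, not anything official:

        -- Weight missing index suggestions by estimated cost * impact * uses
        SELECT
            mid.statement AS table_name,
            migs.user_seeks + migs.user_scans AS uses,
            migs.avg_total_user_cost,
            migs.avg_user_impact,
            migs.avg_total_user_cost * (migs.avg_user_impact / 100.0)
                * (migs.user_seeks + migs.user_scans) AS estimated_improvement,
            mid.equality_columns,
            mid.inequality_columns,
            mid.included_columns
        FROM sys.dm_db_missing_index_group_stats AS migs
        JOIN sys.dm_db_missing_index_groups AS mig
            ON mig.index_group_handle = migs.group_handle
        JOIN sys.dm_db_missing_index_details AS mid
            ON mid.index_handle = mig.index_handle
        ORDER BY estimated_improvement DESC;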

    And my favorite is when it says to create an index on a bit column, or on a column where the table will have a few million rows but only a few distinct values in that column.
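
    A quick cardinality check like this (table and column names are made up) is usually enough to toss those suggestions:

        -- If distinct_values is tiny compared to total_rows, a single-column
        -- index on that column is unlikely to help (names are placeholders)
        SELECT COUNT(DISTINCT SomeColumn) AS distinct_values,
               COUNT(*) AS total_rows
        FROM dbo.SomeTable;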