Find/Analyze bad statistics of entire db

  • I read http://www.sql-server-performance.com/articles/per/execution_plan_statistics_p3.aspx and it says that if the estimated number of rows is very different from the actual number of rows, it means there are bad statistics on the database. I am going through the same scenario. Can someone please tell me how to identify bad statistics? If you can also give me a script, that would be great. I have come across many articles, but couldn't find one that talks in more detail about finding bad statistics. Thanks.
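
    One way to see the estimated versus actual row counts the article talks about is to run the suspect query with SET STATISTICS PROFILE ON and compare the EstimateRows and Rows columns in the output. A minimal sketch (the table dbo.Orders and the filter are hypothetical):

    SET STATISTICS PROFILE ON
    GO
    -- run the query you suspect; each operator in the returned plan shows
    -- Rows (actual) next to EstimateRows (the optimizer's estimate)
    SELECT * FROM dbo.Orders WHERE OrderDate > '20080101'  -- hypothetical query
    GO
    SET STATISTICS PROFILE OFF
    GO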

  • do

    DBCC CHECKDB

    DBCC CHECKALLOC

    then sp_updatestats.
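
    For completeness, a minimal runnable sketch of that sequence (the database name YourDb is a placeholder):

    DBCC CHECKDB ('YourDb')     -- integrity check; YourDb is a placeholder
    DBCC CHECKALLOC ('YourDb')  -- allocation consistency check
    GO
    USE YourDb
    GO
    EXEC sp_updatestats         -- refreshes out-of-date statistics on all tables
    GO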

    --SQLFRNDZ

  • Thanks Pavan. But part of my question was about the article I was referring to, which talks about identifying bad statistics. Is what it says true?

  • Check when the statistics on the database were last updated.

    Run this script if they were last updated long ago:

    USE pubs -- Change desired database name here
    GO

    SET NOCOUNT ON
    GO

    -- Cursor over every base table; schema-qualify and quote each name so
    -- tables outside the default schema (or with unusual names) still work
    DECLARE updatestats CURSOR FOR
    SELECT QUOTENAME(TABLE_SCHEMA) + '.' + QUOTENAME(TABLE_NAME)
    FROM INFORMATION_SCHEMA.TABLES
    WHERE TABLE_TYPE = 'BASE TABLE'

    OPEN updatestats

    DECLARE @tablename NVARCHAR(300)  -- wide enough for a quoted two-part name
    DECLARE @Statement NVARCHAR(400)

    FETCH NEXT FROM updatestats INTO @tablename

    WHILE (@@FETCH_STATUS = 0)
    BEGIN
        PRINT N'UPDATING STATISTICS ' + @tablename
        SET @Statement = N'UPDATE STATISTICS ' + @tablename + N' WITH FULLSCAN'
        EXEC sp_executesql @Statement
        FETCH NEXT FROM updatestats INTO @tablename
    END

    CLOSE updatestats
    DEALLOCATE updatestats
    GO

    SET NOCOUNT OFF
    GO

    Search Google for references; I don't have time to find material for you.

    --SQLFRNDZ

  • Hi,

    You can check when the statistics were last updated by using the DBCC command below:

    DBCC SHOW_STATISTICS ('table_name', 'index_name') WITH STAT_HEADER

    After getting the last-updated date, if you see that the statistics on the table have not been updated recently, you can update them. It may help.
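
    A minimal sketch of the same check for every index in the current database, using the built-in STATS_DATE function (so there is no DBCC output to read index by index):

    -- last statistics-update date for each index, oldest first
    SELECT OBJECT_NAME(i.object_id) AS table_name,
           i.name                   AS index_name,
           STATS_DATE(i.object_id, i.index_id) AS stats_last_updated
    FROM sys.indexes AS i
    WHERE i.index_id > 0
      AND OBJECTPROPERTY(i.object_id, 'IsUserTable') = 1
    ORDER BY stats_last_updated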

    Thanks,

    Deepak

  • "it says that if the estimated number of rows is very different from the actual number of rows, it means there are bad statistics on the database"

    For large tables, SQL Server doesn't compute statistics from every row of the table. Rather, it reads a sample of the rows and computes the statistics from that sample. Hence, I think you should be more concerned with ensuring that SQL Server computes statistics for large tables from a larger sample, or even the entire table. You can do this using the UPDATE STATISTICS command with the FULLSCAN/SAMPLE options.
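
    A minimal sketch of those options (the table name dbo.BigTable is hypothetical):

    -- read every row when building the statistics
    UPDATE STATISTICS dbo.BigTable WITH FULLSCAN

    -- or read a fixed share of the table instead
    UPDATE STATISTICS dbo.BigTable WITH SAMPLE 50 PERCENT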

    However, with the 'auto update statistics' option active, SQL Server will update the statistics on your table automatically if there is a large number of DML operations against that table. Unfortunately, that computation will still only be based on a sample of rows, negating the benefits of your earlier computation using a larger sample.

    Here's what we did in my previous company (a sketch of the first two steps follows the list):

    - for large tables, disable automatic statistics updates using sp_autostats

    - run a job at midnight to compute statistics for the above tables using UPDATE STATISTICS with the FULLSCAN option

    - monitor long running queries during the day, and analyze the query plans to determine if the poor performance is due to inaccurate statistics, in which case we may need to update the statistics for the affected tables once more during the day
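
    A minimal sketch of steps one and two (the table name dbo.BigTable is hypothetical; the nightly statement would be wrapped in a scheduled job):

    -- step 1: turn off automatic statistics updates for the large table
    EXEC sp_autostats 'dbo.BigTable', 'OFF'

    -- step 2: what the midnight job runs for each such table
    UPDATE STATISTICS dbo.BigTable WITH FULLSCAN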

