Vendor recommendations - which option for statistics updates would be best?

  • Running SQL Server 2016 Enterprise on Windows Server 2016

    We upgraded to a new version of this product and are having slowness issues. The vendor keeps running their tool and telling us the sample percentage needs to be 100% for all statistics. The problem is that auto update statistics keeps updating them, so they are not at 100%. We rebuild them with a full scan, but once the data changes, SQL does its own stats update, like it's supposed to do.

    From their documentation:

    The Auto update statistics option, located under Database Properties, causes SQL Server to automatically update statistics when approximately 20% of the table data has changed. When this occurs, SQL Server uses the default sample percentage (10%) to perform the update. As a result, the sample size is often too small to capture a true representation of the data distribution. Additionally, this process (and SQL Server's periodic checks to see whether the process needs to be done) can cause overhead at inopportune times and may affect database performance.

    However, with this option enabled (I think they meant disabled), query plans are not recompiled after an update of statistics, which can potentially cause inaccurate plans to persist in SQL Server's plan cache. Inaccurate plans have the potential to cause additional I/O, higher CPU, and adverse effects on memory utilization.

    Recommendation: Database Administrators are empowered to choose whether the Auto update statistics option is enabled or disabled. Note the following: If the Auto update statistics database option within SQL Server is disabled, it is recommended that you clear the plan cache after statistics are updated. While this may cause a slight degradation to performance initially, the overall benefit of more accurate query plans is a worthy trade-off. If the Auto update statistics database option is enabled, Database Administrators are encouraged to aggressively manage statistics to ensure that they are rarely (if ever) automatically updated and that they are always captured with a 100% sample.

    This may not be the cause of the slowness, but until they see a consistent 100% sample percentage they're not going to focus on the real issue; or this may be the real issue. In either case, what are your thoughts on this?
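
    If it helps to verify what the vendor's tool is seeing, something along these lines (dbo.YourTable is a placeholder table name) shows how each statistic was last sampled and rebuilds them all with a full scan:

        -- Show how each statistic on the table was last sampled;
        -- rows_sampled < rows means it was not a 100% sample
        SELECT  s.name AS stats_name,
                sp.last_updated,
                sp.rows,
                sp.rows_sampled,
                sp.modification_counter
        FROM    sys.stats AS s
        CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
        WHERE   s.object_id = OBJECT_ID(N'dbo.YourTable');

        -- Rebuild every statistic on the table with a 100% sample
        UPDATE STATISTICS dbo.YourTable WITH FULLSCAN;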


  • We had this issue migrating from 2008 R2 to 2016, and by issues I mean performance degradation in some queries. I don't know if you upgraded from that early a version.

    Microsoft made some changes to the cardinality estimator in SQL Server 2014. If you migrated and changed your compatibility level to 130, changing back to the level you had earlier (2008 - 100, 2012 - 110, 2014 - 120, 2016 - 130) could help you.

    Under the database options there's also "Legacy Cardinality Estimation"; I have not used it, but you could give it a try.
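
    If you want to test either idea, here's a rough sketch (the database name YourDb is a placeholder):

        -- Option 1: drop the compatibility level back to a pre-2014 one
        -- (110 = SQL Server 2012; pick whatever level you were on before)
        ALTER DATABASE [YourDb] SET COMPATIBILITY_LEVEL = 110;

        -- Option 2 (2016 onward): stay on level 130 but flip the
        -- database-scoped option so the optimizer uses the old estimator
        USE [YourDb];
        ALTER DATABASE SCOPED CONFIGURATION SET LEGACY_CARDINALITY_ESTIMATION = ON;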

    https://blogs.msdn.microsoft.com/psssql/2014/04/01/sql-server-2014s-new-cardinality-estimator-part-1/

    This answer assumes you migrated from a SQL Server version earlier than 2014.

    Regards.

  • If the vendor wants to see a full sample rate, you can set the NO_RECOMPUTE flag on those stats so they will not be auto-updated. If you do this, then you must set up a schedule to update the statistics manually; how frequently that process runs will depend on how many changes are made to that table in a given day.

    For example, if the stats go stale after 2 hours of normal activity, you would schedule the manual update every 2 hours.
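
    Something along these lines (dbo.YourTable is a placeholder) would be the manual update you'd schedule:

        -- FULLSCAN gives the vendor their 100% sample; NORECOMPUTE marks
        -- the stats so auto update statistics skips them from now on
        UPDATE STATISTICS dbo.YourTable WITH FULLSCAN, NORECOMPUTE;

        -- Confirm the flag took effect
        SELECT name, no_recompute
        FROM   sys.stats
        WHERE  object_id = OBJECT_ID(N'dbo.YourTable');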

    Jeffrey Williams
    “We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”

    ― Charles R. Swindoll

    How to post questions to get better answers faster
    Managing Transaction Logs
