Asynchronous statistics updates

  • I am considering turning this option on.  We have very large databases (several TBs) that contain large tables (100,000,000 rows) that are highly OLTP and we think this will benefit us.  However, I'd like to capture a metric before and after the change to measure the improvement.  What metric would this be?  Average compilation time?  And how would I capture it?

    Thanks in advance.

  • Mick Opalak - Friday, November 10, 2017 12:45 PM


Hi

You can measure compile time (see the link below), but I'm not sure I would run that query on a production system, as it has to scan the entire plan cache, which is a little expensive. I also think you'll find it difficult to get meaningful data from it.

    https://www.sqlskills.com/blogs/jonathan/identifying-high-compile-time-statements-from-the-plan-cache/
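If you do want to sample compile times, a query along these lines (a sketch only; the `TOP` value and filter are arbitrary, and the plan-cache scan is still expensive on a busy server) pulls the compile-time attributes out of the cached plan XML:

```sql
-- Sketch: extract statement compile times from the plan cache.
-- Scanning sys.dm_exec_cached_plans is expensive; run off-hours
-- or against a non-production copy.
WITH XMLNAMESPACES (DEFAULT 'http://schemas.microsoft.com/sqlserver/2004/07/showplan')
SELECT TOP (25)
    qp.query_plan.value('(//QueryPlan/@CompileTime)[1]', 'int') AS compile_time_ms,
    qp.query_plan.value('(//QueryPlan/@CompileCPU)[1]',  'int') AS compile_cpu_ms,
    cp.usecounts,
    st.text AS statement_text
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_query_plan(cp.plan_handle) AS qp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle)   AS st
WHERE cp.cacheobjtype = 'Compiled Plan'
ORDER BY compile_time_ms DESC;
```

A cheaper alternative for a before/after baseline is to track the "SQL Compilations/sec" counter in `sys.dm_os_performance_counters` over time rather than scanning the cache.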

Personally, I would only look at making this change if the time taken to update statistics is causing application timeouts, or if users are complaining of poor performance for specific queries. You also have the option to disable automatic updates on the large tables and then handle statistics updates via a SQL Agent job.
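For reference, the two options above look roughly like this (database and table names are placeholders):

```sql
-- Option 1: let auto-update statistics run asynchronously, so queries
-- don't wait on the update to finish (database-level setting).
ALTER DATABASE [YourDatabase] SET AUTO_UPDATE_STATISTICS_ASYNC ON;

-- Option 2: stop automatic updates on a specific large table and
-- refresh its statistics yourself from a SQL Agent job.
UPDATE STATISTICS dbo.YourBigTable WITH FULLSCAN, NORECOMPUTE;

-- Scheduled job step: NORECOMPUTE must be repeated here, otherwise
-- the manual update re-enables automatic recomputation.
UPDATE STATISTICS dbo.YourBigTable WITH SAMPLE 10 PERCENT, NORECOMPUTE;
```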
