Question about updating statistics

  • We have very large tables in our database, some with even more than 1 billion records, so updating statistics takes many hours.

    Would it be a good idea to skip the tables where the ratio of modified records to the total number of records is less than 0.5 percent? The reasoning is that a statistics histogram has at most 200 steps, so a change that small should barely move the histogram. A quick way to check that ratio is sketched below.
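
    A rough sketch of how that ratio could be checked per statistic (dbo.BigTable is only a placeholder name; sys.dm_db_stats_properties is available from SQL Server 2008 R2 SP2 / 2012 SP1 onwards):

      -- Check modified-row counts per statistic object on the placeholder table.
      SELECT  s.name                     AS stats_name,
              sp.last_updated,
              sp.rows,
              sp.modification_counter,
              CAST(sp.modification_counter AS float) / NULLIF(sp.rows, 0) AS modified_ratio
      FROM    sys.stats AS s
      CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
      WHERE   s.object_id = OBJECT_ID(N'dbo.BigTable')
      ORDER BY modified_ratio DESC;
      -- Statistics whose modified_ratio stays under 0.005 would be the candidates to skip.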

  • That's an interesting case. I would still say the stats have to be updated in some way, since 0.5% of a billion rows is still a good 5 million rows. Which statistics to maintain on the table can be chosen quite precisely depending on a few factors, say, which queries hit the table most often, which columns appear in their predicates and select lists, and so on. A sketch of a targeted update follows below.

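    To make that concrete, a rough sketch of a targeted refresh rather than a blanket UPDATE STATISTICS on the whole table (the table, index and column names here are only placeholders):

      -- Refresh only a single, heavily used statistic, with an explicit sample
      -- instead of a full scan, to keep the runtime on a billion-row table down.
      UPDATE STATISTICS dbo.BigTable IX_BigTable_OrderDate WITH SAMPLE 5 PERCENT;

      -- Or create a dedicated column statistic for a frequently filtered column.
      CREATE STATISTICS st_BigTable_CustomerId
          ON dbo.BigTable (CustomerId)
          WITH SAMPLE 5 PERCENT;
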
  • Thanks for raising this question. Just checking—has the table been partitioned? With a table holding around a billion rows, I would guess it is part of a data warehouse. If so, then partitioning is something to look at.

    I am mainly trying to figure out what exact issue we are tackling. If partitioning is in place, maybe we can think about managing stats at the partition level. Also, what sampling option is being used here?

    I have come across a similar case earlier—this was on SQL Server 2022—and the stats were managed pretty well by the engine itself.

    If you could share a bit more detail, especially around what made you think the stats are not getting updated properly and what performance issue you noticed, that would help. It will be easier to narrow down the right solution once it is clear what exactly is going wrong.
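
    If partitioning is already in place, incremental statistics are one way to manage stats at the partition level. A very rough sketch, assuming SQL Server 2014 or later and placeholder object names:

      -- Incremental statistics keep a histogram per partition and merge them,
      -- so only the partitions that actually changed need to be refreshed.
      CREATE STATISTICS st_BigTable_OrderDate
          ON dbo.BigTable (OrderDate)
          WITH INCREMENTAL = ON;

      -- Later, resample just the partition(s) that received new data
      -- (partition number 42 is only an example).
      UPDATE STATISTICS dbo.BigTable (st_BigTable_OrderDate)
          WITH RESAMPLE ON PARTITIONS (42);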

    Thanks,
    Abhinav

