Statistics with NORECOMPUTE

  • Evening All,

    When would one typically specify NORECOMPUTE on statistics?

    Presumably when the data is changing dramatically in a way that could skew the distribution? If that assumption is right, and I've decided that a poor plan was generated by skewed statistics, how would I then 'fix' the statistics? By that, what I mean is: if I were to specify NORECOMPUTE now, whilst I know it's pushing bad plans, won't NORECOMPUTE keep those bad plans coming?

    If I drop and recreate the statistics with NORECOMPUTE, there is no guarantee that it will create good stats this time around, and I'll still be stuck with dodgy ones?

    So now I've demonstrated my ignorance and lack of understanding, can anyone recommend some bedtime reading on this?

    Cheers
    Alex

  • alex.sqldba - Wednesday, March 29, 2017 12:45 PM

    When would one typically specify NORECOMPUTE on Statistics? ... If I were to specify NORECOMPUTE now whilst I know it's pushing bad plans, the NORECOMPUTE will keep those bad plans coming?

    What NORECOMPUTE does is create new statistics once, and then disable further automatic updates to them. This is the kind of thing you would do with data that is known to be stable and is either being archived or going into a read-only data warehouse. Because you're going to get up-to-date stats out of the process, you should get good plans until the data starts to change. I suspect it was simply a good option to provide for those READ ONLY sets of tables where a once-and-done stats update is all you need.
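    A minimal sketch of that once-and-done pattern (the table name here is illustrative, not from the thread):

```sql
-- Refresh the statistics one last time with a full scan, then flag them
-- NORECOMPUTE so the automatic update mechanism leaves them alone afterwards.
UPDATE STATISTICS dbo.ArchivedOrders
WITH FULLSCAN, NORECOMPUTE;
```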

    Steve (aka sgmunson) 🙂 🙂 🙂
    Rent Servers for Income (picks and shovels strategy)

  • Steve, cheers for that! I had not considered read-only workloads for a second...

    Brilliant, thanks again!

    Alex

  • sgmunson - Wednesday, March 29, 2017 1:07 PM

     I suspect it was just a good option to provide for those READ ONLY sets of tables where you have a once-and-done stats update option.

    But if a table is read-only, then the stats will never be invalidated and hence never updated.

    alex.sqldba - Wednesday, March 29, 2017 12:45 PM

    When would one typically specify NORECOMPUTE  on Statistics?

    The only time I can think of is when I absolutely do not want an automatic, sampled stats update: I always want the stats to be updated with a full scan, I have a job that does that often enough for the queries, and I don't want to risk an automatic stats update sneaking in somewhere unexpectedly.
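    Something like the following job step would express that pattern (the table and statistics names are hypothetical):

```sql
-- Scheduled full-scan update; NORECOMPUTE stops a sampled automatic
-- update from replacing this full-scan histogram between job runs.
UPDATE STATISTICS dbo.BigTable IX_BigTable_SomeKey
WITH FULLSCAN, NORECOMPUTE;
```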

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • Right. So an automatic update is only ever a sampled scan, then?

  • Yes. On very small tables it'll sample 100%; as the size goes up, the sample percentage goes down.

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • Is there a DMV that will tell me the last sample rate used?

  • Don't think so; DBCC SHOW_STATISTICS does.
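    For example (the index name is borrowed from later in this thread; the table name is a guess), comparing Rows with Rows Sampled in the header gives the effective sample rate of the last update:

```sql
-- Rows Sampled / Rows in the STAT_HEADER output is the effective
-- sample rate, e.g. 42073 / 98756 (from this thread) is roughly 43%.
DBCC SHOW_STATISTICS ('dbo.Address', 'IX_ADDRESS_REFERENCE')
WITH STAT_HEADER;
```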

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • I am back with another ignorant question.

    How exactly does one read / interpret the results of SHOW_STATISTICS?

    This is sample output from the histogram (pardon the bad table representation):
    RANGE_HI_KEY  RANGE_ROWS  EQ_ROWS   DISTINCT_RANGE_ROWS  AVG_RANGE_ROWS
    10001         0           1         0                    1
    25255         444.6727    1         445                  1
    38687         319.9761    3.736643  320                  1
    337845        842.2902    1         830                  1.014884
    443143        449.3783    1         449                  1
    558154        599.9553    1         600                  1.000086

    Looking at this, it dawns on me that I could not tell a good result from a bad one?

    And how does the density value in the STAT_HEADER relate to that found in the DENSITY_VECTOR?

    DENSITY_VECTOR:
    All density   Average Length  Columns
    1.012792E-05  9               REFERENCE
    1.012597E-05  17              REFERENCE, BISUNIQUEID

    STAT_HEADER:
    Name                  Updated             Rows   Rows Sampled  Steps  Density    Average Key Length  String Index  Filter Expression  Unfiltered Rows
    IX_ADDRESS_REFERENCE  Mar 25 2017 6:29PM  98756  42073         173    0.9998568  17                  NO            NULL               98756

    Cheers
    Alex

  • The density in the stats header is useless; ignore it.

    You can't tell, just by looking at the histogram, whether your stats are good or not.
    EQ_ROWS is the number of rows that equal the high key. RANGE_ROWS is the number of rows between that high key and the previous one, exclusive on both sides.
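    Reading one step of the histogram posted above (the table and column names are hypothetical; the column is assumed to be the leading key of the statistic):

```sql
-- For the step with RANGE_HI_KEY = 38687:
-- EQ_ROWS = 3.736643 -> about 3.7 rows are estimated to equal 38687.
SELECT * FROM dbo.Address WHERE REFERENCE = 38687;

-- RANGE_ROWS = 319.9761 -> about 320 rows are estimated to fall strictly
-- between the previous high key (25255) and this one (38687).
SELECT * FROM dbo.Address WHERE REFERENCE > 25255 AND REFERENCE < 38687;
```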

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • Hmm, so what is considered a high key? Does it depend on the index and how it's ordered, firstly? And whether it's filtered?
  • alex.sqldba - Thursday, March 30, 2017 12:19 PM

    Hmm, so what is considered a high key? Does it depend on the index and how it's ordered, firstly? And whether it's filtered?

    The values in the column called RANGE_HI_KEY in the histogram.

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • alex.sqldba - Thursday, March 30, 2017 4:10 AM

    is there a DMV that will tell me the last sample rate used?

    GilaMonster - Thursday, March 30, 2017 4:22 AM

    Don't think so, DBCC Show_Statistics does.

    Gail is correct: there is no DMV that shows the last sample rate used. It would be a nice addition, but it's not there yet.

    Jason...AKA CirqueDeSQLeil
    _______________________________________________
    I have given a name to my pain...MCM SQL Server, MVP
    SQL RNNR
    Posting Performance Based Questions - Gail Shaw
    Learn Extended Events
