SQLServerCentral Article

Testing times


While working with Jonathan Kehayias on his Troubleshooting SQL Server book, I occasionally found myself poking him in the ribs, trying to encourage him to offer some best-practice advice on the 'optimal' values for various SQL Server configurations and options. For example, if your server has 64 GB of RAM, what's the optimal setting for max server memory? He was reluctant, based on the sure knowledge that many people don't test such advice in their environment – and, just as surely, there would be some environments for which his advice was wrong.

Everyone knows that the 'optimal' value of Buffer Cache Hit Ratio for a healthy SQL Server is over about 95%, right? Well, in this case the advice is not just wrong for some environments. It's just plain wrong. A bit of straightforward testing has proven that the BCHR is hugely ineffective at highlighting memory issues (due to the efficiency of SQL Server's read-ahead mechanism), and that it's entirely possible for a server to be under extreme memory pressure, and experiencing constant buffer cache churn, while still displaying an 'optimal' value for BCHR. I've been a BCHR-sceptic for some years, and this has put the final nail in this particular coffin, as far as I am concerned.

Occasionally, of course, testing will bear out the prevailing best practice advice. Most DBAs, for example, now recommend that if Hyper-threading is available on your server's processor then you should have it enabled, regardless of the composition of your workload (OLTP vs. DW/OLAP). It's good to see Linchi Shea back in the testing saddle over on sqlblog.com, investigating such recommendations with hard data. On the whole, his tests appear to back up this advice, although he is still uncovering particular cases where, on many-core servers, query throughput is higher without hyper-threading.

If you read his book, you will know that I did eventually persuade Jonathan to offer some specific advice on max server memory…for a 64 GB RAM server, he suggests a value of about 54 GB (based on reserving 1 GB of RAM for the OS, 1 GB for each 4 GB of RAM installed from 4–16 GB, and then 1 GB for every 8 GB RAM installed above 16 GB RAM). However, he was still nervous, and only included it with the proviso that he was allowed to stress, several times, the need to tune the value in your own environment.
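For the curious, Jonathan's reservation rule can be sketched as a small calculation (the function name and structure are mine; the rule itself is as stated above, and — as Jonathan stresses — the result is only a starting point for tuning in your own environment):

```python
def suggested_max_server_memory_gb(total_ram_gb):
    """Sketch of the reservation rule described in the text:
    reserve 1 GB for the OS, 1 GB per 4 GB of RAM from 4-16 GB,
    then 1 GB per 8 GB of RAM above 16 GB."""
    reserved = 1.0  # for the OS
    if total_ram_gb > 4:
        reserved += (min(total_ram_gb, 16) - 4) / 4
    if total_ram_gb > 16:
        reserved += (total_ram_gb - 16) / 8
    return total_ram_gb - reserved

# For a 64 GB server: 1 + 3 + 6 = 10 GB reserved, leaving 54 GB
print(suggested_max_server_memory_gb(64))  # 54.0
```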

Databases are unique in IT in that they are singularly resistant to 'best practice' advice. It is useful, I think, for DBAs and devs to feel they can offer such advice where they can, based on experience, but on the understanding that what they are offering is not the "truth", or a hard-and-fast rule, but just a sensible starting position for tests in your own environment.

Cheers,

Tony.
