SQLServerCentral Editorial

What's a Normal SQL Server?


Many of us have worked in environments with a number of SQL Server instances: different versions, different editions, and often different hardware profiles. We deal with applications and software that place different demands and requirements on our database systems. Some of us are even dealing with multiple datastores, trying to integrate SQL Server with Oracle, PostgreSQL, Elasticsearch, Redis, and more.

Keeping an inventory of instances and patches can be a challenge. I hadn't thought this was a big problem, but I've been surprised by how many people like the Estate Management Versions page in SQL Monitor. It gets a lot of use as people work to keep track of their systems. With that in mind, I wonder how many people have to support the "average" SQL Server that Brent Ozar supports. He runs the SQL ConstantCare® service for his customers and regularly reports aggregate data from it every quarter.
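
If you just want a quick check of what a single instance is running, rather than a full estate-management tool, a minimal sketch using the standard SERVERPROPERTY values might look like this (ProductUpdateLevel can return NULL on older builds):

-- Quick version and patch-level check for one instance
SELECT
    SERVERPROPERTY('MachineName')        AS MachineName,
    SERVERPROPERTY('Edition')            AS Edition,
    SERVERPROPERTY('ProductVersion')     AS ProductVersion,    -- e.g. 13.0.5026.0
    SERVERPROPERTY('ProductLevel')       AS ProductLevel,      -- RTM, SP1, ...
    SERVERPROPERTY('ProductUpdateLevel') AS ProductUpdateLevel; -- CU level, NULL on older builds

Run that against each instance, or through a central management server, and you have the raw material for the same kind of version breakdown.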

In his latest report, which contains data from over 3,000 servers, we get a picture of what various organizations run for their instances. The version breakdown is what I expect, and I'm not surprised that 2016 is the most popular. It was a major release, with lots of changes. I think 2008 R2, 2014, and 2017 were relatively minor releases, with few changes and poor timing relative to other releases.

The data shows most people are on supported versions, and I think some of the licensing rules and price breaks lead many people to consider Software Assurance so they can upgrade once in a while. It also makes sense to me that many people manage relatively small databases, well under 125GB. If you do the math, 47% of these instances fall into those relatively small sizes. However, it is interesting to see that nearly 15% of his clients are over 1TB. That's quite a spread.
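
If you're curious where your own instances fall in that size distribution, a rough sketch that sums data and log file sizes per database from sys.master_files (size is stored in 8KB pages) would be:

-- Approximate per-database size in GB (data + log files)
SELECT
    DB_NAME(database_id) AS DatabaseName,
    CAST(SUM(size) * 8.0 / 1024 / 1024 AS DECIMAL(10, 2)) AS SizeGB
FROM sys.master_files
GROUP BY database_id
ORDER BY SizeGB DESC;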

With data sizes like this, you see lots of smaller hardware sizes. I don't know how this might compare with your organization, but it's interesting to think about how you might fit into these averages. Is your organization doing a better or worse job of giving you resources for your data size? Keep in mind, there isn't a correlation in the report, so you don't know whether a 30GB database has 4 cores or 24 cores.
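
If you want to put your own cores and memory next to those averages, sys.dm_os_sys_info has the basics (the physical_memory_kb column assumes SQL Server 2012 or later):

-- CPU and memory visible to this instance
SELECT
    cpu_count                        AS LogicalCPUs,
    physical_memory_kb / 1024 / 1024 AS PhysicalMemoryGB
FROM sys.dm_os_sys_info;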

I think having data like this is interesting. I do wish Microsoft would release more specific stats, like version counts and database sizes, along with hardware averages for those data sizes. I understand why it doesn't make sense for them to release that data, but I still wish they would let us know more. As Brent says, they tend to present on really recent technology, which many of us might not use. I'm sure many of you have a variety of instances, but you'll easily be able to see whether your spread of instances looks like Brent's client base.
