SQLServerCentral Article

Analyzing Memory Requirements for SQL Server

In my previous article I talked about analyzing disk capacity for a SQL Server box. In this article, we will look at memory requirements for SQL Server.

When estimating memory requirements, we have to look at various aspects of SQL Server as well as the operating system. Memory will not be used by SQL Server alone, but also by other services, processes and the operating system itself. Below is a series of questions and areas to check before you can zero in on determining and forecasting memory requirements for SQL Server.

  • Check the average free space and handling of memory in SQL Server.
  • Determine memory usage of SQL Server.
  • What is the size of the database(s)?
  • How much memory is getting used by SQL Server connections?
  • How many SQL Server instances are running on the Server?
  • How many users are accessing the database?
  • What is the database growth and how much memory should it consume in that way?
  • What is the baseline for present memory usage?

Check the average free space and handling of memory in SQL Server

As I mentioned earlier, determining memory for a SQL Server box is not only a matter of checking how much memory SQL Server itself requires, but how much memory the box as a whole requires. A couple of perfmon counters that come to mind are:

  • Memory: Pages/sec
  • Memory: Available Bytes

Memory: Pages/sec measures how many pages are read from or written to disk because of hard page faults. A hard page fault means the data has to be fetched from the hard disk rather than from memory, which adds latency, so this counter should be as low as possible.

Memory: Available Bytes simply indicates how many bytes of physical memory are available. If the counter is high, your server is working fine, but check it while the server is under the normal load it is expected to handle.
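If you prefer to check the overall memory state from inside SQL Server rather than from perfmon, a minimal sketch is below. Note the assumption: the sys.dm_os_sys_memory view exists only from SQL Server 2008 onwards, so on SQL Server 2005 you would capture these two counters directly in perfmon instead.

    -- Sketch: overall server memory state from inside SQL Server.
    -- Assumes SQL Server 2008 or later (sys.dm_os_sys_memory is not in 2005).
    SELECT total_physical_memory_kb,
           available_physical_memory_kb,
           system_memory_state_desc
    FROM sys.dm_os_sys_memory;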

Determine memory usage of SQL Server

We have talked about overall server memory; now we will look at the SQL Server memory usage pattern. Perfmon exposes counters specifically for SQL Server memory usage. The SQL Server: Memory Manager: Total Server Memory counter tells you the amount of dynamic memory the instance is currently consuming. You can also look into:

  • SQL Server: Buffer Manager: Buffer cache hit ratio
  • SQL Server: Buffer Manager: Page Life Expectancy
  • Process: Working Set

SQL Server: Buffer Manager: Buffer cache hit ratio should be above 90% for a healthy SQL Server.

SQL Server: Buffer Manager: Page Life Expectancy should stay above 350. If it is lower than, or hovering around, the 350-400 mark and you are expecting significant database growth, be ready to buy more memory for your server.

Process: Working Set tells you how much memory SQL Server is consuming. If it is well below what you have configured with MIN SERVER MEMORY, you have allocated more memory than the server needs; if it exceeds MAX SERVER MEMORY, then look at buying more memory. In the figure below, you can see that the upper perfmon capture shows warning signs in page life expectancy even though the buffer cache hit ratio is above 90. The lower capture shows how things improved after adding more memory, with a higher page life expectancy than before.
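As a hedged sketch, assuming SQL Server 2005 or later, the SQL Server counters above can also be read from inside the engine through the sys.dm_os_performance_counters view, without opening perfmon at all. On a named instance the object name starts with MSSQL$<InstanceName>: rather than SQLServer:.

    -- Sketch: read the memory counters discussed above from inside the engine.
    -- Buffer cache hit ratio must be computed as (ratio / ratio base) * 100.
    SELECT RTRIM([object_name]) AS [object_name],
           RTRIM(counter_name)  AS counter_name,
           cntr_value
    FROM sys.dm_os_performance_counters
    WHERE counter_name LIKE 'Total Server Memory%'
       OR counter_name LIKE 'Target Server Memory%'
       OR counter_name LIKE 'Page life expectancy%'
       OR counter_name LIKE 'Buffer cache hit ratio%';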

If you are using SQL Server 2005, then you can use Dynamic Management Views (DMVs), which give more insight into the problem. I will talk more about DMVs in coming articles.
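As a small taste, here is a sketch of one such DMV query, listing the top memory consumers by clerk type. Note the assumption on column names: single_pages_kb and multi_pages_kb apply to SQL Server 2005/2008; from SQL Server 2012 onwards they are replaced by a single pages_kb column.

    -- Sketch: top memory consumers by clerk type (SQL Server 2005/2008 columns).
    SELECT TOP (10)
           [type],
           SUM(single_pages_kb + multi_pages_kb) AS total_kb
    FROM sys.dm_os_memory_clerks
    GROUP BY [type]
    ORDER BY total_kb DESC;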

What is the size of the database?

Knowing the size of your database is a must for a DBA to estimate how much memory is required to support that data. Generally, the rule of thumb is to have as much RAM as the size of your data file. If you have a 4 GB data file, then 4 GB of memory will be fine for your application. Not all of your data file will sit in primary memory, but you also have to account for indexes, tempdb, the plan cache, the procedure cache, user connection memory and so on.
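A minimal sketch for checking data and log file sizes per database, assuming SQL Server 2005 or later, is below. sys.master_files reports size in 8 KB pages, so the query converts it to MB.

    -- Sketch: data and log file size per database, to compare against RAM on the box.
    SELECT DB_NAME(database_id) AS database_name,
           type_desc,
           SUM(size) * 8 / 1024 AS size_mb
    FROM sys.master_files
    GROUP BY database_id, type_desc
    ORDER BY database_name, type_desc;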

How much memory is getting used by SQL Server connections?

SQL Server: Memory Manager: Connection Memory (KB) gives a hint of how much memory SQL Server is using just to maintain connections. If your application is growing in number of users, this counter helps you relate user growth to memory requirements. A tip here: if your application is an OLTP one, users will most likely query it through a front end; in that case, run SQL Server as a background process. Conversely, if you have an OLAP application, run SQL Server as a foreground process.
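A quick sketch, assuming SQL Server 2005 or later, that reads the connection memory counter alongside the current number of open connections:

    -- Sketch: connection memory and the number of open connections right now.
    SELECT (SELECT MAX(cntr_value)
            FROM sys.dm_os_performance_counters
            WHERE counter_name LIKE 'Connection Memory%') AS connection_memory_kb,
           (SELECT COUNT(*)
            FROM sys.dm_exec_connections) AS open_connections;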

How many SQL Server instances are running on the Server?

If you are running multiple instances of SQL Server on the same box, be ready to estimate memory accordingly. More instances consume more memory and can put the server under memory pressure.

How many users are accessing the database?

If there are lots of users connecting to your database at the same time and firing ad-hoc queries, there will be a considerable amount of memory pressure on your system. Check SQL Server: Memory Manager: Connection Memory (KB) to see the memory needed to maintain connections on a normal business day. Store the readings over a reasonable period and look at how the server responded when the user load was heavy. From that you can determine whether you really need additional memory or not.
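If you also want to see who those users are, a minimal sketch against sys.dm_exec_sessions (SQL Server 2005 and later) is below.

    -- Sketch: user sessions per login and host, to see who is connected
    -- during a normal business day.
    SELECT login_name,
           host_name,
           COUNT(*) AS session_count
    FROM sys.dm_exec_sessions
    WHERE is_user_process = 1
    GROUP BY login_name, host_name
    ORDER BY session_count DESC;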

What is the database growth and how much memory should it consume in that way?

In my previous article I gave some formulae to identify and calculate the growth of your data file. Estimate your memory growth using the same formulae. Additionally, if your user base is growing at a 10% rate, then the connections to the server will grow at a similar rate. Be prepared to handle this, and be sure your server is prepared as well.

What is the baseline for present memory usage?

Last, but not least, you need a clearly defined baseline to identify your requirements. Gather the perfmon counter values for 2-3 months. If you already have reports from the past, study those and create a baseline for your server. This serves as a checklist for verifying that everything is in good shape and health.
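One simple way to build such a baseline is sketched below, under the assumption that you schedule the insert from a SQL Agent job and keep 2-3 months of history. The dbo.MemoryBaseline table name is purely illustrative.

    -- Sketch of a baseline capture: snapshot the memory counters into a table.
    -- The table name dbo.MemoryBaseline is an illustrative assumption.
    CREATE TABLE dbo.MemoryBaseline
    (
        capture_time DATETIME      NOT NULL DEFAULT GETDATE(),
        counter_name NVARCHAR(128) NOT NULL,
        cntr_value   BIGINT        NOT NULL
    );
    GO

    INSERT INTO dbo.MemoryBaseline (counter_name, cntr_value)
    SELECT RTRIM(counter_name), cntr_value
    FROM sys.dm_os_performance_counters
    WHERE counter_name LIKE 'Total Server Memory%'
       OR counter_name LIKE 'Connection Memory%'
       OR counter_name LIKE 'Page life expectancy%';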

Conclusion

Memory is a very important aspect to think about and manage. It becomes very predictable if handled properly, but can catch you off guard if you are not careful enough. There are certain best practices I would like to share, and they apply to everything in your server that you care about. If you are working on an existing system, get its historical data. Gather it across all circumstances and instances: best, worst and normal. Understand the trends of growth and usage. If you have to build a new system, don't forget to ask the vendor for a test machine of the type you are considering. Put some load on the server and test it out. If your business has 100 users, test it with 1000 and see how it performs. Decide what suits you best before going on a buying spree. Remember that components do not come cheap, and your business probably won't churn out money just because you asked for it.
