SQLServerCentral is supported by Red Gate Software Ltd.

Memory

Not a sound from the pavement

Have you ever come across a SQL query that used to run faster?  Has that query just recently begun to run considerably slower?

I just ran into one of those situations.  I had a query that should have been running in seven minutes or less (string parsing across nearly 4 million records).  When it started running much slower, I pulled out some monitoring scripts and started checking a few things on the server.  I wanted to rule out the usual suspects first and then track down the cause of the alleged problems.

If you touch me you’ll understand what happiness is

When problems arise in a database, it is good to have a toolbelt of scripts available to help in quickly “touching” / assessing the problem.  In this case, I first turned to the following:

SELECT percent_complete, estimated_completion_time, cpu_time, total_elapsed_time,
	reads, writes, logical_reads,
	session_id, scheduler_id, start_time, status
	FROM sys.dm_exec_requests
go

My goal with this query is to quickly assess whether the query is still progressing.  For the query in question, I should see that it is steadily writing to the database.  I can also see the status of the query at a glance.  Running this in conjunction with sp_who2 is helpful in identifying, at a high level, issues that may be occurring.  Another option in this kind of situation is sp_whoisactive by Adam Machanic (Blog | Twitter), or the who2 script I wrote about here.  The idea at this point is to get a quick glimpse and see if there is a potential issue.
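To see the text of each active statement alongside its progress, the requests DMV can be joined to sys.dm_exec_sql_text.  A quick sketch (the session_id > 50 filter simply excludes most system sessions):

SELECT r.session_id, r.status, r.percent_complete,
	r.total_elapsed_time / 1000 AS elapsed_seconds,
	r.writes, r.logical_reads,
	t.text AS statement_text
	FROM sys.dm_exec_requests r
	CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) t
	WHERE r.session_id > 50
go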

In this case, I found nothing alarming or noteworthy from the initial queries.  Other queries could be run at this point to determine whether there is a pressure point on some resource (memory, CPU).  I decided to take a quick glance at Task Manager to get a visual of the server.  Looking at it, I saw that I had no memory available – all 96 GB were in use.  I proceeded to check the max memory setting on SQL Server.  As expected, it was still set at the default.  A little further prodding showed that the SQL Server process was consuming most of the memory available to the server.
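As a sketch, the max memory setting can be checked in T-SQL via sp_configure (the default for "max server memory (MB)" is 2147483647 – effectively unlimited):

EXEC sp_configure 'show advanced options', 1
RECONFIGURE
go
EXEC sp_configure 'max server memory (MB)'
go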

I changed the max memory to 64 GB and immediately saw improvements.  The effect was also visible in Task Manager.  No server restart or service restart was required; the change took effect immediately.  The poorly performing query was soon back to its normal performance.
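One way to make this change in T-SQL (it can equally be done through the server properties dialog in SSMS) is with sp_configure; RECONFIGURE applies it immediately, with no restart needed.  A sketch, with 64 GB expressed in MB:

EXEC sp_configure 'max server memory (MB)', 65536
RECONFIGURE
go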

I can smile at the old days

This should serve as a reminder to check your server settings from time to time and ensure they are appropriately set.  This one change (and it was the only change) allowed my query to return to normal processing – more than 5 times faster than during the problem period.  Happy troubleshooting.

Comments

Posted by ian stirk on 30 August 2010

Nice Article.

You might also be interested in other DMVs that can help identify performance problems.

Please see www.manning.com/stirk, the first chapter is available for FREE. It contains several useful scripts to identify your performance problems, and details of how they can be fixed, to significantly improve the performance of your SQL queries.

Thanks

Ian

Posted by John Sansom on 30 August 2010

Hi Jason, nice article. One thing that is not clear from reading your case, at least not to me, is why reducing the amount of memory enabled your query to perform better?

Was the query that experienced the performance drop using managed code such as CLR, or a linked server, for example?

Posted by Jeff Moden on 30 August 2010

Yep... BigJohn is asking the same question that's on my mind.  Reducing SQL Server memory to get better performance from a "tough" query seems contrary... is it simply because the operating system had a little more memory to handle all the disk writing the query was actually doing in this case?

Posted by jasona.work on 30 August 2010

I agree that reducing the memory sounds counter-intuitive.  Almost like removing a couple of spark plugs from your car's motor to get more acceleration.  But, on thinking about this, I could see where doing this might help.  Possibly, the query was running slow due to the OS not having enough RAM for "basic" functions.  So, it wasn't the query or SQL that was running slow, but the OS, which was slowing the query.

I now find myself wondering if some of the clients I support who occasionally have slow running SQL based apps might be helped by restricting SQL to less than the max RAM.  I'm inclined, at the moment, to think not, though, as I've looked when they've been reporting issues, and not seen the RAM maxed.

Jason A.

(NOT the article author!)

Posted by tony.hayes on 30 August 2010

In my experience, SQL Server is a resource pig, consuming everything it can if you let it, including all available memory.  It will chew up so much memory that the OS on the box will have to start paging data to the hard drive for everyday operating system chores.  There are a lot of good articles on the web about throttling the memory and processor threads on the server to ensure SQL doesn't step on the operating system.  Most of these changes will require a restart of the SQL services, however, so plan to do them outside of your production hours.

Posted by Mike Byrd on 30 August 2010

OK, I buy off on SQL Server being a memory hog and that maybe the OS is being starved.  What is the current rule of thumb for setting max SQL Server memory?  Or how much should be left for the OS (assuming no other apps are running on the server)?

Posted by Steve on 30 August 2010

Is there a specific percentage of total OS memory that we should set as a max for SQL Server to use?

Where do we set the max memory in MS SQL Server 2008?

Posted by Ion Freeman on 30 August 2010

Mike,

  I thought that was a good question, so I Googled it. Strictly, I, um, Binged it. Apparently, it depends if you have full text search running.

"Note  If you have installed and are running the Full-Text Search support (Microsoft Search service, also known as MSSearch), then you must set the max server memory option manually to leave enough memory for the MSSearch service to run. The max server memory setting must be adjusted in conjunction with the Windows NT 4.0 virtual memory size such that the virtual memory remaining for Full-Text Search is 1.5 times the physical memory (excluding the virtual memory requirements of the other services on the computer). Configure the SQL Server max server memory option so that there is sufficient virtual memory left to satisfy this Full-Text Search memory requirement. Total virtual memory - (SQL Server maximum virtual memory + virtual memory requirements of other services) >= 1.5 times the physical memory."

msdn.microsoft.com/.../aa196734(SQL.80).aspx

In any case, you have to know what else is running on the server.

"The max server memory server configuration option can be used to specify the maximum amount of memory SQL Server can allocate when it starts and while it runs. This configuration option can be set to a specific value if you know there are multiple applications running at the same time as SQL Server and you want to guarantee that these applications have sufficient memory to run. If these other applications, such as Web or e-mail servers, request memory only as needed, then do not set the max server memory server configuration option, because SQL Server releases memory to them as needed. However, applications often use whatever memory is available when they start and do not request more if needed. If an application that behaves in this manner runs on the same computer at the same time as SQL Server, set the max server memory server configuration option to a value that guarantees that the memory required by the application is not allocated by SQL Server."

msdn.microsoft.com/.../ms177455.aspx

Now, according to Jim McLeod, 32-bit SQL Server 2005 on Windows 2003 can't use more than 64 GB.

"The 32-bit versions of Windows Server 2003 support the following maximum memory limits: Standard Edition 4 GB, Enterprise Edition 32 GB, and Datacentre Edition 64 GB."

www.jimmcleod.net/.../sql-server-2005-and-awe-memory

We don't know what Jason B. was using here, but he uses a DMV available in 2005.

Posted by Jason Brimhall on 30 August 2010

Thanks guys for going back and forth on your Q and A.

BigJohn and Jeff - it was a T-SQL script.  All of the memory was being used by SQL Server, starving the OS of memory.  Thus, as JasonA said, the OS was unable to perform some basic operations (also, as Jeff suggested, likely the write operations managed by the OS).

Version of OS was 2008 64 bit.

Version of SQL was 2008 Enterprise

I normally include that info in my blogs - oops.

Posted by John Sansom (BigJohn) on 30 August 2010

As some posters have asked for this, here is a link to a post on SQL Server Central by Glenn Berry that details some good guidelines that can form the basis of your starting point when considering what you should set the MAX Memory setting to for SQL Server.

www.sqlservercentral.com/.../suggested-max-memory-settings-for-sql-server-2005_2F00_2008.aspx

Posted by Jason Brimhall on 30 August 2010

Thanks John for posting that.

Posted by SQL Noob on 30 August 2010

had something similar after upgrading to SQL 2005. had to drop statistics and rebuild indexes from scratch to make it right

Posted by Eduard on 31 August 2010

with SQL 2005 on a Win2003 box I had a dev box that was performing dramatically badly. Disks were stressed to the max. Setting max memory to less than it was using improved speed a lot; the swapping was gone. The cause was SQL's memory being paged out to virtual memory.

But don't forget the minimum memory setting. I had a misbehaving monitoring app that fired VBS scripts, and combined with a bad wscript/cscript engine it took 2+ GB of RAM every 5 minutes. There goes your caching. The minimum setting prevented SQL Server from being pushed to the sideline.

It can also be useful on machines where, due to circumstances, there is competition for resources on one machine.

Posted by J.D. Gonzalez on 31 August 2010

Why the reduce to 64GB and not 72GB (or any other amount)?  Did I miss something?

Posted by Jason Brimhall on 31 August 2010

JD - no specific reason for 64 GB.  I just picked 64 GB since it is a nice power of two (32, 64, 128, 256, 512, 1024).

Posted by Andeavour on 1 September 2010

I always try and set a MAX Memory on any SQL Server I build now using SQL 2008. I've seen a few cases on servers without one where SQL hogs up all the memory, not leaving enough for basic OS operations, then Windows begins forcing SQL to page out some of its processes and performance goes to hell in a handcart.

Leaving just a few GB for the OS usually fixes this/prevents it from happening.

Posted by hirro on 9 September 2010

Any thoughts on this issue:

96 GB of memory on Server 2008 R2, SQL 2008 R2

Left the default setting for memory (it is set to the maximum, all 96 GB).

-SQL process is using ~5GB with the DB loaded, lots of ram free according to task manager

Query takes 30 seconds to run.

When I drop the SQL memory setting down to 86 GB or lower, the SQL performance increases drastically (10-15 secs to run). If you look at Task Manager there is plenty of memory available and no paging going on. Can't make sense of why I'm getting better performance with lower memory allocated to SQL.

Thoughts?

Posted by grahamc on 10 September 2010

Having dealt with performance issues on a server at a previous company, and having brought MS in to help: the general recommendation for 64-bit systems is that 5 GB of memory should be given to the OS (i.e. taken from SQL Server). So a server with 32 GB of memory should have a 27 GB MAX for SQL Server. For your 96 GB, SQL should be limited to 91 GB.
