Analyzing Memory Requirements for SQL Server
Posted Tuesday, July 11, 2006 10:27 AM
Grasshopper

Comments posted to this topic are about the content posted at http://www.sqlservercentral.com/columnists/aBanerjee/analyzingmemoryrequirementsforsqlserver.asp
Post #293556
Posted Monday, July 17, 2006 6:03 AM
SSCommitted


I am a bit surprised by some things in this article... for example:

"Generally, the rule of thumb is to have as much as RAM as your data file is."

Is it really so? Isn't there some limit from where this rule does not apply? Our data file has over 200 GB, but I have never seen a server with that much memory. Obviously, it would be terribly expensive... I don't think databases around this size are rare, so what is your opinion about their memory requirements? We are running this DB on a box with 8GB RAM; I suppose it would work better with more memory, but the performance (with 20-40 concurrent users) is quite good.




Post #294772
Posted Monday, July 17, 2006 7:18 AM
Grasshopper

Good article! I could learn some new things from it.
Post #294788
Posted Monday, July 17, 2006 8:56 AM
Grasshopper


Hi Vladan,

The general idea is to get the data into the cache whenever the need arises. It certainly does not mean that if I have a 1 TB data file I would have to have that much RAM, but when I have a database with frequent performance problems and I want to troubleshoot the issue, I would certainly look at the rule of thumb. If I have an 8 GB data file and 8 GB of memory, I would assume that the memory is OK and move on to the other aspects, like CPU, network or disk, for troubleshooting.
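
As a quick sanity check against the rule of thumb, something along these lines should work on SQL Server 2005 to compare the total data file size with the configured memory ceiling (these are the standard catalog view and option names; adjust for your environment):

-- total size of all data files on the instance, in MB (size is stored in 8 KB pages)
SELECT SUM(CAST(size AS bigint)) * 8 / 1024 AS total_data_file_mb
FROM sys.master_files
WHERE type_desc = 'ROWS';

-- current memory ceiling configured for the instance, in MB
SELECT value_in_use AS max_server_memory_mb
FROM sys.configurations
WHERE name = 'max server memory (MB)';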

On the other side, I have given some perfmon counters which will help you identify whether you really require additional memory. The rule of thumb is just there to give you a hint of what to look for when you do not have adequate time for investigation.
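
If perfmon is not handy, the same kind of counters can also be read from inside SQL Server 2005 through the DMVs; the names below are the usual memory-related Buffer Manager and Memory Manager counters, which may differ slightly from the exact list in the article:

-- snapshot of the common memory-pressure counters exposed by SQL Server itself
SELECT [object_name], counter_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name IN ('Page life expectancy',
                       'Buffer cache hit ratio',   -- must be divided by its 'base' counter to get a percentage
                       'Total Server Memory (KB)',
                       'Target Server Memory (KB)');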

Hope this clarifies the issue.

Thanks for taking time to read and give your valuable feedback.

~Arindam.

Post #294862
Posted Monday, July 17, 2006 9:01 AM
Grasshopper


Hi Ravi,

Thanks for your time. Glad to know that it added to your knowledge. I hope to write more on capacity management; please check those articles too.

Thanks,

~Arindam.

Post #294866
Posted Monday, July 17, 2006 9:08 AM
Grasshopper


I have read both your articles; the second one is more useful for me. I will be waiting for future articles from you.

Thank you.

Post #294871
Posted Monday, July 17, 2006 10:46 AM
Grasshopper

Ravi, thanks for the time in writing this. However, I would need some more data for the article to really hit the spot, to be honest. For instance, how much memory does the SQL Server kernel actually take (just the core functionality, disregarding any user DB)? And how much do instances take? I know it's wrong to assume that 2 instances with the exact same data and load will take 2x the memory. And would you be able to quantify how much improvement (if any) can be expected from running SQL Server as a background process in an OLTP scenario? I really don't see a correlation here; you might be assuming something I am not.

Totally agree with you on collecting data for baselines.
Thanks for posting the article; you're a braver man than me.
SNMF
Post #294908
Posted Monday, July 17, 2006 10:57 AM
Grasshopper


>>Ravi, thanks for the time in writing this. However, I would

Sergio, I think you are talking to the wrong guy! It should be ...Arindam....

Post #294914
Posted Monday, July 17, 2006 11:17 AM
Grasshopper


Hi Sergio,

Thanks for reading this article. The first question you have is really a tricky one. I may not be able to comment accurately on it because I did not find any dependable resource, but on my server it takes approximately 80-150 MB for the kernel to run. However, you can adjust MIN SERVER MEMORY and MAX SERVER MEMORY and see how much memory allocation is optimum for your requirement. Once again, this comment is based solely on my observation and may differ from the actual figures.
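
If it helps, this is roughly how those two settings are adjusted with sp_configure; the 1024/6144 values below are only placeholders, so pick numbers that suit your own server:

-- both options are advanced, so expose them first
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- example values only: 1 GB floor and 6 GB ceiling for the buffer pool
EXEC sp_configure 'min server memory (MB)', 1024;
EXEC sp_configure 'max server memory (MB)', 6144;
RECONFIGURE;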

For the second question, every connection will take 64 bytes of memory. It is not wrong to assume that 2 instances connected to the same server and database will consume 2x the memory! Actually, if you have n instances open against any server, the server will reserve 64n bytes of memory. This brings us to the important consideration of the connection memory utilised, and we do have a counter for that too in our favourite perfmon!
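
That connection-memory counter can also be pulled from inside SQL Server 2005 with something like the query below; the LIKE on object_name is there because the prefix differs between default and named instances:

-- memory currently used for connections, as reported by the Memory Manager object
SELECT counter_name, cntr_value AS connection_memory_kb
FROM sys.dm_os_performance_counters
WHERE [object_name] LIKE '%:Memory Manager%'
  AND counter_name = 'Connection Memory (KB)';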

The answer to the third question is to leverage your operating system architecture. In the Windows operating system, we can run an application as a foreground process or a background process. If we have an application that is not used interactively in the foreground, it is convenient for both the OS and the application to run it in the background. If you want to run any foreground process on that server other than SQL Server, the OS will be more efficient at managing the memory. Once again, I am not in a place to comment on the actual benefit, but I can certainly say that it would be apt for the process and you would get the benefit in certain scenarios, as discussed above.

Hope this clarifies some part of your doubts. Please do write if you have more questions.

Sincerely,

~Arindam (not Ravi)  

Post #294924
Posted Monday, July 17, 2006 12:56 PM
SSC Rookie

 

"Generally, the rule of thumb is to have as much as RAM as your data file is."

Did you come up with this rule yourself? How many databases have you worked with in your career?

Post #294951