Analyzing Memory Requirements for SQL Server
Posted Monday, July 17, 2006 11:50 PM

Hi Ken,

I did not come to this conclusion on my own. It came out of a discussion with SQL Server MVP Vinod Kumar at a recent Tech-Ed session, as I explained in the previous post.

Thanks,

~Arindam. 

Post #295033
Posted Tuesday, July 18, 2006 12:41 AM

Hey, I have heard about this rule too.

It is NOT a hard and fast rule. What it means is: if you already have more RAM than your DB size, then adding more RAM may not help!
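
If you want to see where your own instance stands relative to that rule, a minimal sketch (assuming SQL Server 2005 or later, where sys.master_files is available) is to total the data files per database and compare against installed RAM:

-- size is reported in 8 KB pages, so SUM(size) * 8 / 1024 gives MB per database
SELECT DB_NAME(database_id) AS database_name,
       SUM(size) * 8 / 1024 AS data_file_mb
FROM sys.master_files
WHERE type_desc = 'ROWS'   -- data files only, not log files
GROUP BY database_id
ORDER BY data_file_mb DESC;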

Do you also want to know how many DBs I have worked on?

Post #295039
Posted Wednesday, July 30, 2008 7:23 AM
Does the "Process\Working Set" counter include AWE memory utilization?
I don't think it does.

On my machine it reaches about 700,000,000 bytes (roughly 700 MB), which is way below the 13.2 GB allocated using AWE.
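
For what it's worth, the AWE-mapped buffer pool does show up in SQL Server's own Memory Manager counters rather than in the Windows Process\Working Set counter. A minimal sketch for checking it from T-SQL (assuming SQL Server 2005 or later, where sys.dm_os_performance_counters is available):

-- Total Server Memory includes buffer pool pages mapped through AWE;
-- these counters are reported in KB, hence the division to get MB.
SELECT counter_name,
       cntr_value / 1024 AS value_mb
FROM sys.dm_os_performance_counters
WHERE object_name LIKE '%Memory Manager%'
  AND counter_name IN ('Total Server Memory (KB)', 'Target Server Memory (KB)');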


__________________________________________________________________________________

Turbocharge Your Database Maintenance With Service Broker: Part 2
Turbocharge Your Database Maintenance With Service Broker: Part 1
Real-Time Tracking of Tempdb Utilization Through Reporting Services
Monitoring Database Blocking Through SCOM 2007 Custom Rules and Alerts
Preparing for the Unthinkable - a Disaster/Recovery Implementation
Post #543440
Posted Wednesday, July 30, 2008 7:26 AM
Ken Shapley (7/17/2006)


"Generally, the rule of thumb is to have as much as RAM as your data file is."

Did you come up with this rule yourself? How many databases have you worked with in your career?


This does make sense. It may not be feasible with today's technology and the current cost of buying new RAM, but it will likely be reality in a few years' time. The 64-bit platform was an exotic curiosity a few years back, but not any more.


Post #543446
Posted Thursday, September 3, 2009 11:01 AM
Vladan (7/17/2006)
I am a bit surprised by some things in this article... for example:

"Generally, the rule of thumb is to have as much as RAM as your data file is."


You are right to be "surprised" (I was too). There are so many things peculiar about that statement:

  • What happens when you have a 500GB database?

  • What about when I have 200 50MB databases on my system?

  • What about databases with multiple data files?

  • How does FILESTREAM affect this recommendation?

I understand trying to simplify the planning process, and maybe that's what he was going for. It certainly can't hurt to have that much RAM, but I've never seen that as a recommendation from anyone before.

What we care about is maintaining enough RAM to cover connections, plans, and the buffer cache (generally speaking). Saying that "the rule of thumb is to have as much as RAM as your data file is" suggests that SQL Server will load your entire database into memory and serve it from RAM - which it won't (at least not without you querying/using the actual data).
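
For what it's worth, a rough way to see how much of each database the buffer cache is actually holding (as opposed to guessing from file sizes) is the buffer descriptor DMV; a minimal sketch, assuming SQL Server 2005 or later:

-- Each buffer page is 8 KB, so COUNT(*) * 8 / 1024 gives MB cached per database
SELECT DB_NAME(database_id) AS database_name,
       COUNT(*) * 8 / 1024 AS cached_mb
FROM sys.dm_os_buffer_descriptors
GROUP BY database_id
ORDER BY cached_mb DESC;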

RAM recommendations need to be made per app+db - OLAP, OLTP, or a combination. IMO it's hard to guesstimate what an unknown database's memory requirements will be. If you don't tell me how big it is, what it is used for, how it is loaded, how many users there are at launch vs. one year later, etc., I don't know anyone who can accurately provide that number.

I'll give you an example - I asked earlier, "What happens when you have a 500GB database?" Can you estimate the memory requirements for that database? I certainly can't. There are massively different needs depending on how it is used, etc. Maybe you do need 500GB of RAM - I don't have a clue.


========================================================

I have about 1,000 video tutorials on SQL Server 2008, 2005, and 2000 over at http://www.learnitfirst.com/Database-Professionals.aspx
Post #782388
Posted Wednesday, October 2, 2013 3:41 PM
There is more to consider besides the database size. Sometimes, even with a small database, you can build a query that produces a large amount of data, so you also have to consider how your stored procedures and queries are written.
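
To illustrate with a purely hypothetical example (the table names below are made up): even two tiny tables can force a huge memory grant if the query multiplies them out.

-- Two 10,000-row tables cross joined produce 100,000,000 rows;
-- sorting that intermediate result needs a large memory grant
-- even though the underlying data files are tiny.
SELECT a.OrderID, b.ProductID
FROM dbo.Orders AS a
CROSS JOIN dbo.Products AS b
ORDER BY a.OrderID, b.ProductID;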
Post #1500944