August 23, 2006 at 2:59 pm
Is there a general "rule of thumb" that for every gigabyte of database there should be so many megabytes of RAM?
August 23, 2006 at 11:16 pm
Try to keep your RAM equal to your DB size; this will keep I/O paging to a minimum.
If that is not possible, run Performance Monitor and watch the Buffer Cache Hit Ratio counter. If it stays above 99%, you are good to go; if it drops below 90%, you need to add more RAM.
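As a side note, SQL Server also exposes this counter through `sys.dm_os_performance_counters` as two raw values: "Buffer cache hit ratio" and "Buffer cache hit ratio base"; the percentage is the first divided by the second. Here is a minimal Python sketch (function names are my own, not a real API) that computes the percentage from those two raw counters and applies the thresholds above:

```python
# Sketch: SQL Server reports "Buffer cache hit ratio" as two raw perfmon
# counters (a value and its base); the percentage is value / base * 100.

def buffer_cache_hit_pct(hit_ratio: int, hit_ratio_base: int) -> float:
    """Compute the Buffer Cache Hit Ratio percentage from the raw counters."""
    if hit_ratio_base == 0:
        return 0.0  # no page reads sampled yet
    return 100.0 * hit_ratio / hit_ratio_base

def ram_advice(pct: float) -> str:
    """Apply the rule of thumb above: >99% is fine, <90% means add RAM."""
    if pct > 99.0:
        return "good to go"
    if pct < 90.0:
        return "add more RAM"
    return "keep watching"

print(ram_advice(buffer_cache_hit_pct(9950, 10000)))  # 99.5% -> good to go
```

The raw counter values would come from querying `sys.dm_os_performance_counters` on the server; the logic here just shows how the two numbers combine.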
August 24, 2006 at 6:36 am
I don't think there's a good rule of thumb for this. With database sizes ranging up to terabytes, there's no way RAM can be made to match. I've managed everything from 5 MB databases to 1.5 TB ones.
I don't know if there's a rule of thumb that works within certain ranges, maybe someone else will post a useful one.
I think workload plays a more important part: how many users are connecting, what types of queries they are running, and so on.
August 24, 2006 at 9:35 am
Here is a good place to start; there is a segment on RAM that will get you moving in the right direction.
Good luck!
http://www.sql-server-performance.com/sql_server_performance_audit3.asp