June 16, 2003 at 7:27 pm
We are running a web application that uses ADO to connect to a SQL Server database. The application performs decently with 30 client databases on a 4-proc server with 4 GB of RAM, and also performs decently with 60 client databases on a 4-proc server with 8 GB of RAM. The processor rarely goes above 50%, and neither I/O nor the network appears to be a bottleneck. Our goal is to run 100 clients on a 4-proc box while using only 4 GB of RAM (possibly 8). In each scenario, as the number of client databases increases, we have definitely noticed the buffer cache hit ratio start falling below 90%, and users then experience slower performance from the web app.
Obviously, increasing RAM from 8 to 16 GB looks like it would take care of the situation. However, adding an enormous amount of memory to a box that is barely utilizing its CPUs does not seem like it should always be the answer. What else can be tuned, monitored, or changed to achieve decent performance without increasing memory? Is there anything that stands out as the "smoking gun"?
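For reference, the cache hit ratio mentioned above is the Buffer Manager counter; a rough way to read it from T-SQL on SQL Server 2000 (just a sketch against the sysperfinfo view, not a fix in itself) is:

-- Sketch: read the buffer cache hit ratio from sysperfinfo (SQL Server 2000).
-- The raw counter has to be divided by its "base" counter to get a percentage.
SELECT (r.cntr_value * 100.0) / b.cntr_value AS [Buffer cache hit ratio %]
FROM   master.dbo.sysperfinfo r
       JOIN master.dbo.sysperfinfo b
         ON b.object_name = r.object_name
WHERE  r.object_name LIKE '%Buffer Manager%'
  AND  r.counter_name = 'Buffer cache hit ratio'
  AND  b.counter_name = 'Buffer cache hit ratio base'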
June 16, 2003 at 8:15 pm
Hi
The bulk of SQL Server's memory usage is the buffer cache. Therefore, the core aim of any DBA is to maximise the effectiveness of the cache. This has a twofold effect: 1) more reads are satisfied logically (from cache) rather than physically (from disk), and 2) cache "space" is used optimally and can service more applications with the available RAM with little or no performance hit.
The DBA has a few strategies, but all of them depend on being able to alter the SQL (which you can't in a lot of off-the-shelf apps). The key tool is Profiler, of course; then factor in issues such as ad hoc reporting, DSS-like queries, and pinning very common tables (take care with this in your case). It all takes some time to nut out, and at the hourly rate of DBAs, you've got to do the cost/benefit analysis...
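If you do go down the pinning route on SQL 2000, it is just a DBCC call per table; the database and table names below are only placeholders, and keep in mind that a pinned table never leaves the cache, so it can starve everything else out of it:

-- Sketch only: pin a small, very frequently read lookup table (SQL Server 2000).
-- "ClientDB" and "dbo.CommonLookup" are placeholder names.
USE ClientDB
GO
DECLARE @dbid int, @objid int
SET @dbid  = DB_ID()
SET @objid = OBJECT_ID('dbo.CommonLookup')
DBCC PINTABLE (@dbid, @objid)
-- To reverse it later:
-- DBCC UNPINTABLE (@dbid, @objid)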
Cheers
Ck
Chris Kempster
www.chriskempster.com
Author of "SQL Server Backup, Recovery & Troubleshooting"
Author of "SQL Server 2k for the Oracle DBA"
June 17, 2003 at 11:18 am
Chris has good advice. You can also take some of your long-running queries from Profiler and check them in Query Analyzer. See whether there are any physical reads; if not, more memory won't help, since you aren't going to disk anyway. Memory often helps, but it might not be the answer.
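The quickest way to see those reads in Query Analyzer is to turn on I/O statistics before running the captured statement (the SELECT below is just a stand-in for whatever Profiler gave you):

-- Show per-table logical and physical reads for the statement being tested.
SET STATISTICS IO ON
GO
-- Paste the long-running statement captured in Profiler here; this SELECT is a placeholder.
SELECT COUNT(*) FROM dbo.SomeLargeTable
GO
SET STATISTICS IO OFF
GO

If the Messages tab shows physical reads (and read-ahead reads) at or near zero, the data is already coming from cache, and more RAM won't buy you much.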
Steve Jones