
Simulating memory pressure
Posted Wednesday, December 17, 2008 3:17 PM
Old Hand
Group: General Forum Members
Last Login: Tuesday, August 6, 2013 5:22 AM
Points: 391, Visits: 87
EdVassie (9/1/2008)
I don't think that the question gives enough detail to say that limiting SQL Server memory to 512 MB is the right answer.

Pretty sure you're right, and a 512 MB limit would still be too much.
Assuming you have at least 64 GB of RAM for a 1 TB database (usually more like 256 GB if the budget allows), shrinking the database by a factor of 1,000 should also shrink the memory limit by 1,000, so something in the range of 64 MB to 256 MB could be a useful limit.
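For reference, the memory cap being debated here is set with `sp_configure`. A minimal sketch (the 512 MB value is the one from the question; on a real test instance you would pick your scaled-down figure, and note that newer SQL Server versions enforce a floor on this setting):

```sql
-- Sketch only: cap the instance's memory at 512 MB on a TEST instance.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

EXEC sp_configure 'max server memory (MB)', 512;
RECONFIGURE;

-- Verify the running value
SELECT name, value_in_use
FROM sys.configurations
WHERE name = N'max server memory (MB)';
```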

If you are trying to tune a query against a TB-sized object you must ensure your test environment can recreate the conditions experienced by your full-sized database. Queries against a 1 GB object will behave differently to queries against a 1TB object.


You usually get very different query plans (depending on your query). And of course, if you don't have a single 1 TB data warehouse table, you will have some joins, so sorry, but this question and its answers are misleading. Even DBCC DROPCLEANBUFFERS only simulates cold disk access, and the storage subsystem on your desktop is different from the one on the server.
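The cold-cache simulation mentioned above is typically done like this on a test server (a sketch; the CHECKPOINT first flushes dirty pages so that DROPCLEANBUFFERS can actually empty the buffer pool):

```sql
-- Sketch: force a cold buffer cache before timing a query on a TEST server.
CHECKPOINT;                -- flush dirty pages so they become "clean"
DBCC DROPCLEANBUFFERS;     -- evict all clean pages from the buffer pool

SET STATISTICS IO ON;      -- report logical vs physical reads for what follows
-- ... run the query under test here ...
```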

... we did a lot of tests with large databases, also shrinking them 10x and 100x for development. Try it out and you'll be surprised by the results. And then, once you're sure what to do, simply move to another machine (from 4 cores to 8, from 8 to 16; from 8 GB RAM to 12 GB, to 24, to 32 GB) and you'll very likely be surprised again. And don't forget: switch from an Intel to an AMD platform and you'll see new plan variations.

So, if you simply dive into a 2GB desktop machine and hope to tune problems found in a TB-sized object, your solutions are at best suspect and at worst counterproductive.


I strongly agree with this last sentence. (But as for the rest written before: the question was about simulating memory pressure, so I suggest 512 MB is too much to achieve this. We cannot tune a database here in the forum.)
Post #621662
Posted Saturday, December 18, 2010 12:56 PM
SSCertifiable
Group: General Forum Members
Last Login: Today @ 6:51 PM
Points: 7,702, Visits: 9,432
I don't like any of the answers offered. The one I think comes nearest to being sensible is to increase the test DB size (or rather, not to make such a stupidly small test DB in the first place), but as has already been said, a much better solution is to use a full-size DB on a test server capable of handling it. Anyway, a stored proc that is handling a significant proportion of a terabyte of data with a shortage of memory is going to be doing a good deal more than 1 GB of IO, and there's probably no imaginable way that the same SP handling a similar proportion of 1 GB with 0.5 GB available is going to do anything like the same amount of IO. So reducing the memory available to SQL Server on the test machine to 0.5 GB is not going to give you a clue about the performance of the production system on the full-size DB, and I don't at all like the "correct" answer.

Tom
Post #1036916
Posted Saturday, December 18, 2010 2:59 PM
SSCertifiable
Group: General Forum Members
Last Login: Today @ 1:37 PM
Points: 6,007, Visits: 8,270
Tom, thanks for your comment. I mostly agree with what you say.

This was one of my first questions here, and not one I am very proud of anymore. The idea was to educate people about a possible use of the max server memory setting that I had come across while preparing a talk about query optimization, where I wanted to demonstrate cached vs non-cached IO without having to create an immense test database (because I was running out of free hard disk space).
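That demonstration can be sketched roughly as follows (the table name `dbo.BigTable` is hypothetical; the point is that with max server memory capped below the table's size, even a repeated run still incurs physical reads, whereas a table that fits in the cap shows logical reads only on the second run):

```sql
-- Sketch: observe cached vs non-cached IO under a low memory cap.
SET STATISTICS IO ON;

SELECT COUNT(*) FROM dbo.BigTable;   -- first run: physical reads (cold cache)
SELECT COUNT(*) FROM dbo.BigTable;   -- second run: still physical reads if the
                                     -- table exceeds the memory cap; logical
                                     -- reads only if it fits in cache
```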

But rereading the question and answer options now, I must admit that I somewhat disagree with myself.



Hugo Kornelis, SQL Server MVP
Visit my SQL Server blog: http://sqlblog.com/blogs/hugo_kornelis
Post #1036924
Posted Thursday, August 28, 2014 1:48 PM
SSCrazy
Group: General Forum Members
Last Login: Today @ 7:30 AM
Points: 2,947, Visits: 243
This seemed the most logical.
Post #1608406