SQLServerCentral is supported by Redgate
 
Simulating memory pressure


Ralf Dietrich
SSC-Addicted (417 reputation)
Group: General Forum Members
Points: 417 Visits: 87
EdVassie (9/1/2008)
I don't think that the question gives enough detail to say that limiting SQL Server memory to 512 MB is the right answer.

Pretty sure you're right, and of course a 512 MB limit would be too much. Assuming that you have at least 64 GB of RAM for a 1 TB database (usually more like 256 GB if the budget allows), shrinking the DB by 1:1000 should also shrink the memory limit by 1:1000 ==> 64 MB to 256 MB could be a useful limit.

If you are trying to tune a query against a TB-sized object you must ensure your test environment can recreate the conditions experienced by your full-sized database. Queries against a 1 GB object will behave differently to queries against a 1 TB object.


Usually you get quite different query plans (depending on your query). And of course, if you don't have a 1 TB data warehouse table, you will have some joins - sorry, but this question and the answers are misleading. (Even DBCC DROPCLEANBUFFERS will only simulate hard disk access, BUT the storage subsystem on your desktop is quite different from the one on the server.)
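For reference, the usual cold-cache sequence looks like this (a sketch; DBCC DROPCLEANBUFFERS evicts clean pages for the whole instance, so use it on test systems only):

```sql
-- Flush dirty pages to disk first so DROPCLEANBUFFERS can evict everything,
-- then empty the buffer cache; the next query will perform physical reads.
CHECKPOINT;
DBCC DROPCLEANBUFFERS;
```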

... we were doing a lot of tests with large databases, also for development, shrinking them 10 times and 100 times. Try it out - you'll be surprised by the results. And then, once you're sure what to do, simply move to another machine (from 4 cores to 8 cores, from 8 cores to 16 cores; from 8 GB RAM to 12 GB to 24 GB to 32 GB) and you'll pretty surely be surprised again. And do not forget: switch from an Intel to an AMD platform and you'll see new variations.

So, if you simply dive into a 2GB desktop machine and hope to tune problems found in a TB-sized object, your solutions are at best suspect and at worst counterproductive.


I highly agree with this last sentence. (But as written before - the question was about simulating memory pressure, so I suggest 512 MB is too much to achieve this.) We cannot tune a database here in the forum. :cool:
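For completeness, capping the instance memory for such a test is done with sp_configure (a sketch; 512 is just the value under discussion - adjust it in proportion to how far you shrank the database):

```sql
-- Enable advanced options so 'max server memory (MB)' becomes visible,
-- then cap the buffer pool to simulate memory pressure on a test box.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 512;
RECONFIGURE;
```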
Tom Thomson
One Orange Chip (25K reputation)
Group: General Forum Members
Points: 25753 Visits: 12494
I don't like any of the answers offered. The one I think comes nearest to being sensible is to increase the test DB size (or rather, not to make such a stupidly small test DB in the first place), but as has already been said, a much better solution is to use a full-size DB with a test server capable of handling it. Anyway, a stored proc that is handling a significant proportion of a terabyte of data under a shortage of memory is going to be doing a good deal more than 1 GB of IO, and there's probably no imaginable way that the same SP handling a similar proportion of 1 GB with 0.5 GB available is going to do anything like the same amount of IO. So reducing the memory available to SQL Server on the test machine to 0.5 GB is not going to give you a clue about the performance of the production system on the full-size DB, which is why I don't at all like the "correct" answer.

Tom

Hugo Kornelis
SSCoach (18K reputation)
Group: General Forum Members
Points: 18633 Visits: 12426
Tom, thanks for your comment. I mostly agree with what you say.

This was one of my first questions here, and not one I am still very proud of. The idea was to educate people about a possible use of the max server memory setting that I had come across while preparing a talk about query optimization, where I wanted to demonstrate cached vs non-cached IO without having to create an immense test database (because I was running out of free hard disk space).

But rereading the question and answer options now, I must admit that I somewhat disagree with myself. :-D
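A minimal version of that cached vs. non-cached IO demonstration might look like this (a sketch; dbo.BigTable is a placeholder name, and the buffer-cache flush belongs on test systems only):

```sql
-- Report reads and timings for each statement.
SET STATISTICS IO, TIME ON;

-- Cold cache: flush dirty pages, drop clean buffers, then measure physical reads.
CHECKPOINT;
DBCC DROPCLEANBUFFERS;
SELECT COUNT(*) FROM dbo.BigTable;

-- Warm cache: the same query now shows logical reads only.
SELECT COUNT(*) FROM dbo.BigTable;
```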


Hugo Kornelis, SQL Server MVP
Visit my SQL Server blog: http://sqlblog.com/blogs/hugo_kornelis

