• EdVassie (9/1/2008)


    I don't think that the question gives enough detail to say that limiting SQL Server memory to 512 MB is the right answer.

    Pretty sure you're right - and of course a 512 MB limit would probably be too much.

    Assuming you have at least 64 GB of RAM for a 1 TB database (usually more like 256 GB if the budget allows), shrinking the database by 1:1000 should also shrink the memory limit by 1:1000 ==> 64 MB to 256 MB could be a useful limit, e.g. set via sp_configure as sketched below.
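    A minimal sketch of setting such a limit on a test instance; the 128 MB value is just an example picked from that 64-256 MB range, not a recommendation:

    -- Limit the test instance's memory via 'max server memory (MB)'.
    -- It is an advanced option, so enable advanced options first.
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'max server memory (MB)', 128;  -- example value from the 64-256 MB range above
    RECONFIGURE;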

    If you are trying to tune a query against a TB-sized object you must ensure your test environment can recreate the conditions experienced by your full-sized database. Queries against a 1 GB object will behave differently to queries against a 1TB object.

    Usually you get quite different query plans (depending on your query). And of course, if you don't have a single 1 TB data warehouse table, you will have some joins - sorry, but this question and the answers are misleading. (Even DBCC DROPCLEANBUFFERS only simulates hard disk access, BUT the storage subsystem on your desktop is still different from the one on the server.)
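    For reference, a minimal sketch of the usual cold-cache sequence on a test instance (CHECKPOINT first so that DROPCLEANBUFFERS really leaves nothing behind; FREEPROCCACHE is optional and clears plans instance-wide):

    -- Write dirty pages to disk so all buffer pool pages become "clean" ...
    CHECKPOINT;
    -- ... then drop the clean buffers, so the next run reads from disk again.
    DBCC DROPCLEANBUFFERS;
    -- Optional: clear cached plans to force fresh query plan compilation (instance-wide!).
    DBCC FREEPROCCACHE;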

    ... we did a lot of tests with large databases, also shrinking them 10 and 100 times for development. Try it out - you'll be surprised by the results. And then, if you're sure what to do, simply move to another machine (from 4 cores to 8 cores and from 8 cores to 16 cores, from 8 GB RAM to 12 GB to 24 GB to 32 GB) and you'll pretty surely be surprised again. And do not forget: switch from an Intel to an AMD platform and you'll see new behaviour yet again.

    So, if you simply dive into a 2GB desktop machine and hope to tune problems found in a TB-sized object, your solutions are at best suspect and at worst counterproductive.

    I highly agree with this last sentence. (But as to the other point written before: the question was about simulating memory pressure, so I suggest 512 MB is too much to achieve this. We cannot tune a database here in the forum :cool:)