• EdVassie (9/1/2008)


    I don't think that the question gives enough detail to say that limiting SQL Server memory to 512 MB is the right answer.

    (snip)

    So, if you simply dive into a 2GB desktop machine and hope to tune problems found in a TB-sized object, your solutions are at best suspect and at worst counter-productive.

    You are absolutely correct. My question was never intended to suggest the best way to performance-test a TB-sized database, but rather to ask how best to "simulate memory pressure". That's why the QotD is titled "simulating memory pressure".
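    For anyone who wants to try the technique the QotD describes, capping SQL Server's memory is done through the standard sp_configure procedure. A minimal sketch (assumes sysadmin rights; run from a query window, and note the value you started with so you can restore it):

        -- Enable advanced options so 'max server memory' is visible
        EXEC sys.sp_configure N'show advanced options', 1;
        RECONFIGURE;

        -- Cap the buffer pool at 512 MB to simulate memory pressure
        EXEC sys.sp_configure N'max server memory (MB)', 512;
        RECONFIGURE;

    When you're done testing, rerun the second pair of statements with your original value (the installation default is 2147483647, i.e. effectively unlimited).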

    If the wording of my question leads people to think that this is a "best practice" for performance testing, then I hope they'll read your response as well. Thanks for the clarification, and my apologies if I have caused any confusion.


    Hugo Kornelis, SQL Server/Data Platform MVP (2006-2016)
    Visit my SQL Server blog: https://sqlserverfast.com/blog/
    SQL Server Execution Plan Reference: https://sqlserverfast.com/epr/