Simulating memory pressure

  • Comments posted to this topic are about the item Simulating memory pressure


    Hugo Kornelis, SQL Server/Data Platform MVP (2006-2016)
    Visit my SQL Server blog: https://sqlserverfast.com/blog/
    SQL Server Execution Plan Reference: https://sqlserverfast.com/epr/

  • Nicely explained... the question needed such an explanation, as all the options would have worked for the scenario, but with some limitations and issues.

    But I am still confused about DROPCLEANBUFFERS / FREEPROCCACHE. Once executed, the SP / query will be placed in the cache (if the memory is available). This will still happen if the memory is 512 MB. Please correct me if I am wrong...

    Atif Sheikh

    ----------------------------------------------------------------------------------------------------------------------------------------------------------------------
    Sometimes, winning is not an issue but trying.
    You can check my BLOG here.

  • Atif Sheikh (9/1/2008)


    Nicely explained... the question needed such an explanation, as all the options would have worked for the scenario, but with some limitations and issues.

    But I am still confused about DROPCLEANBUFFERS / FREEPROCCACHE. Once executed, the SP / query will be placed in the cache (if the memory is available). This will still happen if the memory is 512 MB. Please correct me if I am wrong...

    Atif Sheikh

    Hi Atif,

    Thanks for the kind words! 🙂

    Regarding your question: on the real system, I would expect the most frequently used data pages (mainly root and intermediate-level index pages, and maybe the data of some smaller and more frequently used tables) to get into cache and remain there most of the time. Most data pages of the larger tables will get into cache when used, but will usually be swapped out before their next use.

    To simulate that as closely as possible, the test machine must also have a cache size that is smaller than the database size. If it's larger, the whole database will be in cache, and the amount of I/O will have much less impact on execution times.

    Running DBCC DROPCLEANBUFFERS and DBCC FREEPROCCACHE will force the cache empty once, but after that it will fill up again. And since the question involved a procedure with several steps, you should expect all but the first step to draw all data from cache only. Adding these DBCC commands in between all the steps in the procedure (which I did not add as an answer option, though I did consider it) would also not be a good simulation, since that would completely clean out the cache, whereas the real thing would keep the most frequently used (and last accessed) pages in cache in between the queries.
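
    For reference, a minimal T-SQL sketch of the memory cap discussed above, using the 512 MB figure from the question; this is intended for a test-only instance, and the setting should be reverted afterwards:

        -- Cap the buffer pool on the test instance so the 1 GB test database
        -- can no longer fit entirely in cache.
        EXEC sys.sp_configure 'show advanced options', 1;
        RECONFIGURE;
        EXEC sys.sp_configure 'max server memory (MB)', 512;
        RECONFIGURE;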

    I hope this removes your confusion, but please do tell me if you need more information!


    Hugo Kornelis, SQL Server/Data Platform MVP (2006-2016)
    Visit my SQL Server blog: https://sqlserverfast.com/blog/
    SQL Server Execution Plan Reference: https://sqlserverfast.com/epr/

  • Hm, I am not sure whether I am correct, but from the QOTD, it is not clear whether the several steps involve the same tables or different tables. So DROPCLEANBUFFERS might still be an option.

    Anyhow, my first assumption was that the QOTD author mistakenly assumed that DROPCLEANBUFFERS would be the solution, so I chose it instead of the correct answer.

    If I had known that Hugo is the author of the QOTD, I would have chosen the correct answer.

    So how about including the author in the QOTD description? (Not only in the forum part)?

    But still, very good question as well as explained answer!

    Best Regards,

    Chris Büttner

  • I don't think that the question gives enough detail to say that limiting SQL Server memory to 512 MB is the right answer.

    If you are trying to tune a query against a TB-sized object, you must ensure your test environment can recreate the conditions experienced by your full-sized database. Queries against a 1 GB object will behave differently from queries against a 1 TB object.

    For example, with a 1 GB object you may try using a table variable to hold some intermediate data and get better performance than using a #temp table. Scale up to even 100 GB and the table variable could be holding so much data that a properly indexed #temp table is vastly more performant. Likewise, many access plans that work well with a 1 GB object will be poor with a 1 TB object. You can easily live with half your data being pulled into a workfile with a 1 GB object, but copying 500 GB into tempdb will add a LOT of time to your query execution, and it may be better to formulate a query that avoids this.
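
    For illustration, a minimal T-SQL sketch of the two intermediate-storage options contrasted above (all object and column names are made up for the example):

        -- Table variable: no statistics and no secondary indexes;
        -- fine for a small intermediate result set.
        DECLARE @Stage TABLE (OrderID int PRIMARY KEY, CustomerID int);

        -- Temp table: supports statistics and additional indexes,
        -- which matters once the intermediate data grows to 100 GB+ scale.
        CREATE TABLE #Stage (OrderID int PRIMARY KEY, CustomerID int);
        CREATE INDEX IX_Stage_CustomerID ON #Stage (CustomerID);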

    So, if you simply dive into a 2GB desktop machine and hope to tune problems found in a TB-sized object, your solutions are at best suspect and at worst counter-productive.

    Original author: https://github.com/SQL-FineBuild/Common/wiki/ 1-click install and best practice configuration of SQL Server 2019, 2017, 2016, 2014, 2012, 2008 R2, 2008 and 2005.

    When I give food to the poor they call me a saint. When I ask why they are poor they call me a communist - Archbishop Hélder Câmara

  • My reading of the question is that you are trying to see what effect tweaking (or whatever) makes to your problem.

    If this is the case then DBCC DROPCLEANBUFFERS is surely essential (as many times as needed).

    Having the whole thing run "in memory" is actually an advantage, because you will get your results quicker and they will be completely independent of the performance of the caching sub-system (which is a complete irrelevance).
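
    A sketch of the kind of timed run described here, clearing the caches before each measurement (the procedure name is hypothetical):

        CHECKPOINT;                        -- write dirty pages so clean buffers can be dropped
        DBCC DROPCLEANBUFFERS;             -- start the run from a cold buffer pool
        DBCC FREEPROCCACHE;                -- also clear the plan cache, if recompiles should be included
        SET STATISTICS TIME ON;            -- report CPU and elapsed time per statement
        EXEC dbo.usp_StoredProcUnderTest;  -- hypothetical name for the procedure being baselined
        SET STATISTICS TIME OFF;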

  • Christian Buettner (9/1/2008)


    If I had known that Hugo is the author of the QOTD, I would have chosen the correct answer.

    So how about including the author in the QOTD description? (Not only in the forum part)?

    But still, very good question as well as explained answer!

    Well, if you have the daily emails sent to you, then the author is stated in the QOTD section in the email:

    Question of the Day

    Today's Question:

    You have been asked to optimize a stored procedure that runs against a terabyte-sized database. The stored procedure executes several steps consecutively. The performance problems appear to be mainly I/O related.

    You install a severely trimmed down test version of the database (1 GB in size) on your desktop computer running SQL Server Developer Edition. Before you start optimizing, you want to establish a baseline by timing the stored procedure on your development machine, so that you can later compare performance after adding indexes and tweaking code.

    However, your desktop has 2 GB of memory installed, and you are concerned that the performance test results may be skewed because the test version of the database fits entirely in cache. What is the best way to simulate the production circumstances as closely as possible?

    By Hugo Kornelis

    Also, if you go to the home page (http://www.sqlservercentral.com/) and take a look at the QOTD section on the left-hand side of the page, the author is stated there as well:

    Question of the Day

    Simulating memory pressure

    By Hugo Kornelis 2008/09/01 | Category: Performance

  • Hi skyline666!

    I have turned off the daily notification, so I don't get the author from there.

    And I usually use the link in the left menu, and therefore missed the QOTD showing on the right-hand side.

    Thanks for the hints; I will now use the right QOTD frame to figure out the author.

    (Unfortunately this works only for the current question, so for older QOTDs it would still be nice to show the author in the QOTD itself.)

    Best Regards,

    Chris Büttner

  • Christian Buettner (9/1/2008)


    Hm, I am not sure whether I am correct, but from the QOTD, it is not clear whether the several steps involve the same tables or different tables. So DROPCLEANBUFFERS might still be an option.

    Hi Christian,

    If it's known that the different steps (a) involve different tables, and (b) do not involve anything that might end up referring to the same tables either, then DROPCLEANBUFFERS is indeed more useful.

    If I had known that Hugo is the author of the QOTD, I would have chosen the correct answer.

    If this is the compliment it looks like, then I thank you for it.

    (And if it isn't, then I'll just choose to misunderstand you and perceive it as a compliment anyway :D)


    Hugo Kornelis, SQL Server/Data Platform MVP (2006-2016)
    Visit my SQL Server blog: https://sqlserverfast.com/blog/
    SQL Server Execution Plan Reference: https://sqlserverfast.com/epr/

  • EdVassie (9/1/2008)


    I don't think that the question gives enough detail to say that limiting SQL Server memory to 512 MB is the right answer.

    (snip)

    So, if you simply dive into a 2GB desktop machine and hope to tune problems found in a TB-sized object, your solutions are at best suspect and at worst counter-productive.

    You are absolutely correct. My question was never intended to ask for the best way to perform tests on a TB-sized DB, but rather for the best way to "simulate memory pressure". That's why the QotD is titled "simulating memory pressure".

    If the wording of my question causes people to think that this is a "best practice" for performance testing, then I hope they'll read your response as well. Thanks for the clarification, and my apologies if I have caused any confusion.


    Hugo Kornelis, SQL Server/Data Platform MVP (2006-2016)
    Visit my SQL Server blog: https://sqlserverfast.com/blog/
    SQL Server Execution Plan Reference: https://sqlserverfast.com/epr/

  • Thanks for the clarification 🙂

    Original author: https://github.com/SQL-FineBuild/Common/wiki/ 1-click install and best practice configuration of SQL Server 2019, 2017, 2016, 2014, 2012, 2008 R2, 2008 and 2005.

    When I give food to the poor they call me a saint. When I ask why they are poor they call me a communist - Archbishop Hélder Câmara

  • Hugo Kornelis (9/1/2008)


    If this is the compliment it looks like, then I thank you for it.

    Yes, it was a compliment (although a "passive" one ;-). You are welcome.

    Best Regards,

    Chris Büttner

  • According to Microsoft (link), SQL Server 2008 supports up to 8TB of RAM, and Windows Server 2003 SP1 on X64 currently supports 1TB of RAM.

    Two out-of-the-box solutions to your problem, both of which assume your server is x64 with 2 TB of RAM installed, your database is 1.0 TB (more specific than you posed in your question), and Windows Server 2003 SP1 is installed on your server...

    a. Remove half of the memory from your desktop.

    b. Wait for a version of Windows Server that supports more than 1TB of RAM and upgrade.

    🙂

  • What about reducing the total amount of memory available to Windows by changing boot.ini:

    "MAXMEM=

    Limits Windows to ignore (not use) physical memory beyond the amount indicated. The number is interpreted in megabytes. Example: /MAXMEM=32 would limit the system to using the first 32 MB of physical memory even if more were present."
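
    For illustration, the switch is appended to the OS entry in boot.ini; the ARC path and description string below are example values that will differ per machine:

        [operating systems]
        multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Windows Server 2003" /fastdetect /MAXMEM=512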

    Is SQL Server's memory for file caching totally independent of Windows? If not, Windows may still use free memory outside of SQL Server to cache pages from SQL Server's files, which would affect the final modelling result.

  • I learned a new thing. Thanks guys.
