SQLServerCentral is supported by Red Gate Software Ltd.
 

Simulating memory pressure
Posted Saturday, August 30, 2008 2:57 PM


SSCertifiable


Group: General Forum Members
Last Login: Yesterday @ 5:40 PM
Points: 5,930, Visits: 8,180
Comments posted to this topic are about the item Simulating memory pressure


Hugo Kornelis, SQL Server MVP
Visit my SQL Server blog: http://sqlblog.com/blogs/hugo_kornelis
Post #561662
Posted Monday, September 1, 2008 1:20 AM


Hall of Fame


Group: General Forum Members
Last Login: Friday, June 27, 2014 8:02 AM
Points: 3,241, Visits: 4,996
Nicely explained... the question needed such an explanation, as all the options would have worked for the scenario but with some limitations and issues.

But I am still confused about DROPCLEANBUFFERS / FREEPROCCACHE. The SP / query, once executed, will be placed in the cache (if the memory is available). This will still happen if the memory is 512 MB. Please correct me if I am wrong...

Atif Sheikh


----------------------------------------------------------------

Sometimes, winning is not an issue but trying.

You can check my BLOG here

Post #561876
Posted Monday, September 1, 2008 2:22 AM


SSCertifiable


Group: General Forum Members
Last Login: Yesterday @ 5:40 PM
Points: 5,930, Visits: 8,180
Atif Sheikh (9/1/2008)
Nicely explained... the question needed such an explanation, as all the options would have worked for the scenario but with some limitations and issues.

But I am still confused about DROPCLEANBUFFERS / FREEPROCCACHE. The SP / query, once executed, will be placed in the cache (if the memory is available). This will still happen if the memory is 512 MB. Please correct me if I am wrong...

Atif Sheikh


Hi Atif,

Thanks for the kind words! :)

Regarding your question. On the real system, I would expect the most frequently used data pages (mainly root and intermediate level index pages, and maybe the data of some smaller and more frequently used tables) to get into cache and remain there most of the time. Most data pages of the larger table will get into cache when used, but will usually be swapped out before their next use.

To simulate that as closely as possible, the test machine must also have a cache size that is smaller than the database size. If it's larger, the whole database will be in cache, and the amount of I/O will have much less impact on execution times.

Running DBCC DROPCLEANBUFFERS and DBCC FREEPROCCACHE will force the cache empty once, but after that it will fill up again. And since the question involved a procedure with several steps, you should expect all but the first step to draw all data from cache only. And adding these DBCC commands in between all the steps in the procedure (that I did not add as an answer option, though I did consider it) would also not be a good simulation, since that would completely clean out the cache, whereas the real thing would keep the most frequently used (and the last accessed) pages in cache in between the queries.
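For readers who want to try the two approaches Hugo contrasts here, a minimal T-SQL sketch follows. The 512 MB figure comes from the question itself; note that `sp_configure` changes affect the whole instance, so run this on a test box only.

```sql
-- Approach 1 (discussed above): empty the caches once.
-- The effect is temporary -- the caches simply refill as the procedure runs.
CHECKPOINT;               -- write dirty pages to disk so they become clean
DBCC DROPCLEANBUFFERS;    -- remove clean pages from the buffer pool
DBCC FREEPROCCACHE;       -- clear the plan cache

-- Approach 2 (the answer to the question): cap the buffer pool below the
-- database size, so the test machine experiences real memory pressure.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 512;
RECONFIGURE;
```

Remember to restore `max server memory` to its original value after the test run.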

I hope this removes your confusion, but please do tell me if you need more information!



Hugo Kornelis, SQL Server MVP
Visit my SQL Server blog: http://sqlblog.com/blogs/hugo_kornelis
Post #561889
Posted Monday, September 1, 2008 2:45 AM
SSCrazy


Group: General Forum Members
Last Login: Tuesday, July 22, 2014 1:45 AM
Points: 2,826, Visits: 3,866
Hm, I am not sure whether I am correct, but from the QOTD, it is not clear whether the several steps involve the same tables or different tables. So DROPCLEANBUFFERS might still be an option.

Anyhow, my first assumption was that the QOTD author mistakenly assumed that DROPCLEANBUFFERS would be the solution, so I chose it instead of the correct answer.

If I had known that Hugo is the author of the QOTD, I would have chosen the correct answer.
So how about including the author in the QOTD description? (Not only in the forum part)?

But still, very good question as well as explained answer!


Best Regards,
Chris Büttner
Post #561895
Posted Monday, September 1, 2008 3:17 AM
SSCrazy


Group: General Forum Members
Last Login: 2 days ago @ 5:34 AM
Points: 2,859, Visits: 3,187
I don't think that the question gives enough detail to say that limiting SQL Server memory to 512 MB is the right answer.

If you are trying to tune a query against a TB-sized object, you must ensure your test environment can recreate the conditions experienced by your full-sized database. Queries against a 1 GB object will behave differently to queries against a 1 TB object.

For example, with a 1 GB object you may try using a table variable to hold some intermediate data and get better performance than using a #temp table. Scale up to even 100 GB and the table variable could be holding so much data that a properly indexed #temp table is vastly more performant. Likewise, many access plans that work well with a 1 GB object will be poor with a 1 TB object. You can easily live with half your data being pulled into a workfile with a 1 GB object, but copying 500 GB into tempdb will add a LOT of time to your query execution, and it may be better to formulate a query that avoids this.
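To illustrate the table-variable point above (the table and column names here are made up for the example): the two versions differ only slightly in syntax, but the #temp table gets statistics and can carry extra indexes, which matters once the intermediate set grows to millions of rows.

```sql
-- Table variable: no statistics, so the optimizer tends to assume
-- a very small row count, which can produce poor plans at scale.
DECLARE @work TABLE (id INT PRIMARY KEY, amount MONEY);

-- #Temp table: has statistics and supports additional indexes,
-- so plans stay reasonable as the intermediate data grows.
CREATE TABLE #work (id INT PRIMARY KEY, amount MONEY);
CREATE INDEX IX_work_amount ON #work (amount);
```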

So, if you simply dive into a 2GB desktop machine and hope to tune problems found in a TB-sized object, your solutions are at best suspect and at worst counter-productive.



Original author: SQL Server FineBuild 1-click install and best practice configuration of SQL Server 2014, 2012, 2008 R2, 2008 and 2005. 28 July 2014: now over 30,000 downloads.
Disclaimer: All information provided is a personal opinion that may not match reality.
Concept: "Pizza Apartheid" - the discrimination that separates those who earn enough in one day to buy a pizza if they want one, from those who can not.
Post #561909
Posted Monday, September 1, 2008 3:30 AM
SSC Rookie


Group: General Forum Members
Last Login: Friday, April 10, 2009 12:04 PM
Points: 27, Visits: 90
My reading of the question is that you are trying to see what effect tweaking (or whatever) makes to your problem.
If this is the case, then DBCC DROPCLEANBUFFERS is surely essential (as many times as needed).
Having the whole thing run "in memory" is actually an advantage, because you will get your results quicker and they will be completely independent of the performance of the caching sub-system (which is a complete irrelevance).
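For anyone taking this repeated-flush approach, a single timed run might look like the sketch below. Note that DBCC DROPCLEANBUFFERS only removes *clean* pages, so a CHECKPOINT is usually issued first; the procedure name is a placeholder.

```sql
CHECKPOINT;               -- flush dirty pages so DROPCLEANBUFFERS can evict them
DBCC DROPCLEANBUFFERS;    -- start the run from a cold buffer pool
SET STATISTICS TIME ON;   -- report CPU and elapsed time per statement
EXEC dbo.MyProcedure;     -- placeholder for the procedure under test
SET STATISTICS TIME OFF;
```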
Post #561914
Posted Monday, September 1, 2008 4:15 AM
Old Hand


Group: General Forum Members
Last Login: Wednesday, September 24, 2008 1:27 PM
Points: 350, Visits: 229
Christian Buettner (9/1/2008)

If I had known that Hugo is the author of the QOTD, I would have chosen the correct answer.
So how about including the author in the QOTD description? (Not only in the forum part)?

But still, very good question as well as explained answer!


Well, if you have the daily emails sent to you, then the author is stated in the QOTD section in the email:

Question of the Day
Today's Question:


You have been asked to optimize a stored procedure that runs against a terabyte-sized database. The stored procedure executes several steps consecutively. The performance problems appear to be mainly I/O related.

You install a severely trimmed down test version of the database (1 GB in size) on your desktop computer running SQL Server Developer Edition. Before you start optimizing, you want to establish a baseline by timing the stored procedure on your development machine, so that you can later compare performance after adding indexes and tweaking code.

However, your desktop has 2 GB of memory installed, and you are concerned that the performance test results may be skewed because the test version of the database fits entirely in cache. What is the best way to simulate the production circumstances as closely as possible?

By Hugo Kornelis



Also, if you go to the home page (http://www.sqlservercentral.com/) and take a look at the QOTD section of the left hand side of the page, the author is stated there as well:

Question of the Day
Simulating memory pressure
By Hugo Kornelis 2008/09/01 | Category: Performance
Post #561927
Posted Monday, September 1, 2008 4:33 AM
SSCrazy


Group: General Forum Members
Last Login: Tuesday, July 22, 2014 1:45 AM
Points: 2,826, Visits: 3,866
Hi skyline666!

I have turned off the daily notification, therefore I don't get the author from there.
And I usually use the link on the left menu and therefore missed the QOTD showing on the right hand side.

Thanks for the hints, I will now use the right QOTD frame to figure out the author.

(Unfortunately this works only for the current question, so for older QOTDs, it still would be nice to show the author in the QOTD itself).


Best Regards,
Chris Büttner
Post #561934
Posted Monday, September 1, 2008 6:34 AM


SSCertifiable


Group: General Forum Members
Last Login: Yesterday @ 5:40 PM
Points: 5,930, Visits: 8,180
Christian Buettner (9/1/2008)
Hm, I am not sure whether I am correct, but from the QOTD, it is not clear whether the several steps involve the same tables or different tables. So DROPCLEANBUFFERS might still be an option.


Hi Christian,

If it's known that the different steps (a) involve different tables, and (b) do not involve anything that might end up referring to the same tables either, then DROPCLEANBUFFERS is indeed more useful.

If I had known that Hugo is the author of the QOTD, I would have chosen the correct answer.


If this is the compliment it looks like, then I thank you for it.
(And if it isn't, then I'll just choose to misunderstand you and perceive it as a compliment anyway :D)



Hugo Kornelis, SQL Server MVP
Visit my SQL Server blog: http://sqlblog.com/blogs/hugo_kornelis
Post #561976
Posted Monday, September 1, 2008 6:37 AM


SSCertifiable


Group: General Forum Members
Last Login: Yesterday @ 5:40 PM
Points: 5,930, Visits: 8,180
EdVassie (9/1/2008)
I don't think that the question gives enough detail to say that limiting SQL Server memory to 512 MB is the right answer.

(snip)

So, if you simply dive into a 2GB desktop machine and hope to tune problems found in a TB-sized object, your solutions are at best suspect and at worst counter-productive.


You are absolutely correct. My question was never intended to ask for the best way to perform tests on a TB-sized DB, but rather for the best way to "simulate memory pressure". That's why the QotD is titled "Simulating memory pressure".

If the wording of my question causes people to think that this is a "best practice" for performance testing, then I hope they'll read your response as well. Thanks for the clarification, and my apologies if I have caused any confusion.



Hugo Kornelis, SQL Server MVP
Visit my SQL Server blog: http://sqlblog.com/blogs/hugo_kornelis
Post #561979