Simulating memory pressure
Posted Monday, September 1, 2008 7:08 AM
SSCrazy

Group: General Forum Members
Last Login: Today @ 4:14 AM
Points: 2,868, Visits: 3,215
Thanks for the clarification :)

Original author: SQL Server FineBuild 1-click install and best practice configuration of SQL Server 2014, 2012, 2008 R2, 2008 and 2005. 28 July 2014: now over 30,000 downloads.
Disclaimer: All information provided is a personal opinion that may not match reality.
Concept: "Pizza Apartheid" - the discrimination that separates those who earn enough in one day to buy a pizza if they want one, from those who can not.
Post #561990
Posted Monday, September 1, 2008 7:14 AM
SSCrazy

Group: General Forum Members
Last Login: Friday, August 22, 2014 12:47 AM
Points: 2,840, Visits: 3,872
Hugo Kornelis (9/1/2008)

If this is the compliment it looks like, then I thank you for it.

Yes, it was a compliment (although a "passive" one). You are welcome.


Best Regards,
Chris Büttner
Post #561992
Posted Monday, September 1, 2008 12:06 PM
Grasshopper

Group: General Forum Members
Last Login: Thursday, May 8, 2014 4:18 PM
Points: 17, Visits: 52
According to Microsoft (link), SQL Server 2008 supports up to 8TB of RAM, and Windows Server 2003 SP1 on X64 currently supports 1TB of RAM.

Two out-of-the-box solutions to your problem, both of which assume your server is x64 with 2TB of RAM installed, your database is 1.0TB (more specific than you posed in your question), and Windows Server 2003 SP1 is installed on your server...

a. Remove half the memory from your server.
b. Wait for a version of Windows Server that supports more than 1TB of RAM and upgrade.

:)
Post #562074
Posted Monday, September 1, 2008 7:51 PM
SSC Rookie

Group: General Forum Members
Last Login: Wednesday, June 23, 2010 7:48 PM
Points: 38, Visits: 183
What about reducing the total memory available to Windows by changing boot.ini:

"MAXMEM=
Limits Windows to ignore (not use) physical memory beyond the amount indicated. The number is interpreted in megabytes. Example: /MAXMEM=32 would limit the system to using the first 32 MB of physical memory even if more were present."
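
For example, a minimal boot.ini sketch (the ARC path and entry name are machine-specific placeholders; /MAXMEM=2048 is an illustrative 2 GB cap, not a value from this thread; a reboot is required for the change to take effect):

[boot loader]
timeout=30
default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS

[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Windows Server 2003" /fastdetect /MAXMEM=2048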

Is SQL Server's buffer cache totally independent of the Windows file cache? If not, Windows may still use free memory outside SQL Server to cache SQL Server's pages, which would affect the final modelling result.
Post #562155
Posted Tuesday, September 2, 2008 9:07 AM
SSCertifiable

Group: General Forum Members
Last Login: Friday, August 29, 2014 4:30 AM
Points: 5,345, Visits: 1,388
I learned a new thing. Thanks guys.


Post #562448
Posted Tuesday, September 2, 2008 9:23 AM
Grasshopper

Group: General Forum Members
Last Login: Tuesday, February 25, 2014 12:24 PM
Points: 19, Visits: 247
"Creating a bigger test database will also work, but involves a lot more work, will cause your tests to run longer, and might require you to clean up your collection of holiday pictures."

Personally, I don't care for any of the possible answers.

I've never had to cut a 1TB database down to 1GB for testing, and why would that be any more work than cutting it down to 5GB? I wouldn't recommend keeping so many personal pictures on your company's computer, and I also wouldn't recommend doing this kind of performance testing on a laptop or desktop.

You want your testing environment to match your production environment as closely as possible. Move ALL the data back down to your dev or test server and rip into it. If you don't have the room for it, there is a good chance you don't have a disaster recovery plan in place either.

David
Post #562463
Posted Thursday, September 4, 2008 2:26 AM
SSC Rookie

Group: General Forum Members
Last Login: Wednesday, September 10, 2008 7:34 AM
Points: 38, Visits: 41
I think this is more or less the outcome of the discussion so far: a combination of limiting the server memory and running DBCC FREEPROCCACHE before executing the procedure would be the way forward, covering both the "memory pressure" and the "fair comparison" elements of the question.

Note that a CHECKPOINT immediately before the performance test is also recommended, so that dirty buffers are written to disk as well.
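
A minimal T-SQL sketch of that combination, assuming "limiting the server memory" is done via sp_configure 'max server memory' (the 512 MB cap and the procedure name dbo.usp_ProcedureUnderTest are illustrative, not from the thread):

-- Cap the buffer pool so the small test database cannot fit entirely in cache.
-- 'show advanced options' must be enabled before 'max server memory' can be set.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 512;   -- illustrative cap
RECONFIGURE;
GO

-- Write dirty buffers to disk, then clear cached plans for a fair comparison.
CHECKPOINT;
DBCC FREEPROCCACHE;
GO

EXEC dbo.usp_ProcedureUnderTest;   -- hypothetical name for the procedure under test
GO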
Post #563621
Posted Thursday, September 4, 2008 9:13 PM
SSC Rookie

Group: General Forum Members
Last Login: Wednesday, June 23, 2010 7:48 PM
Points: 38, Visits: 183
Maybe use all of these:

1) CHECKPOINT
2) DBCC DROPCLEANBUFFERS
3) DBCC FREEPROCCACHE
4) DBCC FREESYSTEMCACHE('ALL')

(Step 4 is probably equivalent to steps 2 and 3 combined.)

Running the tests against the whole terabyte database may be very time-consuming even when you have the whole database available, so this should be done after a first pass against the reduced-size database.
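
As a runnable batch, with a comment on what each step clears (a sketch only; note that DBCC DROPCLEANBUFFERS evicts only clean pages, which is why the CHECKPOINT comes first):

-- 1) Flush dirty pages to disk so the next step can evict them too.
CHECKPOINT;
GO
-- 2) Empty the buffer pool of clean pages.
DBCC DROPCLEANBUFFERS;
GO
-- 3) Discard all cached execution plans.
DBCC FREEPROCCACHE;
GO
-- 4) Clear the remaining system caches (the poster suggests this overlaps with steps 2 and 3).
DBCC FREESYSTEMCACHE('ALL');
GO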
Post #564299
Posted Wednesday, September 24, 2008 7:22 AM
SSC-Enthusiastic

Group: General Forum Members
Last Login: Tuesday, September 14, 2010 1:10 PM
Points: 109, Visits: 79
About the question: Simulating memory pressure

You have been asked to optimize a stored procedure that runs against a terabyte-sized database. The stored procedure executes several steps consecutively. The performance problems appear to be mainly I/O related.

You install a severely trimmed down test version of the database (1 GB in size) on your desktop computer running SQL Server Developer Edition. Before you start optimizing, you want to establish a baseline by timing the stored procedure on your development machine, so that you can later compare performance after adding indexes and tweaking code.

However, your desktop has 2 GB of memory installed, and you are concerned that the performance test results may be skewed because the test version of the database fits entirely in cache. What is the best way to simulate the production circumstances as closely as possible?

I don't understand this; could someone explain it to me? Thank you. This question is very hard for me! I'm a novice in SQL Server.
Post #575168
Posted Wednesday, September 24, 2008 7:40 AM
SSCrazy

Group: General Forum Members
Last Login: Today @ 4:14 AM
Points: 2,868, Visits: 3,215
lucassouzace,

You should read the previous posts in this thread. A number of people thought the 'correct' answer was wrong or the question did not have a single answer.

I am happy to accept that the 'correct' answer will simulate memory pressure, but I am also certain the 'correct' answer is not the right way to troubleshoot a problem with a 1TB table.


Original author: SQL Server FineBuild 1-click install and best practice configuration of SQL Server 2014, 2012, 2008 R2, 2008 and 2005. 28 July 2014: now over 30,000 downloads.
Disclaimer: All information provided is a personal opinion that may not match reality.
Concept: "Pizza Apartheid" - the discrimination that separates those who earn enough in one day to buy a pizza if they want one, from those who can not.
Post #575191