High Memory Usage

  • In our production server, memory usage is around 95% even though it has been capped at 8 GB. Total memory is 12 GB.

    When I run a script, I get a database, let's say DB 1, which has the highest cached_pages_count (574,198).

    And when I run another script, I get a table, let's say Table 1, which has the highest cached_pages_count (514,201).

    I really have no idea what to do with that table.

    Should I just lower the memory setting?

    Cheers
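    For reference, the kind of script described above can be sketched against sys.dm_os_buffer_descriptors (this is an assumed reconstruction, not the original script):

    ```sql
    -- One row per 8 KB page currently in the buffer pool,
    -- aggregated per database; divide pages by 128 for MB.
    SELECT DB_NAME(database_id) AS database_name,
           COUNT(*)             AS cached_pages_count,
           COUNT(*) / 128       AS cached_mb
    FROM sys.dm_os_buffer_descriptors
    GROUP BY database_id
    ORDER BY cached_pages_count DESC;
    -- A per-table variant joins allocation_unit_id to
    -- sys.allocation_units and sys.partitions within one database.
    ```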

  • No... 12GB is too little for a database server.  Buy some more memory for it.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Jeff Moden - Thursday, February 2, 2017 9:54 PM

    No... 12GB is too little for a database server.  Buy some more memory for it.

    Thanks for the reply!
    Hmm, how do I measure the amount of physical memory that the server needs?

  • If you sum up the "Size" column of sys.master_files for all of your MDF/NDF files and divide by 128, what do you get for a result?
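    Spelled out as a query, and noting that the "size" column is stored in 8 KB pages (so dividing by 128 converts to MB), that calculation might look like this sketch:

    ```sql
    -- Total size of the data files (MDF/NDF) registered in this instance.
    -- size is in 8 KB pages: /128.0 -> MB, a further /1024.0 -> GB.
    SELECT SUM(size) / 128.0          AS total_data_mb,
           SUM(size) / 128.0 / 1024.0 AS total_data_gb
    FROM sys.master_files
    WHERE type_desc = 'ROWS';   -- data files only, excludes log files
    ```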

    --Jeff Moden



  • WhiteLotus - Thursday, February 2, 2017 8:38 PM

    In our production server, memory usage is around 95% even though it has been capped at 8 GB. Total memory is 12 GB.

    For some context, my laptop has 16 GB of memory and my development desktop has 32 GB.
    I set up a new server for a client yesterday with 512 GB of memory.

    My laptop should not have more memory than your production server. Upgrade that thing, and not to the minimum you think you can get away with. SQL Server loves memory; it uses it to cache data and plans, to avoid the cost of fetching/regenerating them.

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • Thanks for the valuable input, guys 🙂 I will upgrade it!

  • Jeff Moden - Friday, February 3, 2017 12:25 AM

    If you sum up the "Size" column of sys.master_files for all of your MDF/NDF files and divide by 128, what do you get for a result?

    I get 84641.9375. What does that mean?

  • WhiteLotus - Monday, February 6, 2017 3:54 PM

    I get 84641.9375. What does that mean?

    It means you have roughly 83 GB of data files, so you need a SQL Server with at least 100 GB of RAM in order to be able to cache all your data.  🙂  If you want to avoid memory pressure, and you want to avoid waiting for data to be cached into memory, then you need a server with enough memory to hold all your SQL Server data plus the SQL Server overhead, plus the OS overhead.

    Thomas Rushton
    blog: https://thelonedba.wordpress.com

  • ThomasRushton - Tuesday, February 7, 2017 4:57 AM

    It means you have roughly 83 GB of data files, so you need a SQL Server with at least 100 GB of RAM in order to be able to cache all your data.  🙂  If you want to avoid memory pressure, and you want to avoid waiting for data to be cached into memory, then you need a server with enough memory to hold all your SQL Server data plus the SQL Server overhead, plus the OS overhead.

    'zactly. πŸ˜‰  It certainly means that 12GB isn't going to hack it.

    --Jeff Moden



  • You might also want to check the memory limitations for your edition and version before you start throwing too much memory at the problem:
    https://msdn.microsoft.com/en-us/library/ms143685(v=sql.105).aspx

    Although obviously any increase over 12 GB isn't going to hurt...
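    A quick way to see which limits apply to your instance is to check its edition and version (a minimal sketch):

    ```sql
    -- Edition determines the memory cap (e.g. Standard and Express
    -- editions limit buffer pool memory; Enterprise is bounded by the OS).
    SELECT SERVERPROPERTY('Edition')        AS edition,
           SERVERPROPERTY('ProductVersion') AS product_version;
    ```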

  • ThomasRushton - Tuesday, February 7, 2017 4:57 AM

    If you want to avoid memory pressures, and you want to avoid waiting for data to be cached into memory, then you need a server with enough memory to hold all your SQL Server data plus the SQL Server overhead, plus the OS overhead.

    I'm going to disagree there. While the entire DB is 86 GB, we don't know how much of that is hot data, how much is cold data that's never touched, how much is used during business hours, how much is only used overnight, etc.

    12 is almost certainly not enough (my laptop has 16), but 100GB may be far too much.
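    One way to gauge whether the working set actually fits in memory is to watch Page Life Expectancy over time (a sketch; healthy thresholds depend on the workload):

    ```sql
    -- Seconds an unreferenced page stays in the buffer pool.
    -- Persistently low or steadily falling values suggest memory pressure.
    SELECT object_name, counter_name, cntr_value AS ple_seconds
    FROM sys.dm_os_performance_counters
    WHERE counter_name = 'Page life expectancy'
      AND object_name LIKE '%Buffer Manager%';
    ```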

    Gail Shaw
  • GilaMonster - Tuesday, February 7, 2017 10:52 AM

    I'm going to disagree there. While the entire DB is 86 GB, we don't know how much of that is hot data, how much is cold data that's never touched, how much is used during business hours, how much is only used overnight, etc.

    12 is almost certainly not enough (my laptop has 16), but 100GB may be far too much.

    How much do you think, Gail?

  • WhiteLotus - Tuesday, February 7, 2017 5:07 PM

    How much do you think, Gail?

    More than 12.
    I have no metrics or stats that would guide me to a better value.
    Buy what your company can afford, plan for future growth and the more the better within reason.

    Gail Shaw
  • GilaMonster - Tuesday, February 7, 2017 11:59 PM

    More than 12.
    I have no metrics or stats that would guide me to a better value.
    Buy what your company can afford, plan for future growth and the more the better within reason.

    Apparently only 2 or 3 GB would be added.

  • WhiteLotus - Wednesday, February 8, 2017 8:12 PM

    Apparently only 2 or 3 GB would be added

    So do you want my laptop to run your production DB? It's got more memory. Hell, my gaming and development desktop is running 32GB.

    These days, I wouldn't go for less than 32GB for a new production DB server, mostly because memory is so incredibly cheap that there's no reason to skimp.

    Gail Shaw
