High Page File Usage

  • Brahmanand Shukla

    Right there with Babe

    Points: 778

Page file usage on one of our production database servers is continuously up to more than 200% of the total available memory. Is it a signal to add more physical memory?

  • Jeff Moden

    SSC Guru

    Points: 995976

    Yes, partially (although it almost never hurts to do so).  It may also be an indication that you have code to fix that might still blow through whatever memory you could possibly add to the machine.

    Until you know what's causing the page-file usage, you're really just guessing and will likely end up being frustrated with proverbial "shots in the dark" to fix it.  For example, it could just be that you've over-allocated the amount of memory that SQL Server is allowed to use and have simply starved the operating system.  It could also be that someone has some other code running on the box that you don't know about.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column.
    "If you think its expensive to hire a professional to do the job, wait until you hire an amateur."--Red Adair
    "Change is inevitable... change for the better is not."

    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Jonathan AC Roberts

    SSCoach

    Points: 17204

It could be that you are not limiting the maximum amount of memory available to SQL Server.

If you right-click on the server in the SSMS Object Explorer and select Properties, then on the Memory page make sure that you haven't assigned too high a value to the "Maximum server memory (in MB)" option.

If there is nothing other than SQL Server running on the machine, then you should allocate the total memory on the machine minus about 4 GB, to leave enough memory for the operating system. So, for example, if you have 32 GB of memory on the server, you should allow SQL Server to use a maximum of about 28,000 MB.
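The same setting can be checked and changed from T-SQL; a minimal sketch, assuming sysadmin rights and the example 28,000 MB figure above:

```sql
-- Show the current "max server memory" setting (value_in_use is the running value).
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)';

-- Cap SQL Server at 28,000 MB on a 32 GB box, leaving roughly 4 GB for the OS.
EXEC sp_configure 'max server memory (MB)', 28000;
RECONFIGURE;
```

The change takes effect immediately, with no restart needed, though SQL Server may release memory gradually rather than all at once.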

  • Brahmanand Shukla

    Right there with Babe

    Points: 778

    Max Memory capping is applied on the instance.

    Total Available Memory : 32 GB

    Max Memory configured : 26 GB

    As far as I know, nothing additional running on the instance, except the SQL queries and SQL objects such as Stored Procedures, Functions, Views etc. It's a dedicated SQL Server instance.

I agree with Jeff that there could be some code which may be choking the memory. Unless we fix the code problem, we will end up adding additional memory without any gain. What do you recommend for identifying the top code that is consuming a lot of memory, and what can possibly be done to fix it?

Additionally, I need to justify the H/W upgrade to my infra team. Without valid justification, they will not allocate the additional H/W, including memory. Please see if you can suggest any article or anything else I can use to convince my infra team.

For now, the system is working properly, you could even say well. But I can't say it is performing wonderfully. I don't want to wait to act until the system starts performing badly or there is an absolute shutdown. Hence, as a proactive measure, I'm trying to understand this so that I can add it to my checklist of proactive actions to make sure the databases I'm responsible for are up and running smoothly at all times.

Thanks in advance for your time and help!

  • Jonathan AC Roberts

    SSCoach

    Points: 17204

    Have you tried running taskmgr or perfmon to see which processes are using a lot of memory?
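Alongside Task Manager and PerfMon, SQL Server's own DMVs report the process's memory from the inside; a quick sketch (requires VIEW SERVER STATE permission):

```sql
-- How much memory the SQL Server process itself is using.
SELECT  physical_memory_in_use_kb / 1024  AS physical_memory_in_use_mb,
        locked_page_allocations_kb / 1024 AS locked_pages_mb,  -- nonzero if Locked Pages in Memory is enabled
        page_fault_count,
        memory_utilization_percentage
FROM    sys.dm_os_process_memory;
```

Comparing these numbers with what Task Manager shows for sqlservr.exe can tell you whether the two tools are even measuring the same thing.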

  • Jeff Moden

    SSC Guru

    Points: 995976

    Brahmanand Shukla wrote:

    Max Memory capping is applied on the instance.

    Total Available Memory : 32 GB

    Max Memory configured : 26 GB

    Brahmanand Shukla wrote:

    As far as I know, nothing additional running on the instance, except the SQL queries and SQL objects such as Stored Procedures, Functions, Views etc. It's a dedicated SQL Server instance.

It can't be "as far as I know".  You have to know for sure.  When you say it's a "dedicated SQL Server instance", are you sure that it's one physical server, or are you actually sharing hardware (CPU, memory, C: drive) behind the scenes?  It's understood that you'll be sharing disk space on a SAN.

    Brahmanand Shukla wrote:

I agree with Jeff that there could be some code which may be choking the memory. Unless we fix the code problem, we will end up adding additional memory without any gain. What do you recommend for identifying the top code that is consuming a lot of memory, and what can possibly be done to fix it?

With the understanding that it shows only what is currently cached, you can right-click on the instance name in the Object Explorer window and follow your nose to the reports for highest CPU and I/O.  They will provide you with the actual queries and how many times they've executed.  A lot of the time, your biggest problem ISN'T the longest running code.  Sometimes it's fairly short running code executing thousands of times.  I think you'll be most interested in the I/O, which is logical reads, which is memory.
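The same plan-cache information those reports draw on can be queried directly from the DMVs; a minimal sketch ordered by total logical reads (the sp_ShowWorst procedure posted later in the thread does this far more thoroughly):

```sql
-- Top 10 cached statements by total logical reads (a proxy for memory pressure).
SELECT TOP (10)
        qs.total_logical_reads,
        qs.execution_count,
        qs.total_worker_time,  -- CPU, in microseconds
        SUBSTRING(st.text,
                  qs.statement_start_offset / 2 + 1,
                  (CASE qs.statement_end_offset
                        WHEN -1 THEN DATALENGTH(st.text)
                        ELSE qs.statement_end_offset
                   END - qs.statement_start_offset) / 2 + 1) AS statement_text
FROM    sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) st
ORDER BY qs.total_logical_reads DESC;
```

Swapping the sort column for total_worker_time, total_elapsed_time, or execution_count gives the other views of "worst" mentioned above.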

    Brahmanand Shukla wrote:

Additionally, I need to justify the H/W upgrade to my infra team. Without valid justification, they will not allocate the additional H/W, including memory. Please see if you can suggest any article or anything else I can use to convince my infra team.

    Until you can prove what's using memory and what your workload is (using the above is a decent start), your hardware team probably won't budge on additional memory.  You also need to do as Jonathan suggested and actually look at Task Manager for the box to see what's using a lot of memory.  I expect that SQL Server will always be at the top of that list but there might be something else using a wad of memory that may be requiring trips to the swap file.

    Brahmanand Shukla wrote:

For now, the system is working properly, you could even say well. But I can't say it is performing wonderfully. I don't want to wait to act until the system starts performing badly or there is an absolute shutdown. Hence, as a proactive measure, I'm trying to understand this so that I can add it to my checklist of proactive actions to make sure the databases I'm responsible for are up and running smoothly at all times.

Using the built-in reporting can become a chore for doing such a thing.  I've attached some code that I use to get away from that.  It's built on the same basis and has been a huge help in our being proactive in searching for and finding the worst code that we needed to repair.  That's why it's called "sp_ShowWorst".  Instructions are included in the flower box in the code.

    The "JeffModen03.txt" file is actually an msc file that I add to the desktop of my servers (I don't have many).  You need to rename it from txt to msc for the extension.  Once you copy/rename it to the desktop of the server (in other words, you've RDP'd into the server), just double click on it and it'll start the same PerfMon session that I use to see what's going on with the server at a high level.  It'll at least give you and idea of what to look for.  A lot of the "lines" have been disabled because I use those only for special purpose checks.  The lines that are currently enabled usually provide the adequate hints as to what to use sp_ShowWorst for.

sp_ShowWorst won't find it all (and, don't forget, it only shows problems that may exist according to what is in cache).  You also need to find out what is recompiling the most.  No sense in me providing the code I use because I got it from the following link.  It helped me find something that ran tens of thousands of times per day (100 ms) but took 2-22 seconds to recompile, AND it recompiled almost EVERY time it was used.

    https://www.sqlskills.com/blogs/jonathan/identifying-high-compile-time-statements-from-the-plan-cache/

Again, I don't know what the workload on your machine is but, for typical workloads nowadays, it seems a wee bit under-powered to me.  Memory is probably the most important thing to add to a server and is usually one of the least expensive things to add.  In most cases (again, gotta check the workload), I won't even bother working on a production server that doesn't have at least 128GB of RAM.  Shoot... my laptop has 32GB of RAM and 2TB of SSDs.


    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column.
    "If you think its expensive to hire a professional to do the job, wait until you hire an amateur."--Red Adair
    "Change is inevitable... change for the better is not."

    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Jeff Moden

    SSC Guru

    Points: 995976

    Good lord!!!  A SQL forum that doesn't allow you to upload .SQL or TXT files!  This is just silly.

    Here's sp_ShowWorst... I've got to figure out how to do the other file.  <major facepalm><headdesk>

    USE [master]
    GO
CREATE OR ALTER PROCEDURE [dbo].[sp_ShowWorst] --CREATE OR ALTER needs SQL Server 2016 SP1+; on older versions, use CREATE the first time and ALTER thereafter.
    /**********************************************************************************************************************
    Purpose:
This stored procedure returns the "worst" performing queries in descending order by "CPU" time (worker time), logical
    "Reads", total "Duration", or most used (execution "Count") for whatever is currently cached.

    Programmer's Notes:
    1. This stored procedure should live in the Master database.
2. After building the stored procedure, you need to run the following code to convert it into a "system" stored
procedure that can be executed from any database.

    USE MASTER;
    EXEC sp_ms_marksystemobject 'sp_ShowWorst';
    SELECT name, is_ms_shipped
    FROM sys.objects
    WHERE name = 'sp_ShowWorst'
    ;
    -----------------------------------------------------------------------------------------------------------------------
    Usage:
sp_ShowWorst --(from any database, defaults to "Reads" and "10")
    ... OR ...
EXEC sp_ShowWorst --(from any database, defaults to "Reads" and "10")

sp_ShowWorst [@pSortType][,@pRows][,@pDebug] --All parameters are optional. Logical "Reads" and "10" will be presumed.

    Valid values for @pSortType:
    Reads - Logical Reads (current default) since last compile date
CPU - Total worker time in microseconds since last compile date
Duration - Total elapsed time in microseconds since last compile date
    Count - Total number of executions since last compile date
    Physical - Total number of physical reads since last compile date
    Recent - Adding the word "Recent" to any of the above limits the output to only those things that have executed in
    the last 60 minutes.

    Valid values for @pRows:
    Any positive integer (current default is 10)

    Valid values for @pDebug
    0 - Debug is off (default) and the code will be executed
1 - Debug is on and the code will be displayed instead of being executed.
    -----------------------------------------------------------------------------------------------------------------------
    Example Usage:
    --===== Example usage for things that have run in the last hour
    -- regardless of when they were compiled.
    EXEC sp_ShowWorst 'Recent CPU' , 10;
    EXEC sp_ShowWorst 'Recent Reads' , 10;
    EXEC sp_ShowWorst 'Recent Duration' , 10;
    EXEC sp_ShowWorst 'Recent Writes' , 10;
    EXEC sp_ShowWorst 'Recent Physical' , 10;
    EXEC sp_ShowWorst 'Recent Count' , 10;

    --===== Example usage for things since they were last compiled.
    EXEC sp_ShowWorst 'CPU' , 10;
    EXEC sp_ShowWorst 'Reads' , 10;
    EXEC sp_ShowWorst 'Duration', 10;
    EXEC sp_ShowWorst 'Writes' , 10;
    EXEC sp_ShowWorst 'Physical', 10;
    EXEC sp_ShowWorst 'Count' , 10;
    -----------------------------------------------------------------------------------------------------------------------
    Notes:
    1. Remember that values returned are as of the creation date and time (creation_time column) of the execution plan
    whether the RECENT option is used or not. It IS possible for a plan to never be flushed from cache and the
    cumulative usage may skew the results. This frequently occurs over weekends.
    2. Note to self... (TODO) Find a way to overcome the problem stated in the note above.
    3. Note to self... (TODO) Find out why SQL Server sometimes doesn't return an execution plan. It's understood that
    "Optimize for Ad Hoc Queries" can cause this for singleton queries, but not for queries
    that are executed multiple times with the same plan handle.
    -----------------------------------------------------------------------------------------------------------------------
    Revision History:
    Rev 00 - 29 Feb 2013 - Jeff Moden
    - Formalize a script for regular usage.
    Rev 01 - 25 Feb 2015 - Jeff Moden
    - Add object name and ID information and make data collection for different types a lot easier by adding
    "Sample*" information columns.
    Rev 02 - 04 Jul 2016 - Jeff Moden
    - 2.1 Added code and columns to determine the currently active database when ad hoc queries are executed.
    - 2.2 Added the "Exec" prefix to columns that identify the object that was actual executed. Note that these
    columns are normmally NULL for ad hoc queries via ORMs, applications, and SSMS.
    **********************************************************************************************************************/
    --===== Define the I/O for this proc
    @pSortType VARCHAR(30) = 'RECENT Reads',
    @pRows INT = '10',
    @pDebug BIT = 0
    AS

    --===== Environmental Presets
    SET NOCOUNT ON;
    SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;

    --===== Declare some obviously named variable(s)
    DECLARE @SQL VARCHAR(MAX)
    ;
    --===== Create the SQL to get the data in the order requested
    SELECT @SQL = REPLACE(REPLACE(REPLACE(REPLACE('
    SELECT TOP (<<@pRows>>) --sp_ShowWorst
    SampleDT = GETDATE() --Rev 01
    ,SampleType = ''<<@pSortType>>'' --Rev 01
    ,SampleValue = [stats].<<SortColumn>> --Rev 01
    ,CurrentDBName = db.DBName --Rev 02.1
    ,CurrentDBID = DB_ID(db.DBName) --Rev 02.1
    ,ExecDBID = [txt].DBID --Rev 01 - Rev02.2
    ,ExecObjectID = [txt].ObjectID --Rev 01 - Rev02.2
    ,ExecDBName = DB_NAME([txt].DBID) --Rev 01 - Rev02.2
    ,ExecSchemaName = OBJECT_SCHEMA_NAME([txt].ObjectID, [txt].DBID) --Rev 01 - Rev02.2
    ,ExecObjectName = OBJECT_NAME([txt].ObjectID, [txt].DBID) --Rev 01 - Rev02.2
    ,StatementText = (SELECT REPLACE(CAST(''--'' + CHAR(10)
    + SUBSTRING(txt.Text
    ,[stats].Statement_Start_Offset/2+1
    ,(CASE [stats].Statement_End_Offset
    WHEN -1
    THEN DATALENGTH(txt.Text)
    ELSE [stats].Statement_End_Offset
    END)/2 - [stats].Statement_Start_Offset/2+1
    ) AS VARCHAR(MAX)),CHAR(0),'''')
    AS [processing-instruction(StatementTextSQL)] FOR XML PATH(''''), TYPE)
    ,FullText = (SELECT REPLACE(CAST(''--''+CHAR(10)+[txt].Text AS VARCHAR(MAX)),CHAR(0),'''')
    AS [processing-instruction(FullTextSQL)] FOR XML PATH(''''), TYPE)
    ,[plan].query_plan
    ,[stats].*
    FROM sys.dm_exec_query_stats [stats]
    OUTER APPLY sys.dm_exec_sql_text ([stats].sql_handle) [txt]
    OUTER APPLY sys.dm_exec_query_plan([stats].plan_handle) [plan]
    OUTER APPLY (SELECT DBName = DB_NAME(CONVERT(INT,value)) -- Rev 02.1
    FROM sys.dm_exec_plan_attributes([stats].plan_handle)
    WHERE attribute = ''dbid'') [db]
    WHERE [Stats].Last_Execution_Time >= ''<<LookBackDate>>''
    ORDER BY [stats].<<SortColumn>> DESC
    ;'
    ------- This is the other end of the REPLACE's
    ,'<<@pSortType>>',@pSortType)
    ,'<<@pRows>>',CAST(@pRows AS VARCHAR(10)))
    ,'<<LookBackDate>>', CAST(CASE WHEN @pSortType LIKE '%Recent%' THEN DATEADD(hh,-1,GETDATE()) ELSE '1753' END AS VARCHAR(20)))
    ,'<<SortColumn>>', CASE
    WHEN @pSortType LIKE '%Count%' THEN 'Execution_Count'
    WHEN @pSortType LIKE '%CPU%' THEN 'Total_Worker_Time'
    WHEN @pSortType LIKE '%Duration%' THEN 'Total_Elapsed_Time'
    WHEN @pSortType LIKE '%Reads%' THEN 'Total_Logical_Reads'
    WHEN @pSortType LIKE '%Physical%' THEN 'Total_Physical_Reads'
    WHEN @pSortType LIKE '%Writes%' THEN 'Total_Logical_Writes'
    END)
    ;
    --===== Produce the output according to what the @pDebug variable is set to
    IF @pDebug = 0 EXEC (@SQL);
    ELSE PRINT @SQL;

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column.
    "If you think its expensive to hire a professional to do the job, wait until you hire an amateur."--Red Adair
    "Change is inevitable... change for the better is not."

    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Brahmanand Shukla

    Right there with Babe

    Points: 778

Hi Jonathan - I have tried Task Manager and it has now totally confused me 🙁

Out of the roughly 26 GB configured for SQL Server, only 475 MB is shown as being used. Please help me figure out why, or maybe I'm not interpreting it the right way.

    Attaching the screenshot of SQL Server Memory Setting and Task Manager for your reference.

I can also see a few more applications/services running on the server, such as SQLAgent.exe, devenv.exe (this could be due to SSDT), a few services pertaining to antivirus software, etc.

I'm also going through the suggestions made by Jeff and shall soon review them and post my findings.
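A note on the Task Manager figure: if the service account has the Locked Pages in Memory privilege, most of SQL Server's buffer pool is excluded from the working set that Task Manager reports, so a tiny number for sqlservr.exe can be misleading. A sketch that shows where SQL Server's memory actually sits (SQL Server 2012+ column names):

```sql
-- Top memory clerks: where SQL Server's memory is actually allocated.
-- MEMORYCLERK_SQLBUFFERPOOL is normally the biggest consumer.
SELECT TOP (10)
        type,
        SUM(pages_kb) / 1024 AS memory_mb
FROM    sys.dm_os_memory_clerks
GROUP BY type
ORDER BY memory_mb DESC;
```

If the buffer pool clerk reports gigabytes while Task Manager reports megabytes, the discrepancy is in how Task Manager counts locked pages, not in SQL Server leaving memory unused.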

Attachments:
  • Jonathan AC Roberts

    SSCoach

    Points: 17204

When you say page file usage is up to 200% of the memory, are you talking about the size of the page file rather than the amount of the page file actually being used?

It is normal for the page file to be double the size of the memory, as Windows will write a copy of all the memory to the file as a crash dump if there is a system failure.

If you need the space on the hard drive, you can reduce the size of the page file from Windows System Properties. But if you don't need the disk space, I'd leave it as it is.

  • Brahmanand Shukla

    Right there with Babe

    Points: 778

    Jonathan AC Roberts wrote:

When you say page file usage is up to 200% of the memory, are you talking about the size of the page file rather than the amount of the page file actually being used?

A perfect and very valid question, but frankly speaking, I don't know the answer.

I'm referring to the query below to arrive at the percentage of page file used. See if you can point out something to clear up my doubt. Thanks for all your time, patience and effort!

    SELECT (total_page_file_kb / 1024) AS Total_Page_File_Size_In_MB
    , ((total_page_file_kb - available_page_file_kb) / 1024) AS Used_Page_File_Size_In_MB
    , (total_physical_memory_kb / 1024) AS Total_Physical_Memory_Size_In_MB
    , ((total_physical_memory_kb - available_physical_memory_kb) / 1024) AS Used_Physical_Memory_Size_In_MB
    , CEILING(((total_page_file_kb * 1.00) / (total_physical_memory_kb * 1.00)) * 100) AS [Percentage_Of_Total_Page_File_Vs_Used_Physical_Memory]
    , CEILING(((total_physical_memory_kb * 1.00 - available_physical_memory_kb * 1.00) / 1024)
    / ((total_page_file_kb * 1.00 - available_page_file_kb * 1.00) / 1024)) * 100 AS [Percentage_Of_Used_Page_File_Vs_Used_Physical_Memory]
    , system_memory_state_desc
    FROM sys.dm_os_sys_memory
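One thing worth checking when interpreting this DMV: total_page_file_kb in sys.dm_os_sys_memory is documented as the OS commit limit (physical memory plus page file), not the size of the page file alone, which by itself would make the "page file" figure look like roughly 200% of RAM. A sketch that makes the distinction explicit, assuming that reading of the column:

```sql
-- total_page_file_kb is the OS commit limit (RAM + page file),
-- so subtract physical memory to estimate the page file alone.
SELECT  total_physical_memory_kb / 1024                        AS physical_memory_mb,
        total_page_file_kb / 1024                              AS commit_limit_mb,
        (total_page_file_kb - total_physical_memory_kb) / 1024 AS approx_page_file_mb
FROM    sys.dm_os_sys_memory;
```

If approx_page_file_mb comes out close to physical_memory_mb, the 200% figure is simply total commit (RAM plus a RAM-sized page file), not evidence of heavy paging.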
