Want to know top 10 high memory queries

  • Hi,

Recently, we got high-memory incidents from our monitoring tool. I have lowered max server memory for the time being.

    Using SQL Server 2016 Enterprise Edition.

From Task Manager I can see that sqlservr.exe is the one taking up 85% of memory (i.e., the working set value).

Now I want to know, within SQL Server, which queries are taking up the most memory.

I used sp_WhoIsActive, but I am seeing "used_memory" column values like 2, 3, 9, 79, 243, 4,221. I am not sure what the value indicates — is it a number of 8K pages?

Is there a proper way to tell which statements are taking the most memory in MB/GB? I want to store them in a table so the information can be shared with developers.

    Thank You!

Try taking a look at these two management views.  Change the ORDER BY in your queries to get the highest use per I/O, CPU, or memory:

    sys.dm_exec_query_stats

    sys.dm_exec_procedure_stats



Memory used is not a particularly useful metric in SQL Server.  SQL Server grabs as much memory as it can and never releases it, so that it can keep as much data as possible in memory.

  • DesNorton wrote:

Try taking a look at these two management views.  Change the ORDER BY in your queries to get the highest use per I/O, CPU, or memory: sys.dm_exec_query_stats

    sys.dm_exec_procedure_stats

    Just curious... I don't see how either of those relate to actual memory used.  Logical READs <> Memory used.  I can write a query against a Tally Table (for example) that will consume millions of reads even though the table itself is only 19 pages long.  If it's already in memory, the physical reads will be 0.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Jeff Moden wrote:

    DesNorton wrote:

Try taking a look at these two management views.  Change the ORDER BY in your queries to get the highest use per I/O, CPU, or memory: sys.dm_exec_query_stats

    sys.dm_exec_procedure_stats

    Just curious... I don't see how either of those relate to actual memory used.  Logical READs <> Memory used.  I can write a query against a Tally Table (for example) that will consume millions of reads even though the table itself is only 19 pages long.  If it's already in memory, the physical reads will be 0.


So, if I had to troubleshoot high memory, what queries should I be tuning, or what things should I be looking at?

  • Jeff Moden wrote:

    DesNorton wrote:

Try taking a look at these two management views.  Change the ORDER BY in your queries to get the highest use per I/O, CPU, or memory: sys.dm_exec_query_stats

    sys.dm_exec_procedure_stats

    Just curious... I don't see how either of those relate to actual memory used.  Logical READs <> Memory used.  I can write a query against a Tally Table (for example) that will consume millions of reads even though the table itself is only 19 pages long.  If it's already in memory, the physical reads will be 0.

I am not a DBA, so I never get to look at this stuff.

That said, if I were looking for high memory usage, I would look at the *_grant_kb and *_used_grant_kb columns.

The queries would look something like this:

    SELECT TOP ( 10 )
           dbname           = DB_NAME( qt.dbid )
         , qt.objectid
         , qs.execution_count
         , query_text       = SUBSTRING( qt.text
                                       , qs.statement_start_offset / 2 + 1
                                       , ( CASE
                                               WHEN qs.statement_end_offset = -1 THEN LEN( CONVERT( nvarchar(MAX), qt.text )) * 2
                                               ELSE qs.statement_end_offset
                                           END - qs.statement_start_offset ) / 2 )
         -- CPU TIME - in microseconds (but only accurate to milliseconds)
         , avg_worker_time  = qs.total_worker_time / qs.execution_count
         , qs.total_worker_time, qs.last_worker_time, qs.min_worker_time, qs.max_worker_time
         -- DURATION - in microseconds (but only accurate to milliseconds)
         , avg_elapsed_time = qs.total_elapsed_time / qs.execution_count
         , qs.total_elapsed_time, qs.last_elapsed_time, qs.min_elapsed_time, qs.max_elapsed_time
         -- MEMORY GRANTS - in KB
         , qs.total_grant_kb, qs.last_grant_kb, qs.min_grant_kb, qs.max_grant_kb
         , qs.total_used_grant_kb, qs.last_used_grant_kb, qs.min_used_grant_kb, qs.max_used_grant_kb
    FROM sys.dm_exec_query_stats AS qs
    CROSS APPLY sys.dm_exec_sql_text( qs.sql_handle ) AS qt
    ORDER BY qs.total_grant_kb DESC       -- Total GRANT memory
    --ORDER BY qs.total_worker_time DESC  -- Total CPU Time
    --ORDER BY qs.total_elapsed_time DESC -- Total query DURATION
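    Since the OP also wants to store the results in a table to share with developers, the output above could be captured on a schedule. A minimal sketch, assuming a utility database named DBA exists (the database and table names are placeholders, not from the thread):

    ```sql
    -- Hypothetical capture table; DBA is an assumed utility database
    CREATE TABLE DBA.dbo.TopMemoryGrants
    (
        capture_time      datetime2(0)  NOT NULL DEFAULT SYSDATETIME()
      , dbname            sysname       NULL
      , query_text        nvarchar(MAX) NULL
      , execution_count   bigint        NULL
      , max_grant_kb      bigint        NULL
      , max_used_grant_kb bigint        NULL
    );

    -- Snapshot the current top 10 statements by maximum memory grant
    INSERT INTO DBA.dbo.TopMemoryGrants
            ( dbname, query_text, execution_count, max_grant_kb, max_used_grant_kb )
    SELECT TOP ( 10 )
            DB_NAME( qt.dbid )
          , qt.text
          , qs.execution_count
          , qs.max_grant_kb
          , qs.max_used_grant_kb
    FROM    sys.dm_exec_query_stats AS qs
    CROSS APPLY sys.dm_exec_sql_text( qs.sql_handle ) AS qt
    ORDER BY qs.max_grant_kb DESC;
    ```

    Running the INSERT from an Agent job would build up a history over time; note that sys.dm_exec_query_stats is cleared on instance restart, so gaps are possible.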
  • bobrooney.81 wrote:

    Jeff Moden wrote:

    DesNorton wrote:

Try taking a look at these two management views.  Change the ORDER BY in your queries to get the highest use per I/O, CPU, or memory: sys.dm_exec_query_stats

    sys.dm_exec_procedure_stats

    Just curious... I don't see how either of those relate to actual memory used.  Logical READs <> Memory used.  I can write a query against a Tally Table (for example) that will consume millions of reads even though the table itself is only 19 pages long.  If it's already in memory, the physical reads will be 0.


So, if I had to troubleshoot high memory, what queries should I be tuning, or what things should I be looking at?


Once again, why have you determined that this is a problem that needs troubleshooting?

  • I've not had to do such a thing because I've never had the same problem.  If I did, I'd probably start with the following search...

    https://duckduckgo.com/?q=How+to+determine+which+databases+are+consuming+th+most+buffer+memory+in+SQL+Server


    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Maybe a bit more directly....

    https://www.google.com/search?q=how+to+find+queries+taking+the+most+memory+in+sql+server
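    That search typically surfaces queries against sys.dm_os_buffer_descriptors, which records which database each buffered page belongs to. A hedged sketch of one common pattern (not from the thread; it can be slow on servers with very large buffer pools):

    ```sql
    -- Approximate buffer pool usage per database (each buffered page is 8 KB)
    SELECT  dbname    = CASE bd.database_id
                             WHEN 32767 THEN 'ResourceDb'
                             ELSE DB_NAME( bd.database_id )
                        END
          , buffer_mb = COUNT(*) * 8 / 1024
    FROM    sys.dm_os_buffer_descriptors AS bd
    GROUP BY bd.database_id
    ORDER BY buffer_mb DESC;
    ```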


    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

I think you are wasting your time.  SQL Server should use all the memory made available to it.  You WANT it to use the available memory.  I would be concerned if it dropped below 100% CPU usage on a database server.

    Michael L John
    If you assassinate a DBA, would you pull a trigger?
    To properly post on a forum:
    http://www.sqlservercentral.com/articles/61537/
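  • Following on from the point above, before tuning anything it may be worth confirming whether the instance is actually under memory pressure rather than simply holding its configured target. A quick hedged check against sys.dm_os_process_memory:

    ```sql
    -- The *_low flags are set when the OS signals low memory;
    -- high usage alone does not indicate real pressure
    SELECT  physical_memory_in_use_mb = physical_memory_in_use_kb / 1024
          , memory_utilization_percentage
          , process_physical_memory_low
          , process_virtual_memory_low
    FROM    sys.dm_os_process_memory;
    ```

    If both *_low flags are 0, the 85% working set seen in Task Manager is likely just SQL Server using the memory it was given.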
