Identifying Queries Running Slower Than Normal

  • Here's a version with the divide-by-zero error removed and the indentation restored:

    --------------------------------------------------------------------------
    -- Purpose: Identify queries that are running slower than normal,
    -- when taking into account IO volumes.
    --------------------------------------------------------------------------
    ALTER PROC [dbo].[dba_QueryTimeDelta]
    AS
    BEGIN
        -- Do not lock anything, and do not get held up by any locks.
        SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED

        -- Identify queries running slower than normal.
        SELECT TOP(100)
            [Runs] = qs.execution_count,
            -- [Total time] = qs.total_worker_time - qs.last_worker_time,
            [Avg time] = (qs.total_worker_time - qs.last_worker_time) / (qs.execution_count - 1),
            [Last time] = qs.last_worker_time,
            [Time Deviation] = qs.last_worker_time
                - ((qs.total_worker_time - qs.last_worker_time) / (qs.execution_count - 1)),
            [% Time Deviation] =
                CASE WHEN qs.last_worker_time = 0 THEN 100
                     ELSE (qs.last_worker_time
                           - ((qs.total_worker_time - qs.last_worker_time) / (qs.execution_count - 1))) * 100
                END
                / NULLIF((qs.total_worker_time - qs.last_worker_time) / (qs.execution_count - 1), 0),
            [Last IO] = last_logical_reads + last_logical_writes + last_physical_reads,
            [Avg IO] = ((total_logical_reads + total_logical_writes + total_physical_reads)
                - (last_logical_reads + last_logical_writes + last_physical_reads))
                / (qs.execution_count - 1),
            [Individual Query] = SUBSTRING(qt.text, qs.statement_start_offset / 2,
                (CASE WHEN qs.statement_end_offset = -1
                      THEN LEN(CONVERT(NVARCHAR(MAX), qt.text)) * 2
                      ELSE qs.statement_end_offset
                 END - qs.statement_start_offset) / 2),
            [Parent Query] = qt.text,
            [DatabaseName] = DB_NAME(qt.dbid)
        INTO #SlowQueries
        FROM sys.dm_exec_query_stats qs
        CROSS APPLY sys.dm_exec_sql_text(qs.plan_handle) qt
        WHERE qs.execution_count > 1
        ORDER BY [% Time Deviation] DESC

        -- Calculate the [IO Deviation] and [% IO Deviation].
        -- Negative values mean we did less I/O than average.
        SELECT TOP(100)
            [Runs],
            [Avg time],
            [Last time],
            [Time Deviation],
            [% Time Deviation],
            [Last IO],
            [Avg IO],
            [IO Deviation] = [Last IO] - [Avg IO],
            [% IO Deviation] =
                CASE WHEN [Avg IO] = 0 THEN 0
                     ELSE ([Last IO] - [Avg IO]) * 100 / [Avg IO]
                END,
            [Individual Query],
            [Parent Query],
            [DatabaseName]
        INTO #SlowQueriesByIO
        FROM #SlowQueries
        ORDER BY [% Time Deviation] DESC

        -- Extract items where [% Time Deviation] less [% IO Deviation] is 'large'.
        -- These queries are slow running, even when we take into account IO deviation.
        SELECT TOP(100)
            [Runs],
            [Avg time],
            [Last time],
            [Time Deviation],
            [% Time Deviation],
            [Last IO],
            [Avg IO],
            [IO Deviation],
            [% IO Deviation],
            [Impedance] = [% Time Deviation] - [% IO Deviation],
            [Individual Query],
            [Parent Query],
            [DatabaseName]
        FROM #SlowQueriesByIO
        WHERE [% Time Deviation] - [% IO Deviation] > 20
        ORDER BY [Impedance] DESC

        -- Tidy up.
        DROP TABLE #SlowQueries
        DROP TABLE #SlowQueriesByIO
    END
    GO

  • Hi,

    Nice to hear some of you are enjoying the tool.

    If the database column value is NULL, it means someone is running ad hoc or prepared SQL statements (i.e. it is not a stored proc).

    You can discover all about the wonderful world of DMVs here:

    http://msdn.microsoft.com/en-gb/magazine/cc135978.aspx

    Unfortunately there is no SQL Server 2000 equivalent of DMVs.

    Ian
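
    As a minimal sketch of the NULL case described above (not part of the original utility), the missing database name can be labelled explicitly so ad hoc and prepared statements stand out in the results:

    ```sql
    -- Sketch only: label a NULL dbid (ad hoc or prepared SQL) instead of
    -- returning a NULL database name. Column aliases match the utility above.
    SELECT
        [DatabaseName] = ISNULL(DB_NAME(qt.dbid), '(ad hoc or prepared)'),
        [Parent Query] = qt.text
    FROM sys.dm_exec_query_stats qs
    CROSS APPLY sys.dm_exec_sql_text(qs.plan_handle) qt
    ```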

  • Thanks for the response.

    Is there something that can work in SQL 2000?

    I just need to find the stored procedures/queries that are running slowly in SQL 2000.

  • Hi,

    For SQL Server 2000, you can use SQL Server Trace (that contains a Duration column) to determine what SQL is running slowly (this is different from this article, which is about what is running unusually slowly).

    Lots of info on the Internet about this...

    Ian
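
    For example, if a server-side trace has already been captured to a file, its Duration column can be queried directly. This is a sketch only: the file path is a placeholder, and on SQL Server 2000 Duration is reported in milliseconds.

    ```sql
    -- Sketch: read an existing trace file and list the longest-running
    -- statements by Duration. The path below is hypothetical.
    SELECT TOP 100
        TextData,
        Duration,
        StartTime
    FROM ::fn_trace_gettable('C:\Traces\MyTrace.trc', DEFAULT)
    WHERE Duration IS NOT NULL
    ORDER BY Duration DESC
    ```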

  • The code posted is tough to read, but the attached Word document is very good. I think they just cut and pasted the Word text into the post, and that just does not work 🙂

    The real value is the discussion in the Word document, which would make a much better article than the posted code.

    Ian has done a nice job providing insight into a very useful DMV that I have not looked at before. I do have one caution on the subject. The average and delta that Ian calculates are a fairly simple view of performance change. The technique will not give an accurate indication of changes for queries that tend to have a highly variable runtime. For example, assume a stored procedure that checks a queue for work. Three or four times in five it finds nothing to do and runs in less than a tick. Occasionally it finds work and takes 10 to 15 ms. In a case like this the average is misleading.

    Ian's queries provide a nice starting point for monitoring queries, but you really need to add some historical and trend data for each query to get the whole picture.

    Thanks Ian for highlighting the DMV for me.
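
    One way to sanity-check this caution against the same DMV (a sketch, not part of the original utility) is to compare the minimum and maximum worker times that sys.dm_exec_query_stats already tracks; a large spread suggests the simple average may mislead:

    ```sql
    -- Sketch: flag statements whose CPU time varies widely between runs.
    -- The 10x spread threshold is an arbitrary illustration.
    SELECT TOP 100
        [Runs]     = qs.execution_count,
        [Min time] = qs.min_worker_time,
        [Max time] = qs.max_worker_time,
        [Avg time] = qs.total_worker_time / qs.execution_count
    FROM sys.dm_exec_query_stats qs
    WHERE qs.execution_count > 1
      AND qs.max_worker_time > qs.min_worker_time * 10
    ORDER BY qs.max_worker_time DESC
    ```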

  • I'm getting a different error. I copied the SQL from the Word doc into Notepad, then copied it from there into Management Studio, but I get the following error:

    Msg 102, Level 15, State 1, Procedure dba_QueryTimeDelta, Line 49

    Incorrect syntax near '.'.

    The line is: CROSS APPLY sys.dm_exec_sql_text(qs.plan_handle) qt

  • After adding linefeeds (so the first comment doesn't invalidate the entire query), adding the required whitespace between ORDER and BY, TOP and 100, and so on:

    BEGIN
        -- Do not lock anything, and do not get held up by any locks.
        SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED

        -- Identify queries running slower than normal.
        SELECT TOP 100
              [Runs] = qs.execution_count
            , [Total time] = qs.total_worker_time - qs.last_worker_time
            , [Avg time] = (qs.total_worker_time - qs.last_worker_time) / (qs.execution_count - 1)
            , [Last time] = qs.last_worker_time
            , [Time Deviation] = (qs.last_worker_time - ((qs.total_worker_time - qs.last_worker_time) / (qs.execution_count - 1)))
            , [% Time Deviation] =
                CASE WHEN qs.last_worker_time = 0
                     THEN 100
                     ELSE (qs.last_worker_time - ((qs.total_worker_time - qs.last_worker_time) / (qs.execution_count - 1))) * 100
                END
                -- NULLIF guards against a divide-by-zero when the average time is 0.
                / NULLIF((qs.total_worker_time - qs.last_worker_time) / (qs.execution_count - 1), 0)
            , [Last IO] = last_logical_reads + last_logical_writes + last_physical_reads
            , [Avg IO] = ((total_logical_reads + total_logical_writes + total_physical_reads) - (last_logical_reads + last_logical_writes + last_physical_reads)) / (qs.execution_count - 1)
            , [Individual Query] = SUBSTRING(qt.text, qs.statement_start_offset / 2
                , (CASE WHEN qs.statement_end_offset = -1
                        THEN LEN(CONVERT(NVARCHAR(MAX), qt.text)) * 2
                        ELSE qs.statement_end_offset
                   END - qs.statement_start_offset) / 2)
            , [Parent Query] = qt.text
            , [DatabaseName] = DB_NAME(qt.dbid)
        INTO #SlowQueries
        FROM sys.dm_exec_query_stats qs
        CROSS APPLY sys.dm_exec_sql_text(qs.plan_handle) qt
        WHERE qs.execution_count > 1
        ORDER BY [% Time Deviation] DESC

        -- Calculate the [IO Deviation] and [% IO Deviation].
        -- Negative values mean we did less I/O than average.
        SELECT TOP 100
              [Runs], [Avg time], [Last time], [Time Deviation], [% Time Deviation], [Last IO]
            , [Avg IO], [IO Deviation] = [Last IO] - [Avg IO]
            , [% IO Deviation] =
                CASE WHEN [Avg IO] = 0
                     THEN 0
                     ELSE ([Last IO] - [Avg IO]) * 100 / [Avg IO]
                END
            , [Individual Query], [Parent Query], [DatabaseName]
        INTO #SlowQueriesByIO
        FROM #SlowQueries
        ORDER BY [% Time Deviation] DESC

        -- Extract items where [% Time Deviation] less [% IO Deviation] is 'large'.
        -- These queries are slow running, even when we take into account IO deviation.
        SELECT TOP 100
              [Runs], [Avg time], [Last time], [Time Deviation], [% Time Deviation], [Last IO], [Avg IO], [IO Deviation]
            , [% IO Deviation], [Impedance] = [% Time Deviation] - [% IO Deviation], [Individual Query], [Parent Query], [DatabaseName]
        FROM #SlowQueriesByIO
        WHERE [% Time Deviation] - [% IO Deviation] > 20
        ORDER BY [Impedance] DESC

        -- Tidy up.
        DROP TABLE #SlowQueries
        DROP TABLE #SlowQueriesByIO
    END

    Neat info, thanks!

  • Hi Ray,

    Thanks for your feedback.

    About your caution… it should not be a problem here. This is because the utility is measuring, at the individual SQL statement level, the active work that has been done on the CPU, taking into account the amount of data processed (IO). If we were measuring at the stored procedure level and/or including waiting, blocking etc., or not taking into account the amount of data, then it would be a problem.

    Note, we are only measuring the work happening on the CPU (i.e. _worker_time, not _elapsed_time). Most query times appear variable because they are waiting variable amounts of time on other resources (these are excluded here); the actual time doing work (i.e. not waiting on a resource) is often fairly consistent.

    Also note that the utility takes into account the amount of data processed.

    To take into account concurrency, blocking etc, replace occurrences of the string _worker_time, with the string _elapsed_time in the utility. This will show the effect of concurrency, blocking etc on query times…

    Thanks again

    Ian
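
    For illustration, here is what that substitution looks like for the first few columns (a sketch only; the full utility would swap every occurrence of _worker_time the same way):

    ```sql
    -- Sketch of the suggested substitution: the same expressions with
    -- _worker_time replaced by _elapsed_time, so waits and blocking count.
    SELECT TOP 100
        [Runs] = qs.execution_count,
        [Avg elapsed]  = (qs.total_elapsed_time - qs.last_elapsed_time) / (qs.execution_count - 1),
        [Last elapsed] = qs.last_elapsed_time,
        [Elapsed Deviation] = qs.last_elapsed_time
            - ((qs.total_elapsed_time - qs.last_elapsed_time) / (qs.execution_count - 1))
    FROM sys.dm_exec_query_stats qs
    WHERE qs.execution_count > 1
    ORDER BY [Elapsed Deviation] DESC
    ```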

  • Okay, I found my problem with the syntax error. This was a new install of SQL Server 2005 Developer Edition, but somehow the master database had a compatibility level of 80. I changed it to 90 and restarted, so the syntax error went away, but now I've got another that is probably still due to some particularity with this instance.

    Msg 4104, Level 16, State 1, Procedure dba_QueryTimeDelta, Line 12

    The multi-part identifier "qs.plan_handle" could not be bound.

    Update - all is good. I found it and it runs ok now.

  • Pretty much useless. A waste of time, don't bother.

  • This looks really cool, and sounds interesting enough that I am going to attempt to look at the document.

    However, it would be really nice if we could get the contents of the document properly formatted and put in the article so we don't have to download it to read anything.

  • Hi Jereme,

    I agree with your thoughts completely. I think this article is a new and interesting look at a familiar problem. Unfortunately it has been let down by bad editing. Hopefully it will be fixed soon and we'll be able to see the whole article rather than some unintelligible, unformatted SQL…

    Ian

  • Would love to run this query for a problem we are having with timeouts on a particular table (we need to find the query responsible), but copy and paste makes such a MESS in Query Analyzer that it is pretty hard, if not impossible, to sort out unless you are a master SQL programmer. Not the least of the problems is that when you paste, it goes in as one big long string that you need to untangle yourself, and statements end up embedded after comments, so it is hard to tell where a comment ends and a statement begins. I am on 2005, but the site I need to run this on is 2000, which I read is not compatible anyway. 🙁 Even if I could sort out the jumble that pastes into Query Analyzer, there appear to be multiple syntax errors; for example, SELECTTOP should surely be two words, not one, yet it is one throughout the query example. That one I know how to fix, but the slew of other errors, such as (Incorrect syntax near 'UNCOMMITTED'.), I have no idea what to do with.

    From my perspective, it would have been great to have some instructions to go with this query (such as where to insert your own information, instead of having to guess) and a way to get it into Query Analyzer in a proper structure so it made sense.

  • Try copying and pasting the code from the attached .doc.

    I pasted that into SQL 2005's query screen and it formatted and ran perfectly the first time.

  • Thanks! I copied the code from a post above; that is much better. Maybe I can get it to run from here.
