
SQL Server: Part 4: Approaching Database Server Performance Issue


In the last three parts, we discussed different queries that can be used to list the current state of the database server. In this post, let us discuss listing execution statistics from the plan cache.

The query below will help us list the CPU-intensive queries from your plan cache.

/*****************************************************************************************
List heavy queries based on CPU/IO. Change the ORDER BY clause appropriately.
******************************************************************************************/

SELECT TOP 20
DB_NAME(qt.dbid) AS DatabaseName
,DATEDIFF(MI,creation_time,GETDATE()) AS [Age of the Plan(Minutes)]
,last_execution_time AS [Last Execution Time]
,qs.execution_count AS [Total Execution Count]
,CAST((qs.total_elapsed_time) / 1000000.0 AS DECIMAL(28,2)) [Total Elapsed Time(s)]
,CAST((qs.total_elapsed_time ) / 1000000.0/ qs.execution_count AS DECIMAL(28, 2)) AS [Average Execution time(s)]
,CAST((qs.total_worker_time) / 1000000.0 AS DECIMAL(28,2)) AS [Total CPU time (s)]
,CAST(qs.total_worker_time * 100.0 / qs.total_elapsed_time AS DECIMAL(28,2)) AS [% CPU]
,CAST((qs.total_elapsed_time - qs.total_worker_time)* 100.0 /qs.total_elapsed_time AS DECIMAL(28, 2)) AS [% Waiting]
,CAST((qs.total_worker_time) / 1000000.0/ qs.execution_count AS DECIMAL(28, 2)) AS [CPU time average (s)]
,CAST(qs.total_physical_reads * 1.0 / qs.execution_count AS DECIMAL(28, 2)) AS [Avg Physical Read]
,CAST(qs.total_logical_reads * 1.0 / qs.execution_count AS DECIMAL(28, 2)) AS [Avg Logical Reads]
,CAST(qs.total_logical_writes * 1.0 / qs.execution_count AS DECIMAL(28, 2)) AS [Avg Logical Writes]
,max_physical_reads
,max_logical_reads
,max_logical_writes
,SUBSTRING(qt.TEXT, (qs.statement_start_offset / 2) + 1,
    ((CASE WHEN qs.statement_end_offset = -1
           THEN LEN(CONVERT(NVARCHAR(MAX), qt.TEXT)) * 2
           ELSE qs.statement_end_offset
      END - qs.statement_start_offset) / 2) + 1) AS [Individual Query]

, qt.TEXT AS [Batch Statement]
, qp.query_plan
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS qt
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) qp
WHERE qs.total_elapsed_time > 0
ORDER BY 
[Total CPU time (s)] 
--[Avg Physical Read]
--[Avg Logical Reads]
--[Avg Logical Writes]
--[Total Elapsed Time(s)]
--[Total Execution Count]
DESC
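
To rank by a different category, move the comment markers in the ORDER BY clause so that exactly one column stays active. The result can also be narrowed to a single database; a minimal sketch of such a variation, assuming a hypothetical database named SalesDB and ranking by average logical reads:

-- Assumption: SalesDB is a placeholder database name.
-- Same DMVs as above, filtered to one database and ordered by average logical reads.
SELECT TOP 20
     DB_NAME(qt.dbid) AS DatabaseName
    ,qs.execution_count AS [Total Execution Count]
    ,CAST(qs.total_logical_reads * 1.0 / qs.execution_count AS DECIMAL(28, 2)) AS [Avg Logical Reads]
    ,qt.TEXT AS [Batch Statement]
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS qt
WHERE qs.total_elapsed_time > 0
  AND DB_NAME(qt.dbid) = 'SalesDB'
ORDER BY [Avg Logical Reads] DESC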

Let us discuss the output columns.

Column Name: Description

DatabaseName: Database context of the execution plan.
Age of the Plan(Minutes): Age of the plan in the plan cache, in minutes.
Last Execution Time: Date and time of the last execution of this plan.
Total Execution Count: Total number of executions since the plan was last compiled, i.e. within the last [Age of the Plan(Minutes)].
Total Elapsed Time(s): Total time, in seconds, taken to execute this plan [Total Execution Count] times.
Average Execution time(s): Average time, in seconds, taken for a single execution of this plan.
Total CPU time (s): Total CPU time, in seconds, taken to execute this plan [Total Execution Count] times.
% CPU: Percentage of CPU time relative to [Total Elapsed Time(s)].
% Waiting: Percentage of time spent waiting for resources, relative to [Total Elapsed Time(s)].
CPU time average (s): Average CPU time, in seconds, used for a single execution.
Avg Physical Read: Average number of physical reads for a single execution.
Avg Logical Reads: Average number of logical reads for a single execution.
Avg Logical Writes: Average number of logical writes for a single execution.
max_physical_reads: Maximum number of physical reads this plan has ever performed during a single execution.
max_logical_reads: Maximum number of logical reads this plan has ever performed during a single execution.
max_logical_writes: Maximum number of logical writes this plan has ever performed during a single execution.
Individual Query: The individual statement within the batch.
Batch Statement: The full text of the batch.
query_plan: The execution plan in XML. Clicking it opens the graphical execution plan.
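
The difference between [Individual Query] and [Batch Statement] is worth spelling out: sys.dm_exec_query_stats keeps one row per statement in a cached batch, and the statement_start_offset/statement_end_offset values are what the SUBSTRING in the query above uses to carve the single statement out of the full batch text. A rough illustration, using a made-up procedure:

-- Hypothetical procedure: after it runs, each SELECT below gets its own row in
-- sys.dm_exec_query_stats. [Individual Query] shows that one SELECT, while
-- [Batch Statement] shows the full procedure text for both rows.
CREATE PROCEDURE dbo.usp_ExampleTwoStatements
AS
BEGIN
    SELECT COUNT(*) FROM sys.objects;
    SELECT TOP (10) name FROM sys.all_columns ORDER BY name;
END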

I usually analyse the first five to ten records (in each category, by changing the ORDER BY clause) to review how the procedures are implemented. Most of the time I find something wrong with the procedure, such as unnecessary temp table usage, DISTINCT clauses, cursors, tables joined without proper join conditions, or missing indexes.

The other issue that commonly shows up is an enormous number of calls to the database for a single procedure (even though the CPU cost and execution time of each call might be small). This usually points to a wrong implementation, which should be fixed by discussing it with the development team; often the fix is to add some type of caching in the application. Sometimes the calls to the database are only there to check whether the result data has changed, for example, when new records in a table should be processed immediately. To achieve this, the application might be querying the table for unprocessed records many times a second. This can be fixed by having the application that inserts the data make some kind of asynchronous call to the consuming application, or by implementing notification events using SqlDependency, available in the .NET Framework.
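
As a side note, SqlDependency notifications rely on Service Broker being enabled in the database that is being watched. A minimal sketch of checking and enabling it, assuming a placeholder database named YourAppDb:

-- Assumption: YourAppDb is a placeholder for the database the application queries.
-- SqlDependency (query notifications) requires Service Broker to be enabled there.
SELECT name, is_broker_enabled
FROM sys.databases
WHERE name = 'YourAppDb';

-- Enable Service Broker if it is not already on. ROLLBACK IMMEDIATE disconnects
-- other sessions and rolls back their open transactions so the option can be set.
ALTER DATABASE YourAppDb SET ENABLE_BROKER WITH ROLLBACK IMMEDIATE;

The application then registers a SqlDependency on the query it cares about instead of polling the table.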

If you liked this post, do like my page on Facebook.
