SQLServerCentral Article

Obtaining Query Execution Plans Through SQL Profiler Traces


Clients often ask me how they can go about optimizing their stored procedures, and during our discussions we usually end up talking about using execution plans to aid in that optimization. Most of my clients are aware that you can obtain execution plans through Query Analyzer. What most of them do not know is that you can utilize SQL Profiler to obtain the execution plans of queries running on a particular system. The problem is, once you obtain a SQL Profiler trace file, how do you weed through all the information found in that file to return just the execution plans and the query associated with each plan?

This article shows a simple way to associate execution plans found in SQL Profiler traces with the statements that generated them. While the following scripts are not a 100% solution for obtaining every single association in a trace file (some adjustments may be needed now and then), they capture the vast majority of the associations and will go a long way toward helping you obtain the information you need to tune your procedures.

While you can use the SQL Profiler GUI to capture your traces, I usually use a script to run the trace without the GUI. This server-side method lessens the impact of tracing on the system and may make the difference between being able to obtain a trace on an extremely active system or not. While there is nothing special about the trace I executed, you do need to make sure you capture the execution plans by including the following code in your script.

exec sp_trace_setevent @traceid, 68, 1, @on --Execution Plan, TextData
exec sp_trace_setevent @traceid, 68, 2, @on --Execution Plan, BinaryData
exec sp_trace_setevent @traceid, 68, 3, @on --Execution Plan, DatabaseID
exec sp_trace_setevent @traceid, 68, 4, @on --Execution Plan, TransactionID
exec sp_trace_setevent @traceid, 68, 6, @on --Execution Plan, NTUserName
exec sp_trace_setevent @traceid, 68, 7, @on --Execution Plan, NTDomainName
exec sp_trace_setevent @traceid, 68, 8, @on --Execution Plan, ClientHostName
exec sp_trace_setevent @traceid, 68, 9, @on --Execution Plan, ClientProcessID
exec sp_trace_setevent @traceid, 68, 10, @on --Execution Plan, ApplicationName
exec sp_trace_setevent @traceid, 68, 11, @on --Execution Plan, SQLSecurityLoginName
exec sp_trace_setevent @traceid, 68, 12, @on --Execution Plan, SPID
exec sp_trace_setevent @traceid, 68, 13, @on --Execution Plan, Duration
exec sp_trace_setevent @traceid, 68, 14, @on --Execution Plan, StartTime
exec sp_trace_setevent @traceid, 68, 15, @on --Execution Plan, EndTime
exec sp_trace_setevent @traceid, 68, 16, @on --Execution Plan, Reads
exec sp_trace_setevent @traceid, 68, 17, @on --Execution Plan, Writes
exec sp_trace_setevent @traceid, 68, 18, @on --Execution Plan, CPU
exec sp_trace_setevent @traceid, 68, 21, @on --Execution Plan, EventSubClass
exec sp_trace_setevent @traceid, 68, 22, @on --Execution Plan, ObjectID
exec sp_trace_setevent @traceid, 68, 25, @on --Execution Plan, IntegerData
exec sp_trace_setevent @traceid, 68, 27, @on --Execution Plan, EventClass

Full script can be found here: create_script.txt.
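The sp_trace_setevent calls above sit inside a larger script that creates and starts the trace. In case the linked script is unavailable, here is a minimal sketch of that surrounding boilerplate; the file path, maximum file size, and database ID are placeholder values you will want to change for your system.

```sql
-- Minimal server-side trace skeleton (a sketch; adjust paths and limits).
declare @traceid int, @maxfilesize bigint, @on bit, @rc int
set @maxfilesize = 100        -- maximum trace file size in MB
set @on = 1

-- Create the trace definition. Option 2 (TRACE_FILE_ROLLOVER) starts a new
-- file when the current one reaches @maxfilesize. Note: do not include the
-- .trc extension in the path; SQL Server appends it automatically.
exec @rc = sp_trace_create @traceid output, 2,
     N'C:\SQLProfilerTrace', @maxfilesize, NULL
if @rc <> 0 return

-- ... the sp_trace_setevent calls from above go here ...

-- Optionally filter to a single database
-- (column 3 = DatabaseID, logical operator 0 = AND, comparison 0 = equal;
-- the value 5 is a placeholder database ID).
exec sp_trace_setfilter @traceid, 3, 0, 0, 5

-- Start the trace and report its ID so you can stop it later.
exec sp_trace_setstatus @traceid, 1
select @traceid as TraceID
```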

Once you have started your trace and waited a sufficient amount of time based on the activity of your system, you can use the following small script to stop it.

use master
-- Declare and set trace ID
declare @TraceID int
set @TraceID = 1
-- stop trace
exec sp_trace_setstatus @TraceID, 0
-- close trace file and delete the trace definition from the server
exec sp_trace_setstatus @TraceID, 2
print 'TraceID stopped'

Once the trace file has been obtained, you can use the fn_trace_gettable function to load the file or files into a series of tables to facilitate mining the data.

SELECT IDENTITY(INT, 1, 1) AS trc_id, *
INTO Trace_Tables.dbo.trace_TRC1 --Increment number if multiple traces exist
FROM ::fn_trace_gettable('C:\SQLProfilerTrace_SCALABIL-_20050331151958.trc', default)
--Change path if multiple traces exist

Or you can use a slightly more complex script that creates the database and loads one or more trace files into separate tables for you: load_trace.txt
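If you captured several rollover files, the idea behind that script is simply to loop over them. Here is a sketch of that approach using dynamic SQL to load each file into its own numbered table; the file names and file count below are assumptions for illustration, not the linked script itself.

```sql
-- Sketch: load a series of rollover trace files into numbered tables.
-- Assumes files named C:\MyTrace.trc, C:\MyTrace_1.trc, ... (adjust to yours).
declare @i int, @file nvarchar(256), @sql nvarchar(1000)
set @i = 0
while @i <= 2   -- number of rollover files you captured
begin
    set @file = case when @i = 0 then N'C:\MyTrace.trc'
                     else N'C:\MyTrace_' + cast(@i as nvarchar(10)) + N'.trc' end
    -- Pass 1 as the second argument so fn_trace_gettable reads only this
    -- file; passing default would follow all rollover files automatically.
    set @sql = N'SELECT IDENTITY(INT, 1, 1) AS trc_id, * ' +
               N'INTO Trace_Tables.dbo.trace_TRC' + cast(@i + 1 as nvarchar(10)) +
               N' FROM ::fn_trace_gettable(''' + @file + N''', 1)'
    exec sp_executesql @sql
    set @i = @i + 1
end
```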

Once the trace file has been loaded into a table (or tables), a small script can be used to associate each query with its execution plan: find the execution plan, then find the preceding statement for the same SPID. This method works the majority of the time, since the execution plan is usually the next event captured for a particular SPID in the trace. There may be multiple rows of information between the query and its execution plan, but you will notice that those rows all belong to different SPIDs, which lets you associate statements by SPID together with the IDENTITY column we created when loading the trace file into a table.

USE Trace_Tables

DECLARE @executiontree TABLE (id INTEGER IDENTITY(1,1), idnumber INTEGER, spid INTEGER)
DECLARE @rowcounts INTEGER, @trc_id INTEGER, @spid INTEGER

INSERT INTO @executiontree (idnumber, spid)
SELECT trc_id, SPID FROM trace_TRC1
WHERE TextData LIKE '%Execution Tree%'

SET @rowcounts = @@ROWCOUNT

WHILE @rowcounts > 0
BEGIN
    SELECT @trc_id = idnumber, @spid = spid
    FROM @executiontree WHERE id = @rowcounts

    -- The statement that generated the plan: the closest
    -- preceding row in the trace for the same SPID
    SELECT TOP 1 TextData FROM trace_TRC1
    WHERE trc_id < @trc_id
    AND SPID = @spid
    ORDER BY trc_id DESC

    -- The execution plan itself
    SELECT TextData FROM trace_TRC1
    WHERE trc_id = @trc_id

    SET @rowcounts = @rowcounts - 1
END

As I said earlier, this is not really a complex process once you utilize the SPID column of a trace file to associate queries with their execution plans. You might have to tweak the scripts a bit for your environment, but there is enough information in this little article to help you start the process and obtain the maximum benefit from your SQL Profiler traces.

About the author

I started working with SQL Server in the mid 1990's after spending time as both a Visual Basic developer and a Microsoft Access developer. Numerous projects upsizing Access to SQL Server led me to become a full-time SQL Server DBA, and I have remained one ever since.

I have a large variety of experience with SQL Server, Oracle, Sybase, DB2, Access, and other database platforms over the years, and have worked with environments that had over 30 terabytes of data as well as environments that had hundreds of databases with only a few megabytes of data. During this time I have worked as a production DBA, a development DBA, a mixture of both, and a database architect, and have enjoyed all of those positions.

You can find out more about me and my work on one of my web sites:





