
Log Files from Different Sources


Well, it has been quite a pause between articles, and it is well past time to continue. With such a delay, it is prudent to remind everyone that this is part of a series on Extended Events that I wrote over the course of two months. The complete table of contents for this series can be found here. Truth be told, I had intended to restart this series as my usual 12 Days of Christmas series. Sadly, I am a bit behind on that, so some days there may be multiple posts to help catch up.

As was the case throughout the series, I intend over the next few articles to continue producing building blocks that may be helpful for later articles or concepts. With that in mind, the topic chosen for today is very basic in nature. While basic, it is a rather important topic.

While working with Extended Events, or with any sort of logging or tracing utility, it becomes critical to be able to consume data from disparate sources, or to consume data captured on one server from an entirely different server. The question becomes: how does one consume that data?

Getting Sticky

When it comes to working with log files from different sources, as with most tasks within SQL Server, there are two general methods. The first is of the type frequently frowned upon by the more senior database professionals, sysadmins, and so forth: the graphical user interface, or GUI. This is a point-and-click method, and one that I generally shy away from for most tasks. That said, this particular task is easily performed through the GUI and is reasonably safe.

The first step is illustrated with the following image:

[Image: XE_merge – navigating the File menu to "Merge Extended Event Files"]

From within Management Studio, navigate the File menu as shown in the preceding image. After clicking "Merge Extended Event Files", a dialog will open, as shown in the background of the next image:

[Image: XE_merge_opendialog – the "Merge Extended Event Files" window with the file-browse dialog open]

Clicking Add in the "Merge Extended Event Files" window opens a dialog that permits you to navigate to the location of the XEL files that need to be opened and merged. Once located, a single file or multiple files can be selected. One should proceed carefully here: opening too many files, or files that are too large, can consume a great deal of memory and leave SSMS sluggish or unresponsive.

Once the file(s) are open, a grid-like window will open in SSMS that looks like the following:

[Image: merge_gui_display – the merged events displayed in the SSMS grid]

From here, one can manipulate the display into something more suitable to his/her preferences (e.g. filtering or column layout). I will not cover that here; I will save the display customization for another time. That aside, I have just shown how easy it is to look into an Extended Event log file through the use of the GUI.

Not So Sticky

For the less sticky (less GUI) method, I have shown a similar routine in the past that lays the groundwork for how to do this via script. You can read a little about that here. Building on that method, there is one tweak to be made: the path to the XEL files being merged must be named specifically rather than built dynamically.

Using the same session files as shown in the GUI examples, I will show how to proceed with the merge via script.
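One quick note before the script: the INSERT below targets a staging table named DBA.dbo.SlowQueriesXE, which is assumed to already exist. Here is a minimal sketch of a table shaped to match that INSERT; the column names and data types are my own assumptions, so adjust them to your environment:

USE DBA;
GO

-- Hypothetical staging table matching the four columns of the INSERT below
IF OBJECT_ID('dbo.SlowQueriesXE') IS NULL
CREATE TABLE dbo.SlowQueriesXE
(
      event_data  XML           NULL  -- raw event payload from the XEL file
    , file_offset BIGINT        NULL  -- offset of the event within the source file
    , file_name   NVARCHAR(260) NULL  -- source XEL file name
    , predicate   NVARCHAR(MAX) NULL  -- loaded as '' by the script below
);
GO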

/* XEL and XEM files are both required for 2008 and R2, even if using SQL 2012
   or later to interpret the data.
   These files were copied down from the remote server. */
INSERT INTO DBA.dbo.SlowQueriesXE
SELECT CAST([t2].[event_data] AS XML) AS event_data
     , t2.file_offset
     , t2.file_name
     , ''
FROM sys.fn_xe_file_target_read_file(
      -- path: XEL = Extended Event log
      'G:\Database\XE\XE\SlowQueries*.xel'
      -- mdpath: XEM = Extended Event metadata
    , 'G:\Database\XE\XE\SlowQueries*.xem'
      -- initial_file_name
    , NULL
      -- initial_offset
    , NULL) t2;
GO

This script lays out a rather simple process: load the log data into a table (in a database of your choosing) and then, once loaded, query the data to better understand what has been captured. In the preceding script I left a few notes. It is important to understand that when pulling in logs from a SQL 2008 or R2 instance, the XEM metadata file must be included (as I have done for this particular example); from SQL 2012 onward, the metadata is embedded in the XEL file itself.
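Before parsing, a quick sanity check can confirm that the load picked up the expected files. A hypothetical example against the staging table sketched earlier:

-- How many events were loaded, and from which source files?
SELECT x.file_name
     , COUNT(*) AS event_count
FROM DBA.dbo.SlowQueriesXE x
GROUP BY x.file_name
ORDER BY x.file_name;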

To parse the data into a friendly format that I can use for analysis, I would do something like the following:

SELECT pars.*, map.map_value
FROM (
    SELECT x.event_data.value('(event/@name)[1]', 'varchar(50)') AS event_name
         --, x.event_data.value('(@package)[1]', 'varchar(50)') AS package_name
         , DATEADD(hh, DATEDIFF(hh, GETUTCDATE(), CURRENT_TIMESTAMP),
               x.event_data.value('(event/@timestamp)[1]', 'datetime2')) AS [timestamp]
         , x.event_data.value('(event/action[@name="sql_text"]/value)[1]', 'varchar(max)') AS sql_text
         , x.event_data.value('(event/action[@name="session_id"]/value)[1]', 'varchar(max)') AS session_id
         , x.file_offset
         , x.predicate --, x.file_name
         , x.event_data.value('(event/data[@name="cpu_time"]/value)[1]', 'bigint') AS cpu_time
         , x.event_data.value('(event/data[@name="duration"]/value)[1]', 'bigint') AS duration
         , x.event_data.value('(event/data[@name="physical_reads"]/value)[1]', 'bigint') AS physical_reads
         , x.event_data.value('(event/data[@name="logical_reads"]/value)[1]', 'bigint') AS logical_reads
         , x.event_data.value('(event/data[@name="writes"]/value)[1]', 'bigint') AS writes
         , x.event_data.value('(event/data[@name="row_count"]/value)[1]', 'bigint') AS row_count
         , x.event_data.value('(event/data[@name="result"]/value)[1]', 'varchar(max)') AS result
         -- the tsql_stack paths must be rooted at event/action, just like the actions above
         , x.event_data.value('(event/action[@name="tsql_stack"]/value/frames/frame/@handle)[1]', 'varchar(max)') AS stackhandle
         , x.event_data.value('(event/action[@name="tsql_stack"]/value/frames/frame/@offsetStart)[1]', 'bigint') AS stackoffset
         , x.event_data.value('(event/action[@name="tsql_stack"]/value/frames/frame/@offsetEnd)[1]', 'bigint') AS stackoffsetend
         --, x.event_data
    FROM DBA.dbo.SlowQueriesXE x
) pars
LEFT OUTER JOIN (
    SELECT oc.object_name AS EventName
         , oc.name AS ColName
         , mv.name AS MapName
         , mv.map_key
         , mv.map_value
    FROM sys.dm_xe_map_values mv
    INNER JOIN sys.dm_xe_object_columns oc
        ON mv.name = oc.type_name
       AND mv.object_package_guid = oc.object_package_guid
    WHERE oc.object_name = 'sql_batch_completed'
      AND oc.column_type <> 'readonly'
) map
    ON pars.result = map.map_key;

From here it is really easy to add or remove columns, or to re-order them into a friendlier format for the person reviewing the data. This is incredibly easy – especially given the appropriate recipe / script / process. I would also venture that this method requires fewer resources and leaves less chance of error than the GUI.
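For instance, to float the slowest batches to the top without running the full parse, a cut-down query can pull just the timestamp, duration, and statement text straight from the staging table. This is a sketch under the same table assumptions as above:

-- Slowest batches first (duration is reported in microseconds)
SELECT TOP (25)
       x.event_data.value('(event/@timestamp)[1]', 'datetime2') AS [timestamp]
     , x.event_data.value('(event/data[@name="duration"]/value)[1]', 'bigint') AS duration
     , x.event_data.value('(event/action[@name="sql_text"]/value)[1]', 'varchar(max)') AS sql_text
FROM DBA.dbo.SlowQueriesXE x
ORDER BY duration DESC;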

Enjoy evaluating and merging these XEL files from different sources. It should be fun!

This has been another article in the 60 Days of XE series. If you have missed any of the articles, or just want a refresher, check out the TOC.

 
