
T-SQL Tuesday – Powershell and Perfmon Results – #TSQL2sDay

This month, Wayne Sheffield (Blog | @DBAWayne) is running T-SQL Tuesday on PowerShell.  T-SQL Tuesday is a monthly blogging event where a number of us in the SQL Server community get together to blog on a central topic.  The topic chosen for this month was PowerShell.  More specifically:

“Blog about using PowerShell for doing, well, anything as long as it is related to SQL Server or the computer that SQL Server runs on. Scripts are encouraged, but not required.”

Performance Monitor Outputs

Before I go into the script I want to show in this post, let’s first talk about the output formats from Performance Monitor, or perfmon, and the problems associated with them.  There are four output types available from perfmon: comma-separated file, tab-separated file, SQL Server table, and binary file.

With comma-separated and tab-separated files, each row of the file contains a snapshot of the counters for a point in time.  If the collector is collecting data every 15 seconds, then a new line is added every 15 seconds.  Every column in the file represents one of the counters being collected.  The main issue with these two formats is that whenever a new counter is added, the process being used to consume the counters needs to be changed and the columns remapped to accommodate the additional information.  This can easily become a big pain.

The SQL Server table format inserts a row for each time period collected into a table in SQL Server.  Like the other output formats, if the data is being collected every 15 seconds, then a new row is inserted every 15 seconds.  And similarly, there is a column in the table for every counter being collected.  Besides concerns with saving performance data to SQL Server as it is being collected, this output format has the same issue of having to re-code solutions when the counters being collected change.  Like the previous two output formats, this output format can be a huge pain.

The last output format from perfmon is the binary format.  This format contains a binary object for every point in time at which counters are collected.  Thus, if the collection is every 15 seconds, a new binary object is created for each of those time periods.  The biggest issue with this output format is that it is binary, meaning you can’t just open the file and read it.  It can be converted to the comma- or tab-separated formats (for example, with the relog.exe utility), but that brings back the issues with those formats.

Each of these formats, then, has issues that make it more of a challenge than it’s worth to consume the data in its native form.  What’s needed is a comma- or tab-separated format that contains name-value pairs.  This would allow the counters collected to change at a whim without the need to rewrite the entire process to accommodate a new column.  Creating that output format is what the rest of this post is about.
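The reshaping described here is essentially a wide-to-long pivot.  As a quick, language-agnostic illustration of the idea, here is a minimal Python sketch that turns a made-up wide perfmon-style CSV snippet into name-value rows; the server name, counter paths, and values are all invented for the example:

```python
import csv
import io

# A made-up two-sample file in perfmon's wide CSV layout:
# one timestamp column followed by one column per counter.
wide = io.StringIO(
    '"Timestamp","\\\\SERVER\\Processor(_Total)\\% Processor Time","\\\\SERVER\\Memory\\Available MBytes"\n'
    '"01/01/2024 10:00:00","12.5","4096"\n'
    '"01/01/2024 10:00:15","14.0","4080"\n'
)

pairs = []
for row in csv.DictReader(wide):
    ts = row["Timestamp"]
    for col, value in row.items():
        if col == "Timestamp":
            continue
        # One name-value row per counter per sample: adding a new
        # counter later just adds rows, never new columns.
        pairs.append({"timestamp": ts, "path": col, "value": value})

for p in pairs:
    print(p["timestamp"], p["path"], p["value"], sep="|")
```

Two samples of two counters become four name-value rows; a consumer of this shape never has to be remapped when the counter list changes.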

Perfmon Name-Value Pairs

As I mentioned, we want to convert the binary format into name-value pairs for each time period collected.  To accomplish this, the PowerShell script needs to perform two main steps.  First, the script needs to loop through each of the binary objects inside the file.  Each point in time is its own object, so each will need to be parsed.  Second, within each binary object, the script needs to iterate through each of the counters and create a row of output for each of these items.

Along with the main tasks, there are a few other things that will make the script easier to use and improve the usefulness of the output.  First, you will likely want to inject the name of the server where the counters were collected.  While it is a repeating value, it is sometimes easier to deal with the server name as data than as a parameter on an import.  Second, derive the name of the output file from the name of the input file, and check whether the output file exists before parsing the input.  Doing this should help reduce reprocessing files.  Lastly, you’ll want to export the values to the file as pipe-delimited values.  While tab- or comma-separated output is more common, pipe-delimiting the values decreases the likelihood that a counter value will interfere with the format.

Pulling all of these tasks together results in the script provided in Listing 1.  All you need to do to use it is change the input file name to point at a binary (.blg) perfmon file that you already have, and it will parse through the file.

#Listing 1 - Name-Value Pair Perfmon Script
$file = Get-Item "C:\temp\SQLPerf_000001.blg"
$serverName = "Unicorn42"

# Derive the output file name from the input file name
$newFile = "$($file.Directory.FullName)\$($file.BaseName).csv"

# Skip files that have already been converted
if (!(Test-Path $newFile)) {
    Import-Counter -Path $file.FullName |
        ForEach-Object { $_.CounterSamples } |
        Select-Object Timestamp, Path, InstanceName, CookedValue,
            @{Label = "Server"; Expression = { $serverName }} |
        Export-Csv -Path $newFile -Delimiter "|" -NoTypeInformation
}

After these performance counter files are converted, you can easily import the name-value pairs into SQL Server or Excel and start analyzing the results.  Since the number and format of the columns in the file won’t change, regardless of the counters collected, you’ll be able to build reusable solutions that read the data and aren’t overly dependent on having specific columns in a specific order.  While this isn’t the most complex PowerShell script in the world, it is one that I’ve found quite useful.
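For example, once the pipe-delimited file exists, analyzing it needs no knowledge of which counters were collected.  A minimal Python sketch, assuming the column names match what Listing 1’s Select-Object emits (Timestamp, Path, InstanceName, CookedValue, Server) and using invented sample rows:

```python
import csv
import io
from collections import defaultdict

# A made-up sample in the pipe-delimited shape Listing 1 writes;
# the counter path and values are invented for illustration.
sample = io.StringIO(
    'Timestamp|Path|InstanceName|CookedValue|Server\n'
    '01/01/2024 10:00:00|\\\\unicorn42\\processor(_total)\\% processor time|_total|12.5|Unicorn42\n'
    '01/01/2024 10:00:15|\\\\unicorn42\\processor(_total)\\% processor time|_total|14.5|Unicorn42\n'
)

# Group cooked values by counter path; new counters simply
# become new groups, with no code changes required.
values_by_path = defaultdict(list)
for row in csv.DictReader(sample, delimiter='|'):
    values_by_path[row['Path']].append(float(row['CookedValue']))

averages = {path: sum(v) / len(v) for path, v in values_by_path.items()}
for path, avg in averages.items():
    print(path, avg)
```

The same grouping works unchanged whether the file holds two counters or two hundred, which is the whole point of the name-value shape.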

How do you deal with the challenge of handling the formats available from Performance Monitor?

