Collecting diagnostic data from multiple SQL Server instances with dbatools


Keeping SQL Server instances under control is a crucial part of the job of a DBA. SQL Server offers a wide variety of DMVs to query in order to check the health of an instance and establish a performance baseline.

My favourite DMV queries are the ones crafted and maintained by Glenn Berry: the SQL Server Diagnostic Queries. These queries already pack the right amount of information and can be used to take a snapshot of the instance's health and performance.

Piping the results of these queries to a set of tables at regular intervals can be a good way to keep an eye on the instance. Automation in SQL Server rhymes with dbatools, so today I will show you how to automate the execution of the diagnostic queries and the storage of the results in a centralized database that you can use as a repository for your whole SQL Server estate.

The script

The script I'm using for this can be found on GitHub: you can download it, modify it, and adapt it to your needs.

I won't include it here; there is really no need for that, as you can find it on GitHub already. So go grab it from this address, save it, and open it in your favourite code editor.

Done? Excellent! Let’s go through it together.

The script, explained

What I really love about PowerShell is how simple it is to filter, extend and manipulate tabular data using the pipeline, in a way that resonates a lot with the experience of T-SQL developers.

The main part of the script is the one that invokes all the diagnostic queries included in the list $queries. This is done with the cmdlet Invoke-DbaDiagnosticQuery, which takes care of using a version of the diagnostic query that matches the version of the target server and selecting the data. As usual with dbatools, the -SqlInstance parameter accepts a list of servers, so you can pass in the list of all the SQL Servers in your infrastructure.

Invoke-DbaDiagnosticQuery -SqlInstance $SourceServer -QueryName $queries
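To make the multi-server aspect concrete, here is a minimal sketch; the instance names below are hypothetical placeholders, not part of the script:

```powershell
# Sketch: -SqlInstance takes a list, so one call can cover several instances.
# Instance names here are hypothetical placeholders.
$SourceServer = 'SQL01', 'SQL02\INST1', 'SQL03'
Invoke-DbaDiagnosticQuery -SqlInstance $SourceServer -QueryName $queries
```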

Sometimes the queries do not generate any data, so it is important to filter out the empty result sets.

Where-Object { $_.Result -ne $null } 

In order to store the data collected from multiple servers at multiple points in time, you need to attach some additional columns to the result sets before writing them to the destination tables. This is a very simple task in PowerShell and it can be accomplished by using the Select-Object cmdlet.

Select-Object accepts a list of columns taken from the input object and can also add calculated columns using hashtables with label/expression pairs. The syntax is not the friendliest possible (in fact, I have to look it up every time I need it), but it gets the job done.
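If you have never used calculated properties, this standalone one-liner (unrelated to the script itself) shows the label/expression shape on its own:

```powershell
# A calculated column via a label/expression hashtable: the Expression
# script block runs once per pipeline object ($_).
Get-Process |
    Select-Object -First 3 -Property Name,
        @{ Label = 'WorkingSetMB'; Expression = { [math]::Round($_.WorkingSet64 / 1MB, 1) } }
```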

In this case, you need to add a column for the server name, one for the database name (only for database-scoped queries) and one for the snapshot id. I decided to use a timestamp in the yyyyMMdd format as the snapshot id. This is what the code to define the properties looks like:

        $TableName = $_.Name
        $DatabaseName = $_.Database
        $ServerName = $_.SqlInstance
        $snapshotProp = @{
            Label = "snapshot_id"
            Expression = {$SnapshotId}
        }
        $serverProp = @{
            Label = "Server Name"
            Expression = {$ServerName}
        }
        $databaseProp = @{
            Label = "Database Name"
            Expression = {$DatabaseName}
        }
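For completeness, the snapshot id itself can be produced like this (a sketch; the script is assumed to set $SnapshotId once, before the loop):

```powershell
# yyyyMMdd timestamp used as the snapshot id, e.g. 20240115
$SnapshotId = (Get-Date).ToString('yyyyMMdd')
```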

Now that the hashtables that define the additional properties are ready, you need to decide whether the input dataset requires the new properties or not: if a property with the same name is already present, you need to skip adding the new property.

Unfortunately, this has to be done in two different ways, because the dataset produced by the diagnostic queries could be returned as a collection of System.Data.DataRow objects or as a collection of PSCustomObject.

        # DataRow: columns surface through PSObject.Properties
        if(-not (($_.Result.PSObject.Properties | Select-Object -Expand Name) -contains "Server Name")) {
            # PSCustomObject: columns surface as NoteProperty members
            if(($_.Result | Get-Member -MemberType NoteProperty -Name "Server Name" | Measure-Object).Count -eq 0) {
                $expr += ' $serverProp, '
            }
        }
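If the reason for the double check is not obvious, this standalone sketch shows how the two shapes expose their columns differently:

```powershell
# PSCustomObject: columns are NoteProperty members, so Get-Member finds them.
$obj = [pscustomobject]@{ 'Server Name' = 'SQL01' }
($obj | Get-Member -MemberType NoteProperty -Name 'Server Name' | Measure-Object).Count   # 1

# DataRow: columns are adapted properties, visible through PSObject.Properties
# but not as NoteProperty members.
$table = New-Object System.Data.DataTable
[void]$table.Columns.Add('Server Name')
[void]$table.Rows.Add('SQL01')
($table.Rows[0].PSObject.Properties | Select-Object -ExpandProperty Name) -contains 'Server Name'   # True
```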

Now comes the interesting part of the script: the data has to get written to a destination table in a database. Dbatools has a cmdlet for that, called Write-DbaDataTable.

Among the abilities of this nifty cmdlet, it can auto-create the destination tables based on the data found in the input object, thus making your life much easier. In order to pass all the parameters to this cmdlet, I will use a splat, which improves readability quite a bit.

        $expr += '*'
        $param = @{
            SqlInstance     = $DestinationServer
            Database        = $DestinationDatabase
            Schema          = $DestinationSchema
            AutoCreateTable = $true
            Table           = $TableName
            InputObject     = Invoke-Expression $expr
        }
        Write-DbaDataTable @param

As you can see, you need to pass a destination server name, a database name, a schema name and a table name. As I already mentioned, Write-DbaDataTable will take care of creating the target table.

One thing to note is how the data is passed to the cmdlet: the InputObject is the result of an expression, based on the dynamic select list generated inside the ForEach-Object cmdlet. This is very similar to building a dynamic query in T-SQL.
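To visualize it, here is a rough reconstruction of how the expression string is assembled inside the loop; variable names mirror the script, but treat this as a sketch, not the exact code:

```powershell
# Inside ForEach-Object, the select list is built as a string...
$expr = '$_.Result | Select-Object '
$expr += ' $snapshotProp, '   # each property appended only when the column is missing
$expr += ' $serverProp, '
$expr += '*'
# ...and Invoke-Expression evaluates it, much like EXEC(@sql) runs
# a dynamically built statement in T-SQL.
$InputObject = Invoke-Expression $expr
```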

Conclusion

This script can be downloaded from GitHub and you can schedule it on a centralized management server in order to collect diagnostic data across your entire SQL Server estate.

Dbatools is the ultimate toolset for the DBA: if you're still using the GUI or overly complicated T-SQL scripts to administer and maintain your SQL Server estate, you're missing out.

Dbatools is also a great opportunity for me to learn new tricks in PowerShell, which is another great productivity tool that can't be overlooked by DBAs. What are you waiting for? Go to dbatools.io now and start your journey: you won't regret it.

