
T-SQL Tuesday #110 – Automate All the Things


I had a strange thing happen to me this week. The IFTTT applet that emails me when a new T-SQL Tuesday announcement post is published didn’t send that email until yesterday. Normally, IFTTT sends the email the day after the announcement post goes up, so since it didn’t work as intended this time around, I’ve only been able to think about this topic for a short while. That applet is just one example of how I try to “Automate All the Things” that I can.

In this month’s T-SQL Tuesday, Garry Bargsley ( blog | twitter ) wants to talk about “Automating All the Things”. Specifically, he asks us to respond to two questions:

  • What do you want to automate or what automation are you proud of completing?
  • What is your go-to technology for automation?

My go-to technology

In my opening paragraph, I mentioned using IFTTT to send me an email. Using a cloud-based technology is certainly one way of doing things; however, I like to use whatever option (or options) makes the most sense for what I’m doing. In that case, an IFTTT applet that checks for changes and sends me an alert made the most sense.

However, that is not the technology I use in my day-to-day automating. As a SQL Server consultant, my go-to technology is T-SQL: I write scripts to perform whatever automation I need, and I frequently mix in PowerShell as well.

What do you want to automate or what automation are you proud of completing?

There are two automation processes that I want to talk about today. The first is a script that is still in progress. This script analyzes the indexes in a database, looking for duplicate or overlapping indexes; when indexes are duplicates, one of them can be removed without sacrificing performance. The script also lets me find similar indexes that could be combined into one. Any time you can get rid of an index, you reduce the overhead that Data Manipulation Language (DML) statements incur against it, making those statements perform better.
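While I can’t share the work-in-progress script itself, a minimal sketch of the general idea looks like this (assuming SQL Server 2017+ for STRING_AGG): build each index’s ordered key-column list from the catalog views, then compare indexes on the same table whose lists match.

-- Minimal sketch of the approach (not the actual script): build each index's
-- ordered key-column list, then flag indexes on the same table whose lists
-- are identical. Assumes SQL Server 2017+ for STRING_AGG.
WITH IndexKeys AS
(
    SELECT  i.object_id,
            i.index_id,
            i.name AS IndexName,
            STRING_AGG(c.name, ',') WITHIN GROUP (ORDER BY ic.key_ordinal) AS KeyColumns
    FROM    sys.indexes i
    JOIN    sys.index_columns ic ON  ic.object_id = i.object_id
                                 AND ic.index_id  = i.index_id
    JOIN    sys.columns c        ON  c.object_id  = ic.object_id
                                 AND c.column_id  = ic.column_id
    WHERE   i.type > 0                  -- skip heaps
    AND     ic.is_included_column = 0   -- key columns only
    GROUP BY i.object_id, i.index_id, i.name
)
SELECT  OBJECT_SCHEMA_NAME(k1.object_id) AS SchemaName,
        OBJECT_NAME(k1.object_id)        AS TableName,
        k1.IndexName                     AS Index1,
        k2.IndexName                     AS Index2,
        k1.KeyColumns
FROM    IndexKeys k1
JOIN    IndexKeys k2 ON  k1.object_id = k2.object_id
                     AND k1.index_id  < k2.index_id
                     AND k1.KeyColumns = k2.KeyColumns  -- exact duplicates
ORDER BY SchemaName, TableName;

Overlapping indexes (where one key list is just a leading subset of another) can be caught with a LIKE comparison instead of the equality, and included columns need the same treatment; the real script handles quite a bit more than this sketch does.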

Planned improvements for this script include factoring in the missing index recommendations and utilizing the index usage statistics.
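As a rough illustration of where that information comes from (the queries below are just the raw DMV sources, not the planned logic):

-- Illustration only: the two DMV sources those improvements would draw from.
-- 1) How much each index is read versus written since the last restart:
SELECT  OBJECT_NAME(us.object_id)                        AS TableName,
        i.name                                           AS IndexName,
        us.user_seeks + us.user_scans + us.user_lookups  AS Reads,
        us.user_updates                                  AS Writes
FROM    sys.dm_db_index_usage_stats us
JOIN    sys.indexes i ON  i.object_id = us.object_id
                      AND i.index_id  = us.index_id
WHERE   us.database_id = DB_ID();

-- 2) The optimizer's missing index recommendations for the current database:
SELECT  mid.statement                 AS TableName,
        mid.equality_columns,
        mid.inequality_columns,
        mid.included_columns,
        migs.user_seeks,
        migs.avg_user_impact
FROM    sys.dm_db_missing_index_details mid
JOIN    sys.dm_db_missing_index_groups mig
        ON mig.index_handle = mid.index_handle
JOIN    sys.dm_db_missing_index_group_stats migs
        ON migs.group_handle = mig.index_group_handle
WHERE   mid.database_id = DB_ID();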

The other process that I want to talk about is my “Deadlock Analysis” script. This script extracts information from the deadlock graph and presents it in a tabular format. Having all of this information in a table gives a much better, faster view of what is going on with specific deadlocks. One of the things that I really like is how many people use this script: several MCMs / MVPs have told me that they use it, and I know that SentryOne mentioned it during a past PASS Summit.
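My script pulls its deadlock graphs from a monitoring tool, but the core idea can be sketched against the built-in system_health Extended Events session: find the xml_deadlock_report events and shred a few pieces of each graph into columns. The sketch below only sees whatever is still in the ring buffer, and the column list is purely illustrative.

-- Sketch of the idea only (not the actual Deadlock Analysis script):
-- shred xml_deadlock_report events from the system_health ring buffer.
WITH DeadlockEvents AS
(
    SELECT  CAST(xet.target_data AS xml) AS TargetData
    FROM    sys.dm_xe_session_targets xet
    JOIN    sys.dm_xe_sessions xes ON xes.address = xet.event_session_address
    WHERE   xes.name = 'system_health'
    AND     xet.target_name = 'ring_buffer'
)
SELECT  evt.value('@timestamp', 'datetimeoffset')                          AS EventTimeUTC,
        prc.value('@spid', 'int')                                          AS SessionId,
        prc.value('@isolationlevel', 'varchar(50)')                        AS IsolationLevel,
        prc.value('(executionStack/frame/@procname)[1]', 'nvarchar(256)')  AS ProcedureName,
        prc.value('(inputbuf/text())[1]', 'nvarchar(max)')                 AS InputBuffer
FROM    DeadlockEvents de
CROSS APPLY de.TargetData.nodes
            ('RingBufferTarget/event[@name="xml_deadlock_report"]') AS e(evt)
CROSS APPLY evt.nodes('.//process-list/process') AS p(prc);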

The script in my code library is a bit dated now, but I’ve been working on improvements to it. Upcoming features include:

  • The ability to acquire the deadlocks from other monitoring tools (not just SentryOne), such as Idera’s SQL Diagnostic Manager and Redgate’s SQL Monitor.
  • Storing the results in a #temporary table, allowing aggregations to be run against them (see the sample aggregation after this list).
  • A variety of aggregations to present the information.
  • A PowerShell step that exports the aggregated data into an Excel spreadsheet and emails the spreadsheet out for review.
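To give a flavor of that, here is a hypothetical aggregation; the #Deadlocks table and its columns are stand-ins for whatever the real script populates, not its actual names.

-- Hypothetical example: #Deadlocks stands in for the temp table the script
-- populates. This shows which procedures appear in the most deadlocks.
SELECT  d.ProcedureName,
        COUNT(*)                                            AS DeadlockCount,
        CAST(100.0 * COUNT(*) / SUM(COUNT(*)) OVER ()
             AS decimal(5, 2))                              AS PercentOfTotal,
        MIN(d.EventTimeUTC)                                 AS FirstSeen,
        MAX(d.EventTimeUTC)                                 AS LastSeen
FROM    #Deadlocks d
GROUP BY d.ProcedureName
ORDER BY DeadlockCount DESC;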

How Automation has helped me

Here’s an example of how this script has helped me. At a recent client, I was able to identify that one stored procedure was causing 95% of the deadlocks that had occurred in the past week (over 1,500 of them). Analyzing this stored procedure, I identified that a parameter was defined as an nvarchar, while the underlying column’s data type was a varchar. Because of this type mismatch, an implicit conversion was being performed, converting the column to an nvarchar. This made the existing index useless, and the query was scanning the entire table. Simply removing one character from this stored procedure (changing nvarchar to varchar) allowed the query to use focused index seeks instead, and this change immediately eliminated all the deadlocks being caused by this procedure.
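The procedure, table, and parameter names below are made up, but they show the pattern and the one-character fix (CREATE OR ALTER assumes SQL Server 2016 SP1 or later):

-- Hypothetical names throughout; only the data types matter here.
-- Problem: the parameter is nvarchar while dbo.Orders.CustomerCode is varchar,
-- so the column is implicitly converted to nvarchar and the index gets scanned.
CREATE OR ALTER PROCEDURE dbo.GetOrdersByCustomer
    @CustomerCode nvarchar(20)              -- mismatched: the column is varchar(20)
AS
SELECT  OrderId, OrderDate
FROM    dbo.Orders
WHERE   CustomerCode = @CustomerCode;       -- implicit conversion => scan
GO

-- Fix: remove one character (nvarchar -> varchar) so the types match and the
-- existing index on CustomerCode can be seeked.
CREATE OR ALTER PROCEDURE dbo.GetOrdersByCustomer
    @CustomerCode varchar(20)
AS
SELECT  OrderId, OrderDate
FROM    dbo.Orders
WHERE   CustomerCode = @CustomerCode;       -- index seek
GO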

Without this script, I would have dealt with the deadlocks in a more or less random order. Given the volume of deadlocks, I’m sure that I would have run into this procedure quickly; still, running this script with the aggregations allowed me to focus on the most troublesome issue immediately. If I had needed to do all of this manually, it would have been a huge expenditure of time.

Summary

From a DBA’s perspective, automation is a necessary skill. This post shows just a few of the areas where I use automation in my life.

If you use my Deadlock Analysis script, let me know if you have any suggestions on how to enhance it.

The post T-SQL Tuesday #110 – Automate All the Things appeared first on Wayne Sheffield.
