
PowerShell Tool Time: Controlling Our Tools


Welcome back for another edition of PowerShell Tool Time. In the last article, I provided a basic introduction to PowerShell tools and the framework for writing them: functions. In this article, I want to go into more detail about how we can define our functions to suit our needs, specifically around controlling our inputs.

Specific Sizes

Whatever your handyman project might be, from hanging picture frames to remodeling your bathroom, you will have tools that can only operate within certain size guidelines. Maybe it is a drill with a selection of drill bits or a ratchet set with specific sockets. Whatever the case, the use of your tool is controlled by the attachments you can use with it.

We achieve the same capability in PowerShell by properly defining our function parameters. PowerShell gives us a breadth of options for parameters, allowing us to control how each parameter can be used. For complete details on these options, review 'Get-Help about_Functions_Advanced_Parameters' in the PowerShell help system. For our purposes, we are specifically going to look at parameter defaults and how we can use ValidateSet to control the inputs.

Managing Backup Files

Working with our backup files is a common task for most data professionals. As we saw in the last article when we reviewed disk space, it is common to address space shortages by removing old backup files. Of course, this can be a tedious task if one has to sift through directories to find the correct files, analyze them for age or size, and then ultimately remove them. Collecting this kind of information is a task at which PowerShell excels.

If I want to construct a helper function to gather a collection of these files, I need to have the user input a few items so the function knows what to collect:

  • The top directory to search under.

  • What backup files I want to collect (fulls, differentials, or logs).

  • Do I want files older than a certain number of hours, or only files more recent than that?

We can express this as the following parameter block:

  param([Parameter(Mandatory=$true)][string]$Path
  ,[ValidateSet('Full','Differential','Log')]$Type
  ,[int] $OlderThanHours=0
  ,[int] $EarlierThanHours=0)

Note that the first parameter has the [Parameter(Mandatory=$true)] declaration. By using the [Parameter()] declaration, we can specify certain attributes of our parameter, in this case that it is required. This means that if someone calls our function and does not supply a Path parameter, the function will prompt for one.

Next up is the [ValidateSet('Full','Differential','Log')] declaration. [ValidateSet()] is one of several declarations that allow us to restrict the inputs a parameter can accept. In this case, the Type parameter can only have the values Full, Differential, or Log, meaning we can restrict the values we want our helper function to search on. The bonus is that PowerShell's IntelliSense will recognize the ValidateSet and allow us to tab complete with these values. This degree of control can be extremely useful for managing how our code is used.

Finally, by declaring that a parameter is equal to a value, as with OlderThanHours=0 and EarlierThanHours=0, we can set defaults and make these parameters optional in the call. This helps us control the inputs by setting an initial value, allowing us to assume that we will always have something to work with. Using defaults is an effective way to manage how our function is used.
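To see all three techniques working together, here is a minimal sketch. Get-BackupExtension is a hypothetical helper used only for this demonstration, not part of the article's toolset:

```powershell
# Demonstrates a mandatory parameter, ValidateSet, and a default value
# in one small function (Get-BackupExtension is hypothetical).
function Get-BackupExtension {
    param(
        [Parameter(Mandatory=$true)][string]$Path,
        [ValidateSet('Full','Differential','Log')][string]$Type = 'Full'
    )
    # Map the backup type to a file extension, as the larger function
    # later in the article does.
    switch ($Type) {
        'Full'         { '.bak' }
        'Differential' { '.dff' }
        'Log'          { '.trn' }
    }
}

Get-BackupExtension -Path 'C:\Backups' -Type Log    # returns .trn
Get-BackupExtension -Path 'C:\Backups'              # Type defaults to Full, returns .bak
# Get-BackupExtension -Path 'C:\Backups' -Type Copy # rejected by ValidateSet
```

Omitting -Path entirely would cause PowerShell to prompt for it, since it is marked mandatory.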

We can then flesh out the function with our file search logic:

  function Get-SQLBackups{
     param([Parameter(Mandatory=$true)][string]$Path
         ,[ValidateSet('Full','Differential','Log')]$Type
         ,[int] $OlderThanHours=0
         ,[int] $EarlierThanHours=0)
     if(Test-Path $Path){
         # Map the backup type to its file extension
         $extension = switch($Type){
             'Full'         {'.bak'}
             'Differential' {'.dff'}
             'Log'          {'.trn'}
         }
         if($OlderThanHours -gt 0){
             $files = Get-ChildItem $Path -Filter "*$extension" -Recurse | Where-Object {$_.LastWriteTime -lt (Get-Date).AddHours(-$OlderThanHours)}
         }
         elseif($EarlierThanHours -gt 0){
             $files = Get-ChildItem $Path -Filter "*$extension" -Recurse | Where-Object {$_.LastWriteTime -gt (Get-Date).AddHours(-$EarlierThanHours)}
         }
         else{
             $files = Get-ChildItem $Path -Filter "*$extension" -Recurse
         }
         return $files
     }
     else{
         Write-Warning "'$Path' is not a valid path."
     }
  }

The function is not much more than a wrapper for Get-ChildItem (aka 'dir' or 'ls'). However, with this block of logic we can save ourselves a lot of effort and create consistency in how we search for our files. Notice how we also use nested IF statements to control and restrict what the function does, so that its behavior is more predictable. These are key points for any automated process, which is what we are doing by building these tools.
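The heart of the age filter is a simple LastWriteTime comparison. Here is that logic in isolation, run against synthetic objects with hypothetical file names rather than real files on disk:

```powershell
# Synthetic objects mimicking the LastWriteTime property that
# Get-ChildItem returns (file names are hypothetical).
$now = Get-Date
$files = @(
    [pscustomobject]@{ Name = 'db_log_1.trn'; LastWriteTime = $now.AddHours(-30) }
    [pscustomobject]@{ Name = 'db_log_2.trn'; LastWriteTime = $now.AddHours(-2) }
)

# Same filter the function uses for -OlderThanHours
$OlderThanHours = 24
$old = $files | Where-Object { $_.LastWriteTime -lt (Get-Date).AddHours(-$OlderThanHours) }
$old.Name    # db_log_1.trn - written 30 hours ago, older than the 24-hour cutoff
```

Flipping -lt to -gt gives the -EarlierThanHours behavior: only files written within the last N hours pass the filter.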

When we execute this function, we receive a list of files matching the extension passed in, of the age matching the filter, under the folder we passed in. Here are some sample results:

Practical Use

What can we use this function for? Well, we already discussed the need to clean up old files. By calling this function, we can easily find all our old backups:

  Get-SQLBackups -Path C:\DBFiles\backups -Type Log -OlderThanHours 24

What is great about this is that with a simple pipeline call, we can then remove those files:

  Get-SQLBackups -Path C:\DBFiles\backups -Type Log -OlderThanHours 24 | Remove-Item
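Because Remove-Item is destructive, it is worth previewing a deletion before committing to it. Remove-Item supports the standard -WhatIf switch, so we can append it to the pipeline above for a dry run. Here is a self-contained sketch using a temporary directory and a hypothetical file name:

```powershell
# Create a temporary directory with one hypothetical backup file.
$dir = Join-Path ([IO.Path]::GetTempPath()) 'whatif-demo'
New-Item -ItemType Directory -Path $dir -Force | Out-Null
New-Item -ItemType File -Path (Join-Path $dir 'old.trn') -Force | Out-Null

# -WhatIf reports what would be deleted but removes nothing.
Get-ChildItem $dir -Filter '*.trn' | Remove-Item -WhatIf

Test-Path (Join-Path $dir 'old.trn')   # True - the file survived the dry run
```

Once the preview looks right, drop -WhatIf to perform the actual deletion.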

We can use this function in other situations as well. One example is restoring a series of log files to a database for point-in-time recovery. While there are many ways to script this out, this function gives us a way to handle it by leveraging the pipeline:

  "--A Simple TLog Restore" | Out-File C:\TEMP\RestoreDemo.sql
  $tlogs=Get-SQLBackups -Path ‘C:\DBFiles\backups’ -Type Log -EarlierThanHours 24 | Sort-Object LastWriteTime
  foreach($tlog in $tlogs){
     "RESTORE LOG [RestoreDemo] FROM DISK=N'$($tlog.name)' WITH NORECOVERY;" | Out-File C:\TEMP\RestoreDemo.sql -Append
  }

Let’s walk through what’s going on here. The initial line is just making a nice header for our restore .sql file. The next line is where we call our function to get all our log backups in the last 24 hours, which we then sort by LastWriteTime. The next section loops through each object in the output from our function and writes it as a line to our .sql file. The result looks something like this:
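With hypothetical log file names, the generated RestoreDemo.sql would look something like this:

```sql
--A Simple TLog Restore
RESTORE LOG [RestoreDemo] FROM DISK=N'C:\DBFiles\backups\RestoreDemo_log_1.trn' WITH NORECOVERY;
RESTORE LOG [RestoreDemo] FROM DISK=N'C:\DBFiles\backups\RestoreDemo_log_2.trn' WITH NORECOVERY;
```

Because the logs were sorted by LastWriteTime, the RESTORE statements come out in the correct order for the log chain.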

It becomes a simple matter to take our helper function and start to extend it beyond the initial functionality, giving us powerful tools to help us manage our environments with consistent ease.

Drill Bits and Ratchet Sockets

Controlling our parameters is a useful and effective way to manage the use of our tools. Just as a handyman knows which drill bit to choose to produce a specific size of hole, we can ensure our functions will execute the right task when we call them. Building our functions in a generic way ensures we are automating correctly and consistently, which is the core of scripting any task.

Thank you for reading, and I hope you will return for the next article in this series. There have been a lot of questions about how to package these functions and reuse them. In the next article, we will review the PowerShell profile and how it can be used to load your functions into your PowerShell sessions. Also, I've created a GitHub repository for all the code in this series. Please feel free to download and make use of this code as well as provide feedback. You can find the repository at https://github.com/MikeFal/PowershellToolbox-SSC.
