
Moving Indexes with Powershell and SMO


The Preamble

My mother-in-law was here visiting last week. She is "up there" in years and not particularly tech-savvy (and by that I mean she considers her microwave complicated). For the past nine years or so, she has consistently brought up the fact that my job as a DBA/Architect/Developer could get outsourced at any time. Convincing her that my job is secure was similar to convincing developers that using stored procedures is in their best interest.

Anyway, my theory on staying gainfully employed is to always get better at your job and bring more skills to the table each year. My guess is that if you are reading this, you probably believe that as well (or you need a quick fix). One approach I take is to become better at other disciplines like storage, virtualization, and administration. One language that will carry across those disciplines for the foreseeable future is Powershell. It makes it easy to automate tasks and data collection with little effort, and it uses the same cmdlets across disciplines.

The What

This article focuses on moving all the non-clustered indexes located in the PRIMARY filegroup to the SECONDARY filegroup. You might do this to balance IO across LUNs on the SAN, or because you are short on disk space. A log file is created that stores all the T-SQL generated by the script, so you have a trail of what was changed. In addition, an error log is generated to track any issues with the indexes being moved.

The Why

In order to move indexes across a number of servers, each with a number of databases and hundreds of tables per instance, you would likely turn to OSQL or a scripting language to take care of it. Being most comfortable with T-SQL, I started solving this problem by generating DDL from the DMVs. That particular solution turned out to be extremely complex and cumbersome once I started adding in filtered indexes and other properties. Once you see what Powershell and SMO bring to the "table" and how quickly this basic script can be modified to serve other needs, you may want to buy me a beer in Seattle in November 🙂

As a great real-world example, as I was writing this, Glenn "DMV a day" Berry wrote about a problem with mother-in-law statistics (I mean old, outdated statistics). The indexes were built with the NORECOMPUTE setting on (don't auto-update statistics) and needed to be rebuilt with the proper setting (in SMO it's "NoAutomaticRecomputation"). The script below can tackle something like that quite easily when you have multiple target databases or servers; a sketch of that variation appears after the main script.

The How

To get started, let's load the SMO assembly and create a ScriptingOptions object, as well as declare the two log files: one for the T-SQL generated ($logFile) and one for errors ($errorLogFile).

[Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo")
$scrp = New-Object Microsoft.SqlServer.Management.Smo.ScriptingOptions
$scrp.IncludeDatabaseContext = $true
$dt = Get-Date
$logFile = "C:\Powershell\Archives\IndexMovementScript_" + $dt.Year.ToString() + $dt.Month.ToString() + $dt.Day.ToString() + $dt.Hour.ToString() + $dt.Minute.ToString() + ".sql"
$errorLogFile = "C:\Powershell\Archives\IndexMovementScriptFailures_" + $dt.Year.ToString() + $dt.Month.ToString() + $dt.Day.ToString() + $dt.Hour.ToString() + $dt.Minute.ToString() + ".sql"

Now, we declare the instance we are working with (change Server\Instance to your SQL Server) and load it up in SMO. We then get a collection of the user databases ($_.IsSystemObject -eq $false) that are not read-only ($_.ReadOnly -eq $false). To make this a multi-instance script, simply read in a collection of servers and add another foreach loop (see the sketch after the next code block).

 

SMO point: to see the plethora of properties exposed by SMO, simply type $sqlServer after the variable is set to list the instance properties (or do the same with $userDbs for database properties).

$sqlInstanceName = "Server\Instance"

#Get the SMO SQL Object
$sqlServer = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server -ArgumentList "$sqlInstanceName"

#Get the user databases that are not read only
$userDbs = $sqlServer.Databases | Where-Object {$_.IsSystemObject -eq $false -and $_.ReadOnly -eq $false}
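To sketch the multi-instance variation mentioned above (this is an assumption about how you might feed the script a server list, not part of the original), you could read instance names from a text file, here a hypothetical C:\Powershell\servers.txt, and wrap the rest of the work in another foreach:

#Hypothetical list of instances, one name per line
$instanceNames = Get-Content "C:\Powershell\servers.txt"
foreach($sqlInstanceName in $instanceNames)
{
    $sqlServer = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server -ArgumentList "$sqlInstanceName"
    $userDbs = $sqlServer.Databases | Where-Object {$_.IsSystemObject -eq $false -and $_.ReadOnly -eq $false}

    #...the per-database foreach loop shown below goes here...
}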

I use a SQL function that is based on something I saw in Chad Miller's PASS AppDev VC "ETL with PowerShell" presentation; I just took out a few parameters I did not want to pass in. This function takes the SQL instance and the query you want to run against that instance as arguments, and returns the results as a data table.

function Invoke-Sqlcmd3
{
    param(
        [string]$ServerInstance,
        [string]$Query
    )
    $QueryTimeout = 30
    $conn = New-Object System.Data.SqlClient.SQLConnection
    $constring = "Server=" + $ServerInstance + ";Integrated Security=True"
    $conn.ConnectionString = $constring
    $conn.Open()
    if($conn){
        $cmd = New-Object System.Data.SqlClient.SqlCommand($Query,$conn)
        $cmd.CommandTimeout = $QueryTimeout
        $ds = New-Object System.Data.DataSet
        $da = New-Object System.Data.SqlClient.SqlDataAdapter($cmd)
        [void]$da.Fill($ds)
        $conn.Close()
        $ds.Tables[0]
    }
}
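As a quick usage example (the instance name and query here are just placeholders), the function returns the first result set of whatever query you hand it:

#Example call: list the user databases on an instance
$rows = Invoke-Sqlcmd3 "Server\Instance" "SELECT name FROM sys.databases WHERE database_id > 4"
$rows | Select-Object name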

 

At this point, we loop through each database in $userDbs and verify that it contains a SECONDARY filegroup. The next line loops through each table in the database that has an index ($_.HasIndex -eq $true) and is not a system table ($_.IsSystemObject -eq $false), returning the table name and index collection. Finally, we loop through each non-clustered index on the PRIMARY filegroup and create the DDL to drop and recreate the index with the FILEGROUP property changed.

foreach($currentDB in $userDbs)
{
    if($currentDB.FileGroups.Contains("SECONDARY"))
    {
        foreach($tb in $currentDB.Tables | Where-Object {$_.HasIndex -eq $true -and $_.IsSystemObject -eq $false} | Select-Object Name,Indexes)
        {
            foreach($ind in $tb.Indexes | Where-Object {$_.IsClustered -eq $false -and $_.IsXmlIndex -eq $false})
            {
                if($ind.FileGroup -eq "PRIMARY")
                {
                    try{
                        #Setup the DROP Index stmt
                        $scrp.IncludeIfNotExists = $true
                        $scrp.ScriptDrops = $true
                        $sql = $ind.Script($scrp)

                        #Setup the CREATE Index stmt on the new filegroup
                        $ind.FileGroup = "SECONDARY" # New filegroup
                        $scrp.IncludeIfNotExists = $false
                        $scrp.DriAll = $true
                        $scrp.ScriptDrops = $false

                        #Append the CREATE to the DROP statement
                        $sql += $ind.Script($scrp)

                        #Log the SQL to a file (named above with a datetime suffix)
                        $sql | Out-File $logFile -append

                        #Execute the SQL index move
                        Invoke-Sqlcmd3 $sqlServer.Name $sql
                    }
                    catch {
                        $errorMsg = "No File in FileGroup:" + $currentDB.Name + ":" + $ind.Name
                        $errorMsg | Out-File $errorLogFile -append
                    }
                } # index in PRIMARY filegroup
            } # for each index
        } # for each table
    }
}

This is pretty straightforward scripting with Powershell and SMO, but be careful: you could easily do some serious damage. To be safe, comment out the Invoke-Sqlcmd3 line while you are testing and review the log file instead. This script works with filtered indexes and all the other index properties I could imagine wanting to change.
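To illustrate adapting the loop to other properties, here is a minimal, untested sketch of the NORECOMPUTE scenario mentioned earlier; it assumes the same database and table loops as the main script and only swaps out the inner index loop:

#Variation (sketch): turn auto-recompute of statistics back on and rebuild the index
foreach($ind in $tb.Indexes | Where-Object {$_.NoAutomaticRecomputation -eq $true})
{
    $ind.NoAutomaticRecomputation = $false
    $ind.Rebuild()
    ("Rebuilt " + $tb.Name + "." + $ind.Name) | Out-File $logFile -append
}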

The Conclusion

Hopefully you learned how to utilize Powershell with SMO to quickly make changes to indexes. It has been a valuable tool at my organization for automation, monitoring, and doing mass SQL administrative updates as needed. If you are looking for Powershell knowledge, there are some great blogs, articles, and scripts out there. Chad Miller, Aaron Nelson, The Scripting Guys, et al. have great information, and I would also recommend signing up for Power Tips from Powershell.com.

 
