Loading Data With Powershell

By Chad Miller, (first published: 2008/12/16)

From PowerShell you will inevitably want to load command and script output into a database. Let's look at three methods you can use to load data into SQL Server from Powershell.

Getting Started

The examples in this article use the following Powershell script to extract disk space information. Save the Powershell code below to a file as Get-DiskUsage.ps1:

param ( [string]$ComputerName = "." )

Get-WmiObject -computername "$ComputerName" Win32_LogicalDisk -filter "DriveType=3" |
foreach { add-member -in $_ -membertype noteproperty UsageDT $((Get-Date).ToString("yyyy-MM-dd"))
          add-member -in $_ -membertype noteproperty SizeGB $([math]::round(($_.Size/1GB),2))
          add-member -in $_ -membertype noteproperty FreeGB $([math]::round(($_.FreeSpace/1GB),2))
          add-member -in $_ -membertype noteproperty PercentFree $([math]::round((([float]$_.FreeSpace/[float]$_.Size) * 100),2)) -passThru } |
Select UsageDT, SystemName, DeviceID, VolumeName, SizeGB, FreeGB, PercentFree



You'll need to create the following table in your database to store the output:

CREATE TABLE dbo.DiskUsage(
        UsageDT datetime NOT NULL,
        SystemName varchar(50) NOT NULL,
        DeviceID varchar(50) NOT NULL,
        VolumeName varchar(50) NULL,
        SizeGB numeric(18, 2) NOT NULL,
        FreeGB numeric(18, 2) NOT NULL,
        PercentFree numeric(18, 2) NOT NULL
);


Note: This article uses C:\Users\u00\bin as the file path, Z002 as the computer name, Z002\SQLEXPRESS as the SQL instance and dbautility as the database. You'll need to change each item to suit your environment. Also note that in Powershell ./ denotes the current directory; the examples presented assume both script and output are saved to the current directory from which you are executing the commands.

Method 1: Export-CSV/BULK INSERT

Powershell makes it very easy to create CSV files from any Powershell command or script by using the built-in cmdlet Export-CSV. For example, the following code executes the Get-DiskUsage.ps1 script against the computer named Z002 and exports the results to a CSV file; the -noTypeInformation switch omits the type information from the output:

./get-diskusage.ps1 Z002 | export-CSV -path ./diskusage.csv -noTypeInformation


Let's look at the contents of the diskusage.csv file. The output includes a header row with the property names and a row for each drive:

UsageDT,SystemName,DeviceID,VolumeName,SizeGB,FreeGB,PercentFree
2008-10-25,Z002,C:,LOCAL_120,111.79,24.95,22.32
 


The output of Export-CSV can almost be imported as-is into SQL Server using the T-SQL command BULK INSERT, with one minor edit: since a header row is included, you will need to remove the first line, which has the property names but not the actual data. Removing the first line lets you use BULK INSERT without specifying a format file to skip the header, and it is easy to do with Powershell. For example, the following command reads in the contents of the diskusage.csv file, skips the first line and writes the output back to the diskusage.csv file:

(Get-Content ./diskusage.csv) | where {$_.readcount -gt 1} | Set-Content ./diskusage.csv


The CSV file is ready to be imported into the SQL Server table created during setup. The following code uses the SQL Server command-line utility SQLCMD from PowerShell to execute the BULK INSERT command against the diskusage.csv file:

sqlcmd -S "Z002\SqlExpress" -E -Q "BULK INSERT dbautility.dbo.DiskUsage FROM 'C:\Users\u00\bin\diskusage.csv' 
  WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')"
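
As an alternative to rewriting the CSV file, BULK INSERT's FIRSTROW option can skip the header row at load time, so the Get-Content edit above becomes unnecessary. A sketch using the same instance, database and path as the example:

```powershell
# Skip the header row at load time with FIRSTROW = 2 instead of
# stripping the first line from the CSV file beforehand.
sqlcmd -S "Z002\SqlExpress" -E -Q "BULK INSERT dbautility.dbo.DiskUsage FROM 'C:\Users\u00\bin\diskusage.csv'
  WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2)"
```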


Pros: Export-CSV/BULK INSERT is the simplest method and uses natively available cmdlets and utilities. It is also the fastest method in terms of load speed when dealing with large data sets.

Cons: Requires file management when the solution may not require staging the data. Some editing of the CSV file produced by Export-CSV is required. Depending on the data you may need to perform additional edits, including removing extra quote marks around strings and converting Boolean values from True/False to 0 or 1 representations. Some data may contain embedded commas, which can wreak havoc on any load routine.
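
For instance, a rough cleanup pass along these lines could handle the quote marks and Boolean text before the bulk load. The file name here is hypothetical, and the blanket replacements are only safe when no legitimate data value contains quotes, commas or the literal words True/False:

```powershell
# Hypothetical cleanup pass over an exported CSV: drop the header row,
# strip double quote marks and convert True/False text to 1/0.
# WARNING: these blanket -replace calls are only safe when no real data
# value contains embedded quotes, commas or the words True/False.
(Get-Content ./export.csv) |
    where {$_.readcount -gt 1} |
    foreach { $_ -replace '"','' -replace 'True','1' -replace 'False','0' } |
    Set-Content ./export.csv
```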

Recommendation: Use this method if your load process will benefit from staging the data in CSV files, when you are dealing with large data sets, and when you know the data does not contain extra commas.

Method 2: XML/XMLBulkLoad

An additional setup task is required for this method: you'll need to download and install SQLXML (either 3.0 or 4.0 will work). SQLXML can be difficult to find on the Microsoft site; at the time of this writing the x86 SQLXML 4.0 SP1 install package is available for download. Note: SQLXML 4.0 is also included on the SQL Server 2005 installation CD, but is not installed by default.

The XML/SQLXMLBulkLoad method requires that we convert the output to XML and generate an XSD schema file with SQLXML annotations. Both of these tasks can be accomplished using the New-Xml and New-Xsd Powershell scripts. The Powershell Team blog posted a function called New-Xml in a blog entry entitled Using PowerShell to Generate XML Documents. Copy the function to a text file called XML.ps1, save it and source the function from PowerShell as follows:

. ./XML.ps1

Note: When you source a function you are simply loading the definition of the function into your current Powershell session, not executing it. The notation is dot, space, dot, forward slash. The New-Xsd script is available in the Resources section below; save the file as New-Xsd.ps1. Now you are ready to create XML and XSD files. The following creates an XML file of the disk space information called diskusage.xml:

./get-diskusage.ps1 Z002 | New-Xml -ItemTag DiskUsage -Attribute UsageDT,SystemName,DeviceID -ChildItems VolumeName,SizeGB,FreeGB,PercentFree > diskusage.xml


To create the XSD file you first need to assign the disk usage information to the variable $du as shown below:

$du = ./get-diskusage.ps1 Z002


Next run the following command from Powershell to create the XSD file diskusage.xsd:

./New-Xsd.ps1 -Object $du -ItemTag DiskUsage -Attribute UsageDT,SystemName,DeviceID -ChildItems VolumeName,SizeGB,FreeGB,PercentFree > ./diskusage.xsd



Examine the XSD file produced by New-Xsd. Some properties may not be mapped to the desired SQL data type. For example, UsageDT should be a datetime rather than a varchar(255). You'll need to manually change the data type in the XSD file to datetime as follows:

<xs:attribute name="UsageDT" sql:field="UsageDT" sql:datatype="datetime"/> 


Because the XSD file can be reused for importing data in the same format, it only needs to be generated and manually edited once. Finally, to import the data into your SQL Server table, run the following set of commands from Powershell:

$sqlxml = new-object -comobject "SQLXMLBulkLoad.SQLXMLBulkLoad"
$sqlxml.ConnectionString = "provider=sqloledb;server=Z002\SQLEXPRESS;database=dbautility;Integrated Security=SSPI"
$sqlxml.SchemaGen = $true
$sqlxml.Bulkload = $true
$sqlxml.Execute("C:\Users\u00\bin\diskusage.xsd","C:\Users\u00\bin\diskusage.xml")



One of the benefits unique to this method of importing data is the option to automatically create the table if the table does not already exist by setting the SchemaGen property to true.

Pros: The data is well-formed XML. No need to worry about editing data produced as long as the schema file is correct. Bulk loading of the XML file is very fast. The biggest benefit is the automatic creation of the table from the XSD schema definition.

Cons: Some manual editing of the XSD schema file may be required. Does not validate input for possible escaping of invalid XML. The slowest of the three methods in terms of generating the XML file. Like the CSV method, it requires storing the data in an intermediate XML file when the solution may not need to be staged to a file.

Recommendation: Use this method if the data is already in an XML format, when your process will benefit from XML file storage, or when you need to automatically create the tables.

Method 3:  DataTable/SQLBulkCopy

ADO.NET and the various wrappers around ADO.NET, including the SMO ExecuteWithResults method, SQLPSX Get-SqlData and the Microsoft SQL Server Powershell cmdlet Invoke-Sqlcmd, will return a DataTable. But what about the times when the output of a Powershell command isn't a DataTable, such as the Get-DiskUsage.ps1 script? Not surprisingly, it is fairly easy to take the output of any Powershell command and convert it to a DataTable. Marc van Orsouw (/\/\o\/\/) posted a function called Out-DataTable on his ThePowerShellGuy blog which will convert the output of any Powershell command to a DataTable. This method uses that function to convert the disk usage information into a DataTable. Copy the function to a text file called DataTable.ps1, save it and source the function from PowerShell as follows:

. ./DataTable.ps1
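
If the original Out-DataTable function is hard to track down, the same idea can be sketched in a few lines: build a DataColumn per property of the first pipeline object, then add a DataRow per object. This simplified stand-in (not /\/\o\/\/'s original) types every column as string and handles only simple scalar properties:

```powershell
# Minimal Out-DataTable-style sketch: builds a System.Data.DataTable
# whose columns come from the first object's properties and whose rows
# come from each object in the pipeline. All columns are typed string.
function Out-DataTableSketch {
    begin { $dt = new-object System.Data.DataTable }
    process {
        if ($dt.Columns.Count -eq 0) {
            foreach ($p in $_.PSObject.Properties) {
                [void]$dt.Columns.Add($p.Name)
            }
        }
        $row = $dt.NewRow()
        foreach ($p in $_.PSObject.Properties) {
            $row[$p.Name] = "$($p.Value)"
        }
        $dt.Rows.Add($row)
    }
    # The leading comma wraps the table in an array so the pipeline
    # emits the DataTable itself rather than enumerating its rows.
    end { ,$dt }
}
```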


Note: As stated in Method 2, when you source a function you are loading its definition into your current Powershell session, not executing it. Now you can use the Out-DataTable function with the Get-DiskUsage.ps1 script, saving the output to a new variable called $dataTable:

$dataTable = ./get-diskusage.ps1 Z002 | Out-DataTable


Since the output of the Powershell command is saved as a DataTable, you can use the .NET 2.0 SqlBulkCopy class with a few simple lines of code to load the data into a SQL Server table:

$connectionString = "Data Source=Z002\SqlExpress;Integrated Security=true;Initial Catalog=dbautility;"
$bulkCopy = new-object ("Data.SqlClient.SqlBulkCopy") $connectionString
$bulkCopy.DestinationTableName = "DiskUsage"
$bulkCopy.WriteToServer($dataTable)
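
One caution: SqlBulkCopy maps source columns to destination columns by ordinal position by default, so a DataTable whose column order differs from the table's can silently load data into the wrong columns. A sketch, reusing the $bulkCopy and $dataTable variables from the example above, that maps by name instead:

```powershell
# Map DataTable columns to destination columns by name rather than by
# ordinal position, guarding against a column-order mismatch.
foreach ($col in $dataTable.Columns) {
    [void]$bulkCopy.ColumnMappings.Add($col.ColumnName, $col.ColumnName)
}
$bulkCopy.WriteToServer($dataTable)
```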

Pros: No intermediate file storage is required since everything is done in memory. Fastest when the data is already in a DataTable format. Not as complex as the XML method. For this example of loading disk space information, this method is the best solution.

Cons: Some solutions may require intermediate file storage. Slower than the CSV method on very large data sets.

Recommendation: This method is ideal for dealing with smaller result sets or when the data is already in DataTable format.

Conclusion

This article demonstrated three methods you can use to import data from Powershell into SQL Server. These methods produce a CSV file, an XML file with an XSD schema, or a DataTable from Powershell, each of which can be imported into SQL Server. Using these techniques you can load any Powershell output into a SQL Server table.

Resources:

New-Xsd.ps1