Trading in xp_cmdshell for SQLCLR (Part 2) - Export Data to Flat Files

By Jonathan Kehayias, (first published: 2009/03/03)

This is the second article in a series (first article here) that shows how to use SQLCLR to create a safer, more secure environment than enabling xp_cmdshell in SQL Server 2005/2008. On the various forums that I answer on, including this one, I see at least one post a week asking how to export data from a table to a flat file using a stored procedure. Most people recommend doing this by enabling xp_cmdshell and using the bcp utility to perform the export. As we know from the first article, this can have unexpected side effects, since once xp_cmdshell is enabled there is no restriction on what someone with rights to call it can do.

This article gets a bit more technical with the .NET code being used, but worry not: the full source is available at the bottom, and thanks to some of the feedback/criticism I received from Phil Factor on the first article in this series, I have made sure to code proper exception handling. The included source code is in both C# and VB.NET so that it can easily be used by developers working in either language.

Rather than write what would amount to an entire book chapter explaining all the code, I am going to hit on some key portions. The first thing to note is that this process is based on a modified version of the CSVWriter class that can be found on Andreas Knab's blog. Since the implementation there works against a DataTable, which is an expensive consumer of memory, it wasn't ideal for use inside SQLCLR. It has been changed to accept a SqlDataReader and to allow variable delimiter values, so that tab-delimited and comma-separated files can be created from the same code block. The DelimitedFileWriter class is the primary workhorse for outputting the formatted data to the file.
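The field-quoting rules that a writer like this has to apply are worth seeing in isolation. The following is a minimal sketch of that logic only; EscapeField is a hypothetical name for illustration, not the actual DelimitedFileWriter method from the download.

```csharp
using System;

// Hypothetical sketch of the field-quoting logic a delimited-file writer
// needs. EscapeField is an assumed name, not the article's actual code.
public static class DelimitedFieldSketch
{
    public static string EscapeField(string value, string delimiter, bool quoteall)
    {
        // A field must be quoted when forced by quoteall, or when it
        // contains the delimiter, a quote character, or a line break.
        bool needsQuotes = quoteall
            || value.Contains(delimiter)
            || value.Contains("\"")
            || value.Contains("\n");

        if (!needsQuotes)
            return value;

        // Embedded quotes are escaped by doubling them, per the usual CSV convention.
        return "\"" + value.Replace("\"", "\"\"") + "\"";
    }
}
```

Joining the escaped fields with the delimiter then yields one output row, which is why the same code can produce both comma- and tab-delimited files.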

For flexibility, I chose to include two separate stored procedures, WriteResultsToCsvFile and WriteResultsToTabFile, which have identical code; the only difference is the delimiter passed to the DelimitedFileWriter calls, a comma for the CSV files and a "\t" escape character for the tab-delimited files. This could be encapsulated in a single stored procedure by using a parameter to drive the delimiter and prevent redundant code, and if I were to use this in production, that is how I would likely implement it.
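The consolidation described above can be sketched as two thin public entry points delegating to one shared helper that takes the delimiter as a parameter. This is an illustrative sketch of the pattern only, reduced to building a single row; the method names are hypothetical and not taken from the article's source.

```csharp
using System;

// Hypothetical sketch of consolidating the two procedures: both public
// entry points delegate to one shared helper that takes the delimiter.
public static class ConsolidationSketch
{
    public static string WriteCsvRow(string[] fields) => BuildRow(fields, ",");
    public static string WriteTabRow(string[] fields) => BuildRow(fields, "\t");

    // The shared helper is the only place the delimiter appears, so there
    // is no duplicated writing logic to keep in sync between formats.
    private static string BuildRow(string[] fields, string delimiter)
    {
        return string.Join(delimiter, fields);
    }
}
```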

One item that was brought up in the discussion of the first article in this series was concern over when the garbage collector in SQLCLR may trigger a collection to release memory allocations after code execution. One recommended practice in SQLCLR, where possible, is to wrap code and object creation in a "using" statement, which allows the developer to define the scope of resource usage for the objects being created. This construct can be used with any object that implements the .NET IDisposable interface.


// Create a SqlConnection object to the context connection
using (SqlConnection conn = new SqlConnection("context connection=true;"))
{
    // Create a SqlCommand object to pass the query into
    using (SqlCommand cmd = new SqlCommand(sqlcmd, conn))
    {
        // Open the connection to the database
        conn.Open();

        // Retrieve a SqlDataReader to process the results of the SqlCommand
        using (SqlDataReader dr = cmd.ExecuteReader())
        {
            // Create a TextWriter to stream the results to the file using Unicode Encoding
            using (TextWriter stream = new StreamWriter(filename, false, Encoding.Unicode))
            {
                // Write the Header Row to the file if required.
                if (includeheaders)
                {
                    DelimitedFileWriter.WriteHeadersToStream(stream, dr, ",", quoteall);
                }

                // Loop over each row returned and write it to the file
                while (dr.Read())
                {
                    rowcount++;
                    DelimitedFileWriter.WriteRowToStream(stream, dr, ",", quoteall);
                }

                // Clean up by closing the open SqlDataReader and SqlConnection.
                dr.Close();
                conn.Close();

                // Flush the stream buffer to the File and close the stream.
                stream.Flush();
                stream.Close();
            }
        }
    }
}


This allows the memory resources to be freed as soon as the execution scoped by the blocks ends. In the code snippet above, notice that the SqlDataReader is explicitly closed by the dr.Close() call, as is the SqlConnection. Cleaning up as you code in .NET is an important habit that ensures resources are not held longer than necessary. One caveat with the .NET streaming classes from the System.IO namespace is that they buffer information in memory that has to be flushed before closing the object. Failing to do so will leave your output file improperly terminated, with missing data and an invalid format.

The file-writing code includes a Catch block to handle any Exception that might occur while creating the file or executing the SqlCommand. To keep the usage of these procedures flexible, an Exception object is created at the start of the code with a default error message of "None". Should an actual error occur during execution, this default Exception is replaced by the caught Exception in the Catch handler.
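The default-Exception pattern just described can be reduced to a few lines. RunExport below is a hypothetical stand-in for the procedure body, used only to show the pattern; it is not the article's actual code.

```csharp
using System;

// Sketch of the default-Exception pattern: start with a placeholder whose
// message is "None", and replace it only if a real exception is caught.
public static class ErrorReportSketch
{
    public static string RunExport(Action work)
    {
        // Default Exception reported when nothing goes wrong.
        Exception error = new Exception("None");
        try
        {
            work();
        }
        catch (Exception ex)
        {
            // Replace the default with the real exception for the output row.
            error = ex;
        }
        // The message ends up in the "Errors" column of the output record.
        return error.Message;
    }
}
```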

A Finally block is then used to check whether the caller of the stored procedure set the parameter requesting output information to true. If so, a SqlDataRecord object is created for the return row that will contain the output information:


finally
{
    if (provideoutput)
    {
        SqlDataRecord sdr = new SqlDataRecord(
            new SqlMetaData("RowsWritten", SqlDbType.Int),
            new SqlMetaData("FileName", SqlDbType.NVarChar, 480),
            new SqlMetaData("Errors", SqlDbType.NVarChar, 4000)
        );

        sdr.SetValue(0, rowcount);
        sdr.SetValue(1, filename);
        sdr.SetValue(2, error.Message);
        SqlContext.Pipe.Send(sdr);
    }
}


First, note that the SqlDataRecord is not wrapped in a "using" statement. This is not a mistake; it is by design: the SqlDataRecord class does not implement the IDisposable interface. The SqlMetaData objects define each of the columns that will be output in the SqlDataRecord. The SqlDbType.NVarChar type requires that a length be specified when used, which is why the FileName and Errors definitions have size values set as part of their definitions. With the record created, the value for each column is set using the SetValue() method, passing the zero-based ordinal that matches the column position in the SqlDataRecord definition. The populated record is then returned to the caller using the SqlContext.Pipe.Send() method.

Since we already have an asymmetric key to sign the assembly for External_Access usage from the first part of this series, the same key has been used for this article as well. Ideally, in a production implementation, a single assembly would hold all of the SQLCLR objects, though they can exist in multiple assemblies as well.

One of the problems with SQLCLR objects is that default values cannot be natively bound to parameters when using Visual Studio to deploy the assembly. For this reason, I prefer to deploy the TSQL objects manually, especially since in most cases I have a preference for how the objects should function. In the case of these stored procedures, the @sqlcmd and @filename parameters shouldn't have a default value bound to them, but the @includeheaders, @quoteall, and @provideoutput parameters could have defaults set so that calling the procedure only requires @sqlcmd and @filename. Without defaults set, the following is an example of how to call the stored procedure.


DECLARE @sqlcmd nvarchar(4000)
DECLARE @filename nvarchar(4000)
DECLARE @includeheaders bit
DECLARE @quoteall bit
DECLARE @provideoutput bit

SET @sqlcmd = 'select * from sys.objects'
SET @filename = 'c:\object_export.csv'
SET @includeheaders = 1
SET @quoteall = 1
SET @provideoutput = 1

EXECUTE [dbo].[WriteResultsToCsvFile]
    @sqlcmd
    ,@filename
    ,@includeheaders
    ,@quoteall
    ,@provideoutput
GO


To set defaults on the three optional parameters, the TSQL stored procedure DDL can be scripted and manually altered as follows:


CREATE PROCEDURE [dbo].[WriteResultsToCsvFile]
 @sqlcmd [nvarchar](4000),
 @filename [nvarchar](4000),
 @includeheaders [bit] = 1,
 @quoteall [bit] = 1,
 @provideoutput [bit] = 0
WITH EXECUTE AS CALLER
AS
EXTERNAL NAME [SQLCLRBulkCopy].[StoredProcedures].[WriteResultsToCsvFile]
GO

This simplifies the call to the stored procedure to:


DECLARE @sqlcmd nvarchar(4000)
DECLARE @filename nvarchar(4000)

SET @sqlcmd = 'select * from sys.objects'
SET @filename = 'c:\object_export.txt'

EXECUTE [dbo].[WriteResultsToCsvFile]
    @sqlcmd
    ,@filename
GO


or even better:


EXECUTE [dbo].[WriteResultsToCsvFile] 
 @sqlcmd = 'select * from sys.objects'
 ,@filename = 'c:\object_export.txt'


I'm looking for other things that people use xp_cmdshell for, to continue this series. If you have something you'd like to see a SQLCLR solution to, please Join the Discussion and leave a comment. I'd be very happy to take a stab at writing an External_Access assembly to solve the problem.

Resources:

SQLCLRNet_BCPReplacement_CSharp.zip | SQLCLRNet_BCPReplacement_VB.zip | Install SQLCLRNet_BCPReplacement_CSharp.sql