May 26, 2010 at 4:43 am
I've developed a C# CLR stored procedure that writes data out to a file. The reason for this is to avoid using BCP with xp_cmdshell enabled.
The only problem with the CLR is performance:
before I can execute the following command
EXEC writetofile @v_outbuffer,'c:\export_table_test.txt'
I need to loop through the target export table and build up the @v_outbuffer variable row by row.
I am new to CLR development and was wondering if anyone has any advice or suggestions for a faster way of implementing the above.
May 26, 2010 at 6:29 am
Thanks Stewart for the suggestion, but SSIS doesn't offer the flexibility I need...
e.g. the filename is dynamic, and the source could be dynamic. You could say I'm trying to replicate SQL Server's BCP functionality with a CLR. I'm a C# noob, so any help/tips would be appreciated.
cheers
shane
May 26, 2010 at 6:41 am
as long as != flexible
I would have to do something like run an SSIS package programmatically,
e.g.
create an export package
configure an agent job to run the package
call sp_start_job from the proc
All of the above would need to run from a procedure and, in my opinion, would be a complete mess, hence looking into a CLR to do this...
May 26, 2010 at 8:16 am
thanks stewart,
I've already written the C# class, so I'm not completely new to C#.
I'm looking for help with achieving the above with something like
http://dotnetguts.blogspot.com/2007/01/exporting-datatable-to-csv-file-format.html
(without the front-end button-click method, as this will be fired from the db)
I guess I need some help from someone who has a bit more C#/CLR experience. I've googled the hell out of it but to no avail 🙁
May 29, 2010 at 2:32 am
Hello, my English is not the best, so I have some problems understanding the exact problem...
You write that performance is the problem? Is that right?
Have you checked whether the problem is generating the output variable, or the export itself that uses the CLR?
Possible test: generate the output variable and write it to an NVARCHAR(MAX) field in a table, then check the execution time of the CLR against the table content.
Do you write the data with your own data and line separators, or do you build a CSV-like string in SQL, e.g. 'field1row1','field2row1';'field1row2','field2row2';?
Can you post your code of the C# CLR?
I have a little experience using CLR like this for http://www.mssqltips.com/tipprint.asp?tip=1662, but I don't use it for exporting complete table data.
May 29, 2010 at 11:35 am
I do have experience with CLR, but building up a huge variable first then trying to pass it to something to export isn't the right way to go. You want this streamed to the output file so you're not waiting for this huge thing to build.
Your CLR should do both the building of each row AND the streaming of the data out.
----------------------------------------------------------------------------------
Your lack of planning does not constitute an emergency on my part...unless you're my manager...or a director and above...or a really loud-spoken end-user..All right - what was my emergency again?
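[Editor's note] A minimal sketch of the streaming pattern Matt describes, assuming (as in the later posts in this thread) that each row comes back as a single pre-formatted column. The procedure name, the query parameter, and the use of the context connection are illustrative, not taken from the thread; file access also requires the assembly to be deployed with at least EXTERNAL_ACCESS permission:

```csharp
using System;
using System.Data.SqlClient;
using System.IO;
using Microsoft.SqlServer.Server;

public class SQLCLRIO
{
    [Microsoft.SqlServer.Server.SqlProcedure]
    public static void StreamToFile(string query, string filename)
    {
        // "context connection=true" reuses the caller's session,
        // so no separate credentials are needed inside the procedure.
        using (SqlConnection conn = new SqlConnection("context connection=true"))
        using (StreamWriter sw = new StreamWriter(filename)) // one open handle for the whole export
        {
            conn.Open();
            SqlCommand cmd = new SqlCommand(query, conn);
            using (SqlDataReader rdr = cmd.ExecuteReader())
            {
                while (rdr.Read())
                {
                    // Each row is written as soon as it is read;
                    // nothing accumulates in memory.
                    sw.WriteLine(rdr[0].ToString());
                }
            }
        }
    }
}
```

The point of the pattern is that building and writing overlap: no giant NVARCHAR(MAX) buffer is ever assembled on either the T-SQL or the C# side.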
June 3, 2010 at 4:38 am
Hi Matt,
Re: "Your CLR should do both the building of each row AND the streaming of the data out" - that's what I need, thanks; then I can avoid slow-performing cursors.
Would you be able to provide me with a very basic example that demonstrates the idea above?
Thanks
Shane
June 3, 2010 at 5:10 am
I haven't tried to compile the code below, but this would be my attempt without knowing C#
using System;
using System.Data;
using System.Data.SqlClient;
using System.Data.SqlTypes;
using System.IO;
using Microsoft.SqlServer.Server;

public class SQLCLRIO
{
    [Microsoft.SqlServer.Server.SqlProcedure]
    public static void WriteToFile(String content, String filename)
    {
        SqlConnection conn = new SqlConnection(
            "Data Source=(local);Initial Catalog=dbnamehere;Integrated Security=SSPI");
        SqlDataReader rdr = null;
        try
        {
            conn.Open();
            // Querying data
            SqlCommand cmd = new SqlCommand("select CategoryName from Categories", conn);
            rdr = cmd.ExecuteReader();
            while (rdr.Read())
            {
                // note: WriteAllText overwrites the file on every iteration
                File.WriteAllText(filename, rdr[0].ToString());
            }
        }
        catch (Exception ex)
        {
            SqlContext.Pipe.Send("Error writing to file " + ex.Message);
        }
        finally
        {
            // close the reader
            if (rdr != null)
            {
                rdr.Close();
            }
            // close the connection
            if (conn != null)
            {
                conn.Close();
            }
        }
    }
}
June 3, 2010 at 6:47 am
The slightly updated, quasi-working version below is only writing the last row of the table, so I guess I need to either
A. build up a buffer, or
B. write or append the file on each iteration of the loop.
As I'm after B, I guess I can solve the rest of this on my own.
thanks for the help 🙂
using System;
using System.Data;
using Microsoft.SqlServer.Server;
using System.Data.SqlTypes;
using System.Data.SqlClient;
using System.IO;
public class SQLCLRIO
{
[Microsoft.SqlServer.Server.SqlProcedure]
public static void WriteToFile(String content, String filename)
{
SqlConnection conn;
conn = new SqlConnection("Data Source=(local);Initial Catalog=namehere;Integrated Security=SSPI");
conn.Open();
//Querying Data
SqlCommand cmd = new SqlCommand("select export_record from dotnet_test", conn);
SqlDataReader rdr = cmd.ExecuteReader();
while (rdr.Read())
{
//do not reference UI library i.e Console.WriteLine(rdr[0]); use Pipe.Send instead
SqlContext.Pipe.Send("writing to file " + rdr[0].ToString());
try
{
// String test = (rdr[0].ToString());
File.WriteAllText(filename, (rdr[0].ToString()));
}
catch (Exception ex)
{
SqlContext.Pipe.Send("Error writing to file " + ex.Message);
}
}
}
}
June 3, 2010 at 8:14 am
Here are some class handouts going over some examples of this exact scenario.
It's a big document, so scan for StreamWriter:
June 3, 2010 at 8:45 am
Cheers Matt,
That document looks pretty useful; it's pretty hard trying to find a decent website on this topic.
My CLR is currently exporting a table with 98,308 rows in 3 minutes 38 seconds.
I will try another version with the StringBuilder object.
What time would you think a well-written CLR would be capable of to export the same table?
June 3, 2010 at 8:53 am
Depends on how big things are, etc... That said - the time you end up saving is how long it takes to build BEFORE you start writing. Since this writes as you build, you don't have to wait.
I'd take a wild stab and say it's probably taking you about as long to build as it does to write (in simple tests I've done that seems to be the case), so look to cut that number in half.
June 9, 2010 at 2:26 am
Matt, can you see anything I could do to speed up the code below?
using System;
using System.Data;
using Microsoft.SqlServer.Server;
using System.Data.SqlTypes;
using System.Data.SqlClient;
using System.IO;

public class SQLCLRIO
{
    [Microsoft.SqlServer.Server.SqlProcedure]
    public static void WriteToFile(String content, String filename)
    {
        SqlConnection conn = new SqlConnection(
            "Data Source=(local);Initial Catalog=capitec;Integrated Security=SSPI");
        conn.Open();
        // Querying data
        SqlCommand cmd = new SqlCommand("select export_record from dotnet_test", conn);
        SqlDataReader rdr = cmd.ExecuteReader();
        int i = 1;
        while (rdr.Read())
        {
            // do not reference a UI library, i.e. Console.WriteLine(rdr[0]); use Pipe.Send instead
            //SqlContext.Pipe.Send("writing to file " + rdr[0].ToString());
            try
            {
                if (i == 1)
                {
                    //SqlContext.Pipe.Send("header");
                    File.WriteAllText(filename, rdr[0].ToString() + "\r");
                }
                else
                {
                    //SqlContext.Pipe.Send("body");
                    File.AppendAllText(filename, rdr[0].ToString() + "\t");
                }
                i = i + 1;
            }
            catch (Exception ex)
            {
                SqlContext.Pipe.Send("Error writing to file " + ex.Message);
            }
        }
    }
}
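[Editor's note] One likely cost in the code above: File.AppendAllText opens and closes the file on every call, so a 98,308-row export pays that open/close cost roughly 98,000 times. A hedged rework (query, connection string, and separators taken from the post; the single open StreamWriter is the only change in approach) would be:

```csharp
using System;
using System.Data.SqlClient;
using System.IO;
using Microsoft.SqlServer.Server;

public class SQLCLRIO
{
    [Microsoft.SqlServer.Server.SqlProcedure]
    public static void WriteToFileStreamed(String content, String filename)
    {
        using (SqlConnection conn = new SqlConnection(
            "Data Source=(local);Initial Catalog=capitec;Integrated Security=SSPI"))
        using (StreamWriter sw = new StreamWriter(filename)) // opened once, not once per row
        {
            conn.Open();
            SqlCommand cmd = new SqlCommand("select export_record from dotnet_test", conn);
            using (SqlDataReader rdr = cmd.ExecuteReader())
            {
                bool first = true;
                while (rdr.Read())
                {
                    // same separators as the original: "\r" after the
                    // header row, "\t" after every body row
                    sw.Write(rdr[0].ToString() + (first ? "\r" : "\t"));
                    first = false;
                }
            }
        }
    }
}
```

The using blocks also guarantee the reader, connection, and file handle are closed even when an exception is thrown, which the original loop did not.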