
Streaming Data Into SQL Server 2008 From an Application
Posted Tuesday, April 28, 2009 2:43 AM


Old Hand


Group: General Forum Members
Last Login: Sunday, December 14, 2014 10:06 PM
Points: 379, Visits: 1,991
Comments posted to this topic are about the item Streaming Data Into SQL Server 2008 From an Application




SQL# - http://www.SQLsharp.com/
Post #705584
Posted Thursday, May 28, 2009 2:14 AM


Forum Newbie


Group: General Forum Members
Last Login: Thursday, July 3, 2014 6:17 AM
Points: 8, Visits: 285
Nice topic, Solomon.

I have a suggestion regarding the log growth:

In the stored procedure you could use SELECT INTO, or enable minimally logged inserts in SQL Server 2008 (see http://msdn.microsoft.com/en-us/library/dd425070.aspx) to reduce log growth. When you do this, comparing the TVP streaming to the bulk import methods becomes fairer, because those are always minimally logged.
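Roughly what I mean, as a sketch only (the procedure name, TVP type, and column list below are made up, and whether the INSERT actually gets minimally logged still depends on the recovery model and on the target being a heap, or trace flag 610 being enabled):

using System.Data.SqlClient;

class MinimalLoggingSketch
{
    static void CreateMinimallyLoggedImportProc()
    {
        // Illustrative only: dbo.ImportTest comes from the article's test code; the
        // procedure and TVP type names are placeholders. With WITH (TABLOCK) on a heap
        // target and SIMPLE or BULK_LOGGED recovery, SQL Server 2008 can minimally log
        // the INSERT (see the MSDN article linked above).
        const string createProc = @"
            CREATE PROCEDURE dbo.ImportDataMinimalLog (@ImportData dbo.ImportStructure READONLY)
            AS
            BEGIN
                INSERT INTO dbo.ImportTest WITH (TABLOCK)
                SELECT  *
                FROM    @ImportData;
            END";

        using (SqlConnection conn = new SqlConnection("Server=localhost;Database=Test;Trusted_Connection=True;"))
        using (SqlCommand cmd = new SqlCommand(createProc, conn))
        {
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}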


Kind Regards,

DM Unseen AKA M. Evers
Post #724856
Posted Thursday, May 28, 2009 3:25 AM


SSC-Addicted


Group: General Forum Members
Last Login: Monday, July 22, 2013 11:59 AM
Points: 477, Visits: 719
I don't really see a major benefit over using the SqlBulkCopy class here, which is also backwards compatible with SQL Server 2000 and up.

A minor benefit that I can see is that you don't need a 'loading table' into which data is placed before being transformed by a stored proc, if that transformation is necessary.

But that's a really minor benefit for breaking compatibility with 2000/2005.
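For comparison, the SqlBulkCopy pattern I'm referring to looks roughly like this (just a sketch; the source IDataReader is hypothetical, and the destination table name is taken from the article's test code):

using System.Data;
using System.Data.SqlClient;

class BulkCopySketch
{
    // Stream any IDataReader (e.g. a SqlDataReader or a file-backed reader)
    // straight into the destination table.
    static void BulkLoad(IDataReader sourceReader)
    {
        using (SqlConnection conn = new SqlConnection("Server=localhost;Database=Test;Trusted_Connection=True;"))
        {
            conn.Open();
            using (SqlBulkCopy bulkCopy = new SqlBulkCopy(conn))
            {
                bulkCopy.DestinationTableName = "dbo.ImportTest";
                bulkCopy.BatchSize = 10000;            // commit in batches
                bulkCopy.WriteToServer(sourceReader);  // streams the rows to the server
            }
        }
    }
}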


Atlantis Interactive - SQL Server Tools
My blog
Why I wrote a sql query analyzer clone
Post #724887
Posted Thursday, May 28, 2009 7:19 AM


Old Hand


Group: General Forum Members
Last Login: Sunday, December 14, 2014 10:06 PM
Points: 379, Visits: 1,991
Hello "dm_unseen" and thanks for that suggestion.


Matt Whitfield (5/28/2009)
I don't really see a major benefit over using the SqlBulkCopy class here, which is also backwards compatible with SQL Server 2000 and up.

A minor benefit that I can see is that you don't need a 'loading table' into which data is placed before being transformed by a stored proc, if that transformation is necessary.

But that's a really minor benefit for breaking compatibility with 2000/2005.


Hello Matt. Regarding the similarity to BULK INSERT, SqlBulkCopy, and other methods, I would say that:

1) Using the IEnumerable streaming method that I described here provides the greatest benefit when doing transformations in the application layer (a rough sketch of that pattern follows below). I realize that my example code was simplistic and did not show a transformation, but this technique is more focused on the "sending" of the data than the "receiving" of it. In these cases, bulk insert/copy operations do not address the issue of doing anything more than straight reads of the source data.

2) As far as using TVPs in general goes, regardless of the full-streaming feature when using IEnumerable, they do, in my opinion, provide a much cleaner interface than using a staging or loading table. True, using them is not compatible with prior versions of SQL Server, but many people do not need such compatibility. Unless there is a specific reason not to do so, I believe that you should use all of the features available in a product. I have mainly worked at Software as a Service companies where we host the application and do not need to worry about what version of the DB (or even what vendor) a customer might have.
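To make point 1 concrete, the pattern looks roughly like this (heavily simplified; the column names, TVP type name, and procedure name below are placeholders rather than the article's actual code):

using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using Microsoft.SqlServer.Server;

class TvpStreamingSketch
{
    // Yield one SqlDataRecord at a time; any transformation happens here, in the
    // application layer, while the rows are streamed to SQL Server.
    static IEnumerable<SqlDataRecord> GetRows()
    {
        SqlMetaData[] schema = new SqlMetaData[]
        {
            new SqlMetaData("SomeInt", SqlDbType.Int),
            new SqlMetaData("SomeChar", SqlDbType.Char, 50)
        };

        SqlDataRecord record = new SqlDataRecord(schema);
        for (int i = 0; i < 3000000; i++)
        {
            record.SetInt32(0, i);             // transform / compute the values here
            record.SetString(1, i.ToString());
            yield return record;               // the row is sent as soon as it is produced
        }
    }

    static void Import()
    {
        using (SqlConnection conn = new SqlConnection("Server=localhost;Database=Test;Trusted_Connection=True;"))
        using (SqlCommand cmd = new SqlCommand("dbo.ImportData", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            SqlParameter tvp = cmd.Parameters.AddWithValue("@ImportData", GetRows());
            tvp.SqlDbType = SqlDbType.Structured;
            tvp.TypeName = "dbo.ImportStructure";   // placeholder TVP type name

            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}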


Take care,
Solomon...





SQL# - http://www.SQLsharp.com/
Post #725024
Posted Thursday, May 28, 2009 7:23 AM
SSC Rookie


Group: General Forum Members
Last Login: Monday, January 30, 2012 5:12 AM
Points: 40, Visits: 143
From article: - "Prior to SQL Server 2008 the only way to send data to the database was to send it in one row at a time"

That is a really misleading statement. You can send XML with a bunch of rows and translate it into a table inside a stored procedure. I have been doing that for years.
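For example, the pattern I mean is roughly this (the element and column names are made up, and the same SELECT can of course live inside a stored procedure that takes an xml parameter):

using System.Data;
using System.Data.SqlClient;

class XmlParameterSketch
{
    // Pass many rows as a single XML parameter and shred it server-side with the
    // xml type's nodes()/value() methods.
    static void ImportViaXml(string xmlRows)   // e.g. "<rows><row id=\"1\" name=\"a\" /></rows>"
    {
        const string sql = @"
            INSERT INTO dbo.ImportTest (SomeInt, SomeChar)
            SELECT  r.value('@id',   'INT'),
                    r.value('@name', 'CHAR(50)')
            FROM    @rows.nodes('/rows/row') AS t(r);";

        using (SqlConnection conn = new SqlConnection("Server=localhost;Database=Test;Trusted_Connection=True;"))
        using (SqlCommand cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.Add("@rows", SqlDbType.Xml).Value = xmlRows;
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}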

But still, good article.

LP.
Post #725029
Posted Thursday, May 28, 2009 7:28 AM
SSC Rookie


Group: General Forum Members
Last Login: Yesterday @ 5:43 AM
Points: 37, Visits: 664
I also pass XML to the stored procedure, but I am curious to know whether there is any performance gain from using the table approach vs. the XML approach. Also, will the table approach work with languages other than .NET, like PHP, etc.?
Post #725033
Posted Thursday, May 28, 2009 8:05 AM
Forum Newbie


Group: General Forum Members
Last Login: Thursday, August 29, 2013 10:47 AM
Points: 3, Visits: 115
Also, will the table approach work with other languages than .net like php, etc...

This approach will most likely also work with PHP (have a look at: http://php.benscom.com/manual/en/class.dotnet.php). .NET is a runtime environment which supports various languages, not a language itself.


ERP Software & IT-Beratung München
Post #725073
Posted Thursday, May 28, 2009 8:11 AM
SSC Rookie


Group: General Forum Members
Last Login: Yesterday @ 5:43 AM
Points: 37, Visits: 664
Looking at the site you mentioned, that would make me think it would have to be PHP on a Windows IIS machine. What if it's Linux or a variation thereof, and hence there are no Windows DLLs? Will this table call method still be available to PHP?
Post #725084
Posted Thursday, May 28, 2009 8:12 AM


SSCommitted


Group: General Forum Members
Last Login: Sunday, November 4, 2012 12:23 PM
Points: 1,893, Visits: 3,932
Great job, Solomon!

That's definitely a feature that was missing, in my opinion.

I think BCP is a very powerful and nice tool, but it is much less flexible. XML is the approach I have used until now, but it produces a large overhead. Anyway, the performance of XML vs. TVPs would be really interesting.

Some little things (possible optimizations) about your C# code:

General)
Since only the "Normal" approach requires a DataTable, all of the other methods could be optimized by using a simple object[] and a List<>.

OldSchoolImport)
Creating just one SqlCommand object, preparing it, and binding the new values for each row, instead of creating 3,000,000 instances, would be a more realistic usage (see the sketch below).
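A rough sketch of what I mean for the row-by-row import (the table exists in the article's test code; the column names are placeholders):

using System.Data;
using System.Data.SqlClient;

class PreparedCommandSketch
{
    // Build and prepare the INSERT command once, then only rebind the parameter
    // values for each of the 3,000,000 rows.
    static void OldSchoolImport()
    {
        using (SqlConnection conn = new SqlConnection("Server=localhost;Database=Test;Trusted_Connection=True;"))
        using (SqlCommand cmd = new SqlCommand(
            "INSERT INTO dbo.ImportTest (SomeInt, SomeChar) VALUES (@SomeInt, @SomeChar);", conn))
        {
            cmd.Parameters.Add("@SomeInt", SqlDbType.Int);
            cmd.Parameters.Add("@SomeChar", SqlDbType.Char, 50);

            conn.Open();
            cmd.Prepare();   // explicit parameter types/sizes are required for Prepare()

            for (int i = 0; i < 3000000; i++)
            {
                cmd.Parameters["@SomeInt"].Value = i;
                cmd.Parameters["@SomeChar"].Value = i.ToString();
                cmd.ExecuteNonQuery();
            }
        }
    }
}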

Thanks for this work
Flo



The more I learn, the more I know what I do not know
Blog: Things about Software Architecture, .NET development and T-SQL

How to Post Data/Code to get the best Help How to Post Performance Problems
Post #725085
Posted Thursday, May 28, 2009 8:21 AM
Forum Newbie


Group: General Forum Members
Last Login: Thursday, August 29, 2013 10:47 AM
Points: 3, Visits: 115
What if it's LINUX or a variation thereof hence no windows dll's...

You could try it with Mono, but I'm not sure whether it will work because of the referenced Microsoft.SqlServer assembly, which might use P/Invoke to call native Windows-specific DLLs.

[EDIT]
@Solomon: I think there is a small error in the following part of the code which could cause an exception:
try
{
    using (dbConnection = new SqlConnection("Server=localhost;Database=Test;Trusted_Connection=True;"))
    {
        dbConnection.Open();

        SqlCommand importProc = new SqlCommand("CHECKPOINT ; TRUNCATE TABLE dbo.ImportTest ; DBCC FREEPROCCACHE ; DBCC FREESYSTEMCACHE('ALL') ; CHECKPOINT ;", dbConnection);
        importProc.CommandType = CommandType.Text;

        Console.WriteLine("{0} -- Truncating the import table, clearing the caches, etc...", DateTime.Now);
        importProc.ExecuteNonQuery();
        Console.WriteLine("{0} -- Done setting up for the test", DateTime.Now);
    }
}
catch (Exception exception)
{
    Console.WriteLine("Error: {0}", exception.Message);
}
finally
{
    dbConnection.Close();
}

If you wrap an object in a using scope, it will automatically be disposed (and therefore closed) if an exception is thrown. The call to the Close() method in your finally block is then made on an already disposed object, which can cause another exception.
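Reworking the snippet above, something like this should be enough, since the using block already guarantees the connection is closed on both the success and the exception path:

try
{
    // The connection is declared inside the using statement, so it is always
    // disposed (and therefore closed) here; no finally block is needed.
    using (SqlConnection dbConnection = new SqlConnection("Server=localhost;Database=Test;Trusted_Connection=True;"))
    using (SqlCommand importProc = new SqlCommand("CHECKPOINT ; TRUNCATE TABLE dbo.ImportTest ; DBCC FREEPROCCACHE ; DBCC FREESYSTEMCACHE('ALL') ; CHECKPOINT ;", dbConnection))
    {
        importProc.CommandType = CommandType.Text;
        dbConnection.Open();

        Console.WriteLine("{0} -- Truncating the import table, clearing the caches, etc...", DateTime.Now);
        importProc.ExecuteNonQuery();
        Console.WriteLine("{0} -- Done setting up for the test", DateTime.Now);
    }
}
catch (Exception exception)
{
    Console.WriteLine("Error: {0}", exception.Message);
}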


ERP Software & IT-Beratung München
Post #725100