
T-SQL insert using SSIS Data Pump
Posted Monday, February 4, 2013 11:34 PM
SSC Rookie
Group: General Forum Members
Last Login: Friday, August 8, 2014 7:35 AM
Points: 33, Visits: 291
Comments posted to this topic are about the item T-SQL insert using SSIS Data Pump
Post #1415582
Posted Tuesday, February 5, 2013 12:32 AM
SSC-Enthusiastic
Group: General Forum Members
Last Login: Tuesday, August 26, 2014 8:56 PM
Points: 178, Visits: 571
I'd be interested in a performance comparison with a simple INSERT INTO / SELECT solution which does not use SSIS.
Post #1415608
Posted Tuesday, February 5, 2013 1:15 AM
SSC Rookie
Group: General Forum Members
Last Login: Friday, August 8, 2014 7:35 AM
Points: 33, Visits: 291
You can try it yourself and compare. As I wrote, INSERT INTO is a fully logged operation, requires a more restrictive lock on the target table, and makes heavy use of the transaction log.
If you insert into an empty table you can use the WITH (TABLOCK) hint and the insert will, under certain conditions, be minimally logged, but it places an X lock on the target table, while the OLE DB insert requires only an IS lock and makes better use of the transaction log because it commits the inserted rows in batches.
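For reference, a minimal sketch of the plain T-SQL alternative being compared here; the table and column names are hypothetical:

-- Hypothetical illustration: plain INSERT ... SELECT into an empty heap.
-- The WITH (TABLOCK) hint allows minimal logging under the SIMPLE or
-- BULK_LOGGED recovery model, but it takes an exclusive (X) table lock
-- for the duration of the statement.
INSERT INTO dbo.TargetTable WITH (TABLOCK) (Col1, Col2)
SELECT Col1, Col2
FROM dbo.SourceTable;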
Post #1415630
Posted Tuesday, February 5, 2013 2:40 AM
SSC-Enthusiastic
Group: General Forum Members
Last Login: Tuesday, August 26, 2014 8:56 PM
Points: 178, Visits: 571
I probably won't have the opportunity to compare them myself - that's why I read sites like this. But if I do have an insert performance issue, I might consider SSIS as a solution.

Perhaps you could also consider using the SQL Server destination instead of the OLE DB destination in appropriate cases for even more performance.
Post #1415662
Posted Tuesday, February 5, 2013 2:43 AM
SSC Rookie
Group: General Forum Members
Last Login: Friday, August 8, 2014 7:35 AM
Points: 33, Visits: 291
It depends how many rows and how many bytes per row you are going to insert.
If you start doing DWH/ETL work with SQL Server, you will probably face this kind of issue.
Post #1415667
Posted Tuesday, February 5, 2013 4:46 AM
SSCrazy
Group: General Forum Members
Last Login: Tuesday, August 19, 2014 5:41 PM
Points: 2,693, Visits: 1,203
Sorry, you lost me on a few fronts.

1) you reference another article stating, "The method SSIS Data Pump represents the most effective way to load large amount of data into a SQL Server database because the SSIS Data Pump makes optimal usage of the transaction logs, parallelism, locking, and pipelining. A more detailed explanation is in the referenced article.". Yet I see no such detailed explanation in the referenced article.

2) you call xp_cmdshell to execute your package. This won't be usable on systems that disallow the use of xp_cmdshell.

3) your xp_cmdshell calls execute a bunch of commands you could do in SSIS with the File System task, or at worst the Script Task.




Hope this helps
Phill Carter
--------------------
Colt 45 - the original point and click interface

Australian SQL Server User Groups - My profile
Phills Philosophies
Murrumbeena Cricket Club
Post #1415735
Posted Tuesday, February 5, 2013 5:31 AM
SSC Rookie
Group: General Forum Members
Last Login: Thursday, January 2, 2014 7:30 AM
Points: 37, Visits: 220
Excellent article.
When somebody gets around to rewriting the VB code in C#, could you please post it?
Post #1415762
Posted Tuesday, February 5, 2013 6:13 AM
SSC Veteran
Group: General Forum Members
Last Login: Yesterday @ 6:45 AM
Points: 239, Visits: 1,020
Nice article!

Luke C
MCSE: Data Platform, MCP, MCTS, MCITP - Database Administrator & Database Developer
Post #1415784
Posted Tuesday, February 5, 2013 9:08 AM
SSC Rookie
Group: General Forum Members
Last Login: Friday, August 8, 2014 7:35 AM
Points: 33, Visits: 291
philcart (2/5/2013)
Sorry, you lost me on a few fronts.

1) you reference another article stating, "The method SSIS Data Pump represents the most effective way to load large amount of data into a SQL Server database because the SSIS Data Pump makes optimal usage of the transaction logs, parallelism, locking, and pipelining. A more detailed explanation is in the referenced article.". Yet I see no such detailed explanation in the referenced article.

You are right, there is no such detailed explanation.
The purpose of my article was not to explain why the SSIS Data Pump is the best method.
You can take a look at:
http://blog.staticvoid.co.nz/2012/8/17/mssql_and_large_insert_statements
and note that the SSIS data pump is nothing other than a pipelined bulk copy.


2) you call xp_cmdshell to execute your package. This won't be usable on systems that disallow the use of xp_cmdshell.

How else could I launch an SSIS package from T-SQL? The only way I know is xp_cmdshell. The purpose of the article was a T-SQL stored procedure able to launch an SSIS package, and that requires xp_cmdshell.
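For illustration, a minimal sketch of that pattern, assuming a file-deployed package (the folder path is hypothetical, and xp_cmdshell must be enabled on the instance):

-- Hypothetical sketch: launching an SSIS package from T-SQL via xp_cmdshell.
-- The package path below is an assumption; adjust it to your deployment.
DECLARE @cmd VARCHAR(4000);
SET @cmd = 'dtexec /FILE "C:\SSIS\BULK_TRANSFER_DATA.dtsx"';
EXEC master.dbo.xp_cmdshell @cmd;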


3) your xp_cmdshell calls execute a bunch of commands you could do in SSIS with the File System task, or at worst the Script Task.



Yes, some of the xp_cmdshell calls could be moved into BULK_TRANSFER_DATA.dtsx.

Post #1415929
Posted Tuesday, February 5, 2013 9:14 AM
Grasshopper
Group: General Forum Members
Last Login: Monday, August 25, 2014 10:09 AM
Points: 13, Visits: 168
Nice article, and I picked up some information I can use in our business. We have different applications that need client data dispersed throughout various locations, with table and column names that are not the same as in our source database. Our approach was to store the application name, source table name, destination table name, source connection string, destination connection string, source column name, and destination column name in a single SQL table. We then call the dynamic package, passing the application we want to load, and the package loops through this mapping table, dynamically creating the connections and the table and column mappings. This works fine for table loads; I was looking for information on how to use stored procedures in the data flow task to update data on the destination.
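For anyone sketching something similar, a rough example of what such a mapping table might look like (the table name, column names, and types here are assumptions, not the actual schema described above):

-- Hypothetical mapping table driving a dynamic SSIS load;
-- one row per column mapping for a given application.
CREATE TABLE dbo.TransferMapping
(
    ApplicationName        VARCHAR(128) NOT NULL,
    SourceConnectionString VARCHAR(512) NOT NULL,
    DestConnectionString   VARCHAR(512) NOT NULL,
    SourceTableName        VARCHAR(128) NOT NULL,
    DestTableName          VARCHAR(128) NOT NULL,
    SourceColumnName       VARCHAR(128) NOT NULL,
    DestColumnName         VARCHAR(128) NOT NULL
);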
Post #1415932