• Matt Miller (#4) (10/29/2013)


    300 MB of data should load in no time flat, but you seem to be going at it the hard way.

    If the input file simply needs to be inserted into a table, why not use a flat file data source and bulk load it directly into the table? Going through and manually building out a huge series of single-row inserts is not going to scale, especially when you have facilities to dump the data in directly.
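
    For instance, a minimal T-SQL sketch of that kind of direct bulk load. The file path, delimiters, and table name here are illustrative assumptions, not details from the original post:

        BULK INSERT dbo.StagingData               -- hypothetical staging table
        FROM 'C:\data\input.csv'                  -- hypothetical file path
        WITH (
            FIELDTERMINATOR = ',',                -- column delimiter in the source file
            ROWTERMINATOR   = '\n',               -- row delimiter
            FIRSTROW        = 2,                  -- skip the header row, if there is one
            TABLOCK                               -- table lock enables minimally logged loads
        );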

    Even if you end up having to clean up the data or manipulate other things, your best bet is to get the data into the DB itself as quickly as you can, clean it up, THEN move it around. Trying to clean it up row by row in transit is going to lead to pain.
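
    A rough sketch of that load-then-clean pattern in T-SQL. The table names, column names, and the specific clean-up steps are assumptions for illustration only:

        -- One set-based pass over the staged data, not row-by-row fixes in transit.
        UPDATE dbo.StagingData
        SET CustomerName = LTRIM(RTRIM(CustomerName));

        -- Then move the cleaned rows to the destination in a single statement.
        INSERT INTO dbo.Customers (CustomerName, SignupDate)
        SELECT CustomerName,
               TRY_CONVERT(date, SignupDate)      -- NULLs out unparseable dates (SQL Server 2012+)
        FROM dbo.StagingData
        WHERE CustomerName <> '';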

    +1, though doing some clean-up in non-blocking components such as Derived Column transformations should not slow things down too much. Just keep the data pipeline fairly simple and avoid looping and blocking components. By using .NET DataTables, you are throwing away a lot of the power of SSIS and may as well write the entire process as a C# app (and that's not a recommendation :-))
