• Koen Verbeeck (10/30/2013)


    Matt Miller (#4) (10/29/2013)


    If the input file simply needs to be inserted into a table, why not use a flat file data source and bulk load it directly? Going through and manually building out a huge series of single-row inserts is not going to scale, especially when you have facilities to dump the data in directly.
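    For the simple fixed-layout case Matt describes, the bulk load is a single statement. A minimal sketch, where the staging table and file path are invented for illustration:

        -- All names here are made up for illustration.
        BULK INSERT dbo.StagingTable
        FROM 'C:\feeds\input.csv'
        WITH (
            FIELDTERMINATOR = ',',   -- column delimiter in the flat file
            ROWTERMINATOR   = '\n',  -- row delimiter
            FIRSTROW        = 2,     -- skip the header row
            TABLOCK                  -- table lock for a minimally logged load
        );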

    It seems the problem the OP is having is that the sources are dynamic, which won't work in a regular data flow.

    Matt Miller (#4) (10/29/2013)


    Trying to clean it up row by row in transit is going to lead to pain.

    Synchronous components in SSIS have almost no performance impact.

    @blasto_max: can you describe your sources a bit more? How come they are "dynamic"? Are they predictable?

    Thanks for those questions. Yes, I have dynamic sources. In brief: the source looks like (SomeID, random column names...), while the destination has sane, fixed column names.

    In the source, a column like Column1 is just a value holder. For SomeID = 1, Column1 could go to Dest1, while for SomeID = 7, Column1 could go to Dest19. That's just how it is. So I create mappings of what goes where, then generate SQL code to fetch the data and load it. For a given ID the mapping is fixed: for ID = 1, Column1 always maps to the same destination column.
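    To make that concrete, here is a rough sketch of the mapping-driven approach; every table and column name below is invented, and STRING_AGG needs SQL Server 2017+ (on older versions the FOR XML PATH trick does the same job):

        -- Invented metadata table: records which source column feeds
        -- which destination column for each SomeID.
        CREATE TABLE dbo.ColumnMapping (
            SomeID    int     NOT NULL,
            SourceCol sysname NOT NULL,   -- e.g. 'Column1'
            DestCol   sysname NOT NULL    -- e.g. 'Dest1' or 'Dest19'
        );

        INSERT INTO dbo.ColumnMapping VALUES
            (1, N'Column1', N'Dest1'),
            (7, N'Column1', N'Dest19');

        -- Generate one INSERT ... SELECT for a given SomeID from the mapping.
        DECLARE @id int = 1, @sql nvarchar(max);

        SELECT @sql =
              N'INSERT INTO dbo.Destination ('
            + STRING_AGG(CONVERT(nvarchar(max), QUOTENAME(DestCol)), N', ')
                  WITHIN GROUP (ORDER BY DestCol)
            + N') SELECT '
            + STRING_AGG(CONVERT(nvarchar(max), QUOTENAME(SourceCol)), N', ')
                  WITHIN GROUP (ORDER BY DestCol)  -- same order keeps the column lists aligned
            + N' FROM dbo.SourceTable WHERE SomeID = @id;'
        FROM dbo.ColumnMapping
        WHERE SomeID = @id;

        EXEC sp_executesql @sql, N'@id int', @id = @id;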

    After getting the mappings for ID = 1, I fetch a huge amount of data from the source tables for each SubID associated with that ID. It's a mess that I did not create.
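    If it helps to picture it, the per-SubID fetch could be driven by a simple loop; dbo.SubIDs and dbo.LoadForSubID are hypothetical stand-ins for whatever holds the SubID list and wraps the generated SQL:

        DECLARE @subId int;

        DECLARE sub_cur CURSOR LOCAL FAST_FORWARD FOR
            SELECT SubID FROM dbo.SubIDs WHERE SomeID = 1;  -- invented table

        OPEN sub_cur;
        FETCH NEXT FROM sub_cur INTO @subId;

        WHILE @@FETCH_STATUS = 0
        BEGIN
            -- hypothetical proc wrapping the generated fetch/load SQL
            EXEC dbo.LoadForSubID @SomeID = 1, @SubID = @subId;
            FETCH NEXT FROM sub_cur INTO @subId;
        END

        CLOSE sub_cur;
        DEALLOCATE sub_cur;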

    Is the situation clearer now?