Multiple Destinations From Single Source

  • I am trying to transform a flat file with multiple groupings into a normalized data structure. How do I perform multiple inserts into the same destination table for each row of the source?

You can have multiple Data Transform Tasks between the same two connections. Each one needs its own "query" to pull different data from the source.

    Steve Jones

    steve@dkranch.net

Steve: The original table's field count is huge. I was hoping there was a scripting solution where I would access the source only once per row, something like returning DTSTransformStat_InsertQuery multiple times in a single pass. Thanks.

    Gary

Any solution is going to put a load somewhere, and I assume this is a one-time thing, so I wouldn't worry about the source load. Personally, I'd rather move the entire thing into one table and then normalize using SQL rather than DTS.
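
    Roughly what I mean, with made-up table and column names (the staging table is just whatever your flat file bulk-loads into):

        -- Bulk copy the flat file into one wide staging table, then split
        -- each staging row out with plain SQL. All names here are placeholders.
        INSERT INTO Customer (CustCode, CustName)
        SELECT DISTINCT CustCode, CustName
        FROM FlatStaging

        -- Multiple rows into the same destination table per source row:
        INSERT INTO CustomerPhone (CustCode, PhoneType, PhoneNumber)
        SELECT CustCode, 'Home', HomePhone FROM FlatStaging WHERE HomePhone IS NOT NULL
        UNION ALL
        SELECT CustCode, 'Work', WorkPhone FROM FlatStaging WHERE WorkPhone IS NOT NULL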

    Steve Jones

    steve@dkranch.net

I have a similar load issue. I want to take one row of monthly budget data from a flat file and use a Data Driven Query to break that row into daily pieces (31 for July, 30 for September, 28 for February, you get the idea). I would like to be able to program a loop that inserts the appropriate portion for each of the 28/30/31 days from one row in my flat file. I could use T-SQL to do this (I've sketched that below), but I want to know the same thing Gary asked about: is there a way to 'call' DTSTransformStat_InsertQuery multiple times, resulting in multiple inserts for one row of input data, in a Data Driven Query? Steve's first answer seems to imply 'maybe', with multiple transform tasks, but his second answer implies 'no'.

If there is no 'multiple inserts per input row' functionality, would I have to create an intermediate file/table holding my 28/30/31 'output rows' per input row, and then use those 'output rows' as input to a second DDQ?
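
    For what it's worth, the straight T-SQL fallback I have in mind looks roughly like this; the staging table, the Numbers helper table, and all column names are assumptions on my part:

        -- MonthlyBudget is the staging table loaded from the flat file.
        -- Numbers is a helper table of integers 1..31.
        -- Each monthly row fans out into one row per day of that month,
        -- splitting the amount evenly (rounding/remainders ignored here).
        INSERT INTO DailyBudget (AcctCode, BudgetDate, Amount)
        SELECT mb.AcctCode,
               DATEADD(day, n.Num - 1, mb.MonthStart),
               mb.MonthAmount
                   / DATEDIFF(day, mb.MonthStart, DATEADD(month, 1, mb.MonthStart))
        FROM MonthlyBudget mb
        JOIN Numbers n
          ON n.Num <= DATEDIFF(day, mb.MonthStart, DATEADD(month, 1, mb.MonthStart))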

My example is mostly a one-time thing, but budgets can change, so I'll also have to factor in the potential for updates. I'll climb that mountain once I reach the top of this one.

    Thanks in advance for any thoughts/ideas.

    Dan

I do not think so, but I could be wrong. I haven't used the Data Driven Query task very much, but I have seen that transforms in DTS are slow. For small data sets (< 1 M rows), I have found that dumping the data into a temp table and moving it with SQL is much faster, and more recoverable if there is an error during the transfer.

You can set up one source with multiple "Transform Data Tasks" that operate differently and send data to separate places. These can run in parallel in a DTS package.
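
    The temp table route also lets you wrap the move in a transaction, so an error rolls the whole load back instead of leaving a partial transfer. A bare-bones sketch, with placeholder table names:

        -- Stage first, then move inside a transaction for recoverability.
        BEGIN TRAN

        INSERT INTO TargetTable (Col1, Col2)
        SELECT Col1, Col2 FROM StagingTable

        IF @@ERROR <> 0
            ROLLBACK TRAN
        ELSE
            COMMIT TRAN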

    Steve Jones

    steve@dkranch.net
