Dynamically changing Tables in Transform Data Task

  • I have started to build a DTS Package to import the contents of a number of tables from one SQL Server to another (SQL Server 6.5 to SQL Server 7).

    I have done this by working through a list of Tables, dynamically setting the DestinationObjectName, SourceObjectName and SourceSQLStatement each time. I then get a problem as the pre-defined Transformations are now invalid.

    When you set up the Source and Destination on the EM front end, it will create the default transformations automatically (I am transforming exactly as-is from one table to the other).

    Is there any way that I can do this in code?
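
    For reference, this is roughly what I am attempting from an ActiveX Script against the DTS object model. It is only an untested sketch: the ProgID for the stock copy transformation and the Remove call on the Transformations collection are assumptions on my part.

    Sub ReMapTask(oPkg, sTaskName, sSourceTable, sDestTable)
        Dim oPump, oTransform

        Set oPump = oPkg.Tasks(sTaskName).CustomTask   ' the DataPumpTask behind the step

        oPump.SourceObjectName      = sSourceTable
        oPump.SourceSQLStatement    = "SELECT * FROM " & sSourceTable
        oPump.DestinationObjectName = sDestTable

        ' Drop the transformations that were generated for the previous table...
        Do While oPump.Transformations.Count > 0
            oPump.Transformations.Remove 1
        Loop

        ' ...and add one straight copy transformation. With no columns added to
        ' SourceColumns / DestinationColumns the copy should map columns by ordinal.
        ' The ProgID below is an assumption - check what EM saves out to a VB file.
        Set oTransform = oPump.Transformations.New("DTSPump.DataPumpTransformCopy")
        oTransform.Name = "CopyAll"
        oPump.Transformations.Add oTransform
    End Sub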

    Also, is this the best method to use, or can anyone suggest another fast, robust way of transferring a list of 20 tables (approx 30 GB) from SQL 6.5 to SQL 7?

    Many thanks,

    Nigel.

  • Is this one time or periodic?

    If it's a one-time job, it's quicker to build each one by hand (also simpler - KISS).

    Steve Jones

    steve@dkranch.net

  • Thanks for your suggestion.

    The job would need to run each day.

    The idea was to use DTS for parallelism, setting it up with three 'streams' of tables to be copied, so that each stream can get on with copying its share of the tables but only three are being transferred at any one time.
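
    The throttling itself should be simple enough in code - I am assuming the package object's MaxConcurrentSteps property is the setting behind 'maximum number of tasks executed in parallel' in the package properties, e.g.:

    ' Sketch only - MaxConcurrentSteps is my assumption for the property behind
    ' the designer's "maximum number of tasks executed in parallel" setting.
    Dim oPkg
    Set oPkg = CreateObject("DTS.Package")
    oPkg.Name = "NightlyTableCopy"     ' made-up package name
    oPkg.MaxConcurrentSteps = 3        ' never more than three transfers at once
    ' ... one transfer task and step per table would be added here ...
    oPkg.Execute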

    I am aware of (and concerned about) the complexity of the job, but originally thought that it might be better than having a Task for each table. I have also been considering whether to use a bcp batch file for the export and BULK INSERT for the loading.

    Thanks again,

    Okay, I've had another go, setting up a series of Tasks, one for each table, and setting the maximum number of parallel tasks. My problem now is that I want the job to be re-runnable, i.e. I need to be able to set the job going again and have it transfer only the tables that didn't transfer the first time.

    I have tried setting the 'Enable Step' Workflow property to no, but then any Tasks that need to run after the main data copy will not run, because they are still waiting on the Tasks that never complete.

    I've tried to show this in the following diagram:

                          Clear Tables
                               |
    Transfer Table 1   Transfer Table 2   Transfer Table 3
       | (Transfer)       | (Disable)        | (Transfer)
                  Perform further Processing

    As the disabled job (Transfer Table 2) has an 'On Completion' Workflow arrow, it must complete before the next stage can take place - and it will never complete, as it is disabled!

    I think I'm starting to see things now, so I'm going home for a lie down.

    Any assistance would be very gratefully received.

    I am considering trying BULK INSERT in a T-SQL stored proc, although I would need to use bcp to get the data out of the 6.5 system and work hard to get a parallel import - but it may be worth it!
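
    The shape I have in mind per table is roughly the following (shown here as a script rather than the stored proc, and with made-up server, path and login names):

    ' Rough per-table sketch: bcp the data out of the 6.5 box in character mode,
    ' then BULK INSERT it into the 7.0 database. Server names, file path and the
    ' login are placeholders; the path must be visible to the 7.0 server.
    Dim oShell, oConn, sTable, sFile

    sTable = "Customers"
    sFile  = "D:\Transfer\" & sTable & ".bcp"

    Set oShell = CreateObject("WScript.Shell")
    ' -c = character mode, which also sidesteps the 6.5 native-format question
    oShell.Run "bcp SourceDB.dbo." & sTable & " out " & sFile & _
               " -c -t""|"" -SOLDSERVER65 -Usa -P", 1, True

    Set oConn = CreateObject("ADODB.Connection")
    oConn.Open "Provider=SQLOLEDB;Data Source=NEWSERVER7;" & _
               "Initial Catalog=TargetDB;Integrated Security=SSPI"
    oConn.Execute "BULK INSERT dbo." & sTable & " FROM '" & sFile & "'" & _
                  " WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n', TABLOCK)"
    oConn.Close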

    I'd use a flag to mark when a transfer is complete - either a global variable or a field in a table. Then, when a task succeeds, mark the flag complete and move on to the next task. If it fails, don't mark the flag and move on anyway.

    For one xfr task you would have:

    ActiveX or SQL task (check flag) - set the return value (success or failure) based on the result:

       --- If success, go to xfr task 1
       --- If failure, go to next xfr task (2 or 4)

    Data Transfer Task 1
       |
       |-- Success --> ActiveX or SQL task (mark complete) --> On complete, go to next xfr task
       |
       |-- Failure --> Move to next xfr task
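
    As a rough illustration only (the status table, server name and the hard-coded table name are all made up), the check-flag piece could be an ActiveX Script task along these lines, with the mark-complete piece a one-line SQL task:

    ' Assumes a status table along the lines of:
    '   CREATE TABLE TransferStatus (TableName sysname PRIMARY KEY, Completed bit NOT NULL DEFAULT 0)
    ' The check task fails its step when the table has already been copied, so the
    ' "On failure" arrow routes the workflow straight on to the next transfer.
    Function Main()
        Dim oConn, oRS, bDone

        Set oConn = CreateObject("ADODB.Connection")
        oConn.Open "Provider=SQLOLEDB;Data Source=NEWSERVER7;" & _
                   "Initial Catalog=TargetDB;Integrated Security=SSPI"

        Set oRS = oConn.Execute("SELECT Completed FROM TransferStatus " & _
                                "WHERE TableName = 'Customers'")
        bDone = False
        If Not oRS.EOF Then
            If CBool(oRS("Completed")) Then bDone = True
        End If
        oConn.Close

        If bDone Then
            Main = DTSTaskExecResult_Failure   ' already copied - skip this transfer
        Else
            Main = DTSTaskExecResult_Success   ' not copied yet - run the transfer
        End If
    End Function

    ' The mark-complete task after a successful transfer is then just:
    '   UPDATE TransferStatus SET Completed = 1 WHERE TableName = 'Customers'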

    Hope this helps.

    Steve Jones

    steve@dkranch.net

  • Steve,

    Many thanks once again.

    Unfortunately, this approach was going to get too complicated with about 20 tables to transfer.

    I have now opted for an alternative bcp / BULK INSERT solution.

    However, this has raised another discussion point (in the Admin section) re: bcp transfer between v6.5 and 2000 using native mode. Any suggestions welcome!

    Cheers,

    Nigel.
