• prakashr.r7 (7/30/2013)


    Eric M Russell (7/26/2013)


    I wouldn't even attempt to use SSIS to make daily copies of tables with schemas that can change daily.

    First, you can query INFORMATION_SCHEMA.COLUMNS to determine whether any changes have been made to the column definitions of the source table. If there are no changes, then just INSERT INTO the target table from the source table. If the source table's schema has changed, then drop the target table and SELECT INTO the target table from the source table.

    For example:

    drop table [TargetTable];

    select * into [TargetTable] from [SourceTable];

    It's as simple as that: you get a new target table with both the schema and the data.

    Okay Eric... how do we update the SSIS packages? Once I'm done pulling the data into the local database, I need to do some conversion, and when I run my SSIS package it throws an error. So every time I have to open the package in BIDS and update it with the latest schemas. Is there any way to avoid opening it every time?

    My opinion is that this table copy process should not be performed in SSIS, because the schemas often change. It should be done in pure T-SQL and perhaps scheduled as a job using SQL Agent, if it needs to be automated.
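
    For example, the whole daily refresh could be a single T-SQL batch along the lines of the sketch below. It reuses the [SourceTable]/[TargetTable] placeholder names from above and assumes both tables live in the dbo schema of the same database:

    -- Rebuild the target if its column definitions no longer match the source
    -- (or if it does not exist yet); otherwise just reload the data.
    IF EXISTS
    (
        SELECT COLUMN_NAME, ORDINAL_POSITION, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH,
               NUMERIC_PRECISION, NUMERIC_SCALE, IS_NULLABLE
        FROM INFORMATION_SCHEMA.COLUMNS
        WHERE TABLE_SCHEMA = 'dbo' AND TABLE_NAME = 'SourceTable'
        EXCEPT
        SELECT COLUMN_NAME, ORDINAL_POSITION, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH,
               NUMERIC_PRECISION, NUMERIC_SCALE, IS_NULLABLE
        FROM INFORMATION_SCHEMA.COLUMNS
        WHERE TABLE_SCHEMA = 'dbo' AND TABLE_NAME = 'TargetTable'
    )
    OR (SELECT COUNT(*) FROM INFORMATION_SCHEMA.COLUMNS
        WHERE TABLE_SCHEMA = 'dbo' AND TABLE_NAME = 'SourceTable')
       <> (SELECT COUNT(*) FROM INFORMATION_SCHEMA.COLUMNS
           WHERE TABLE_SCHEMA = 'dbo' AND TABLE_NAME = 'TargetTable')
    BEGIN
        -- Schema changed (or first run): recreate the target with the new schema and data.
        IF OBJECT_ID('dbo.TargetTable') IS NOT NULL
            DROP TABLE dbo.[TargetTable];
        SELECT * INTO dbo.[TargetTable] FROM dbo.[SourceTable];
    END
    ELSE
    BEGIN
        -- Schema unchanged: just refresh the rows.
        TRUNCATE TABLE dbo.[TargetTable];
        INSERT INTO dbo.[TargetTable] SELECT * FROM dbo.[SourceTable];
    END;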

    If this table copy requirement is part of a larger process workflow that must run in SSIS, then you could also put the INSERT INTO or SELECT INTO statements inside an Execute SQL Task. If you need to do something else with these tables in SSIS and keep getting errors related to changes in the table schema, there is the DelayValidation property, which can be set to True, or you can temporarily put your package into Work Offline mode.

    http://msdn.microsoft.com/en-us/library/ms137625.aspx
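
    If the pure T-SQL route does need to run unattended, the batch above could be scheduled with SQL Agent. This is only a rough sketch; the job, step, schedule, and database names are made up, and @command would hold the full conditional copy batch shown earlier:

    -- Create a job with one T-SQL step and a daily 2:00 AM schedule.
    EXEC msdb.dbo.sp_add_job
        @job_name = N'Daily table copy';

    EXEC msdb.dbo.sp_add_jobstep
        @job_name = N'Daily table copy',
        @step_name = N'Rebuild or reload TargetTable',
        @subsystem = N'TSQL',
        @database_name = N'MyDatabase',  -- placeholder database name
        @command = N'/* paste the conditional copy batch here */';

    EXEC msdb.dbo.sp_add_jobschedule
        @job_name = N'Daily table copy',
        @name = N'Daily at 2 AM',
        @freq_type = 4,               -- daily
        @freq_interval = 1,           -- every 1 day
        @active_start_time = 020000;  -- 02:00:00

    -- Register the job against the local server so it actually runs.
    EXEC msdb.dbo.sp_add_jobserver
        @job_name = N'Daily table copy',
        @server_name = N'(local)';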

    "Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho