How to load data to target tables only if no data flow task in the SSIS package fails
Posted Tuesday, January 8, 2013 10:57 AM
Hello All,

My SSIS package has 25 data flow tasks that load data from a third-party database into a staging database in SQL Server (both databases are on the same server).

The target tables have primary key constraints.

The job runs once daily. The first step clears all tables in the target database, and the second step loads the data. So whenever one data flow task fails (usually due to a primary key violation or a data type error), some tables end up with data and some end up empty.

My plan is to load into temp tables (regular tables) in the target database first, and then, if there were no errors, load into the actual tables.

Is this a good approach, or is there a smarter way to achieve this functionality (since I would have to create 25 temp tables)?

The end result should be that, if any step fails, the tables still contain yesterday's data.
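A minimal sketch of the final "promote" step described above, wrapping all 25 target loads in one transaction so the real tables are either fully refreshed or left untouched. Table and schema names here are illustrative, not from the original post:

```sql
BEGIN TRY
    BEGIN TRANSACTION;

    -- Repeat this pair for each of the 25 tables
    DELETE FROM dbo.Customer;  -- or TRUNCATE TABLE if no foreign keys reference it
    INSERT INTO dbo.Customer
    SELECT * FROM staging.Customer;

    -- ... remaining 24 tables ...

    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    -- Any failure rolls everything back, so the destination
    -- tables still hold yesterday's data
    IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;
    THROW;  -- re-raise so the job step reports failure (SQL Server 2012+)
END CATCH;
```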

Post #1404384
Posted Tuesday, January 8, 2013 4:38 PM
You certainly could use temp tables (though if you take that approach, I recommend using permanent staging tables to avoid potential issues with temp tables). However, if this all-or-nothing operation is critical, you must also account for the possibility that the operation loading from the temp (or staging) tables to the destination tables could also fail.

Reverting to the previous copy of the data in the event of any error is a little more difficult. Unless you use SSIS transactions (which I don't recommend, by the way), you'll need to build in a mechanism, for each table, to store the current contents of that table before each operation. This would probably look like another set of staging tables, this time used just to persist the last known good set of data. In the event of any error, you'd essentially reverse the ETL process, loading from that second set of staging tables back to the destination tables.
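The "last known good" pattern above might look like the following for one table; the `backup` schema and table names are illustrative assumptions, and in practice the restore would run from an SSIS OnError event handler or a failure-precedence Execute SQL Task:

```sql
-- Before the load: snapshot the current (known good) contents
TRUNCATE TABLE backup.Customer;
INSERT INTO backup.Customer
SELECT * FROM dbo.Customer;

-- ... the data flow then reloads dbo.Customer ...

-- On failure: reverse the ETL and restore yesterday's data
TRUNCATE TABLE dbo.Customer;
INSERT INTO dbo.Customer
SELECT * FROM backup.Customer;
```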


Tim Mitchell, Microsoft Data Platform MVP
Data Warehouse and ETL Consultant

Post #1404495
Posted Wednesday, January 9, 2013 10:27 AM
I agree that permanent staging tables would be a better option than true temp tables (# tables).

One option to consider:

1. Create 25 staging tables with exactly the same schema (including constraints, indexes, and triggers) as your 25 destination tables, choosing alternate names where necessary.
2. Point your SSIS package at the staging tables.
3. If the clearing and loading of all 25 staging tables succeeds (a Sequence Container could come in handy here), then:
4. Rename the destination tables to, in effect, move them out of the way.
5. Rename the staging tables into the place of the destination tables.
6. Rename the renamed destination tables to the original names of the staging tables, so you're ready for tomorrow's import.
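Steps 4–6 above could be sketched for one table as follows, run in a single Execute SQL Task once all 25 staging loads have succeeded. The table names are illustrative; note that `sp_rename` renames only the table itself, so constraint and index names will no longer match the table names after the swap (harmless, but worth knowing):

```sql
BEGIN TRANSACTION;

EXEC sp_rename 'dbo.Customer',          'Customer_old';      -- step 4
EXEC sp_rename 'dbo.Customer_staging',  'Customer';          -- step 5
EXEC sp_rename 'dbo.Customer_old',      'Customer_staging';  -- step 6

COMMIT TRANSACTION;
```

Because renames are pure metadata operations, the swap is nearly instantaneous regardless of table size, which is the main appeal of this approach over re-copying the data.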

There are no special teachers of virtue, because virtue is taught by the whole community. --Plato
Post #1404892