• (1) - Add code to save the current data to a set of archive tables before importing; if the user decides to roll back, that set of archive tables can be used for the rollback.

    (2) - Add a new field called "load date" to the affected tables and stop deleting historical data every time new data is imported; that way, if the user needs to roll back, all I need to do is remove the new data.
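    Option 1 can be sketched roughly as follows. This is a minimal illustration using Python's sqlite3 rather than SQL Server, and the table and column names (orders, orders_archive, amount) are hypothetical, not from the original post:

    ```python
    import sqlite3

    # Illustrative only: snapshot the live table into a bare archive
    # table before an import, and restore from it on rollback.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 20.0)])

    def archive_then_import(conn, new_rows):
        # Snapshot current data into an archive table (no indexes),
        # then load the new data into the live table.
        conn.execute("DROP TABLE IF EXISTS orders_archive")
        conn.execute("CREATE TABLE orders_archive AS SELECT * FROM orders")
        conn.execute("DELETE FROM orders")
        conn.executemany("INSERT INTO orders VALUES (?, ?)", new_rows)
        conn.commit()

    def rollback_from_archive(conn):
        # Restore the live table from the archive snapshot.
        conn.execute("DELETE FROM orders")
        conn.execute("INSERT INTO orders SELECT * FROM orders_archive")
        conn.commit()

    archive_then_import(conn, [(1, 99.0), (3, 30.0)])
    rollback_from_archive(conn)
    print(sorted(conn.execute("SELECT * FROM orders").fetchall()))
    # -> [(1, 10.0), (2, 20.0)]
    ```

    In SQL Server the snapshot step would typically be a SELECT ... INTO (or a pre-created heap plus INSERT ... SELECT), but the shape of the pattern is the same.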

    I think option 1 is your best approach. I'd personally use SSIS within a SQL Agent job to archive the data, and then BCP in the data to the LIVE tables...

    These archive tables will be identical to the live tables minus all the indexes and such.

    If you go with option two, any non-clustered indexes you have on those tables will grow significantly, as they will need to keep up with both the "new" and "old" data.
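    For comparison, option 2 looks something like this. Again a minimal sqlite3 sketch with hypothetical names; every batch carries a load_date, old data is kept alongside new data (which is exactly why those indexes grow), and rollback is just a filtered DELETE:

    ```python
    import sqlite3

    # Illustrative only: stamp each import batch with a load_date and
    # roll back by deleting the newest batch.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER, amount REAL, load_date TEXT)")

    def import_batch(conn, rows, load_date):
        # Append the new rows; historical rows are never deleted.
        conn.executemany(
            "INSERT INTO orders (id, amount, load_date) VALUES (?, ?, ?)",
            [(i, a, load_date) for i, a in rows])
        conn.commit()

    def rollback_batch(conn, load_date):
        # Removing the new data is a single filtered DELETE.
        conn.execute("DELETE FROM orders WHERE load_date = ?", (load_date,))
        conn.commit()

    import_batch(conn, [(1, 10.0), (2, 20.0)], "2024-01-01")
    import_batch(conn, [(1, 99.0), (3, 30.0)], "2024-02-01")
    rollback_batch(conn, "2024-02-01")
    print(conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0])
    # -> 2
    ```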

    You haven't said how large all this data is...only that it was "expensive"...how much data are we talking about? Thousands of rows? Millions?

    ______________________________________________________________________________
    Never argue with an idiot; they'll drag you down to their level and beat you with experience.