Do not fail package if the Data Flow Task fails

  • Hi there,

    I have a specific requirement, but I do not know how to implement it.

    "Extract Data from DB server A into DB Server B, however if DB Server A is not availale or metadata has changed due to design change on DB A, then do not fail the package"

    I have set the DelayValidation property of the DB Server A connection to True so that the package doesn't fail during validation. However, when the Data Flow Task executes, the Sequence Container, and the package as a whole, will show failure.

    I hope I've been able to explain the problem.

  • You can set the ForceExecutionResult property of the data flow to Success, so the data flow never fails. However, if another error occurs in the data flow that is not related to the ones you described, you won't notice it (unless you check the logging).
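
    For illustration only, here is a minimal sketch of setting that property through the SSIS runtime object model; the package path and task name below are assumptions, and in practice you would normally just set it in the task's Properties window.

        // Sketch: force a Data Flow Task's execution result to Success
        // via Microsoft.SqlServer.Dts.Runtime (path and task name are hypothetical).
        using Microsoft.SqlServer.Dts.Runtime;

        class ForceResultExample
        {
            static void Main()
            {
                var app = new Application();
                Package pkg = app.LoadPackage(@"C:\SSIS\ExtractAtoB.dtsx", null);

                // Grab the Data Flow Task by its (assumed) name.
                var dft = (TaskHost)pkg.Executables["Data Flow Task"];

                // Same effect as setting ForceExecutionResult = Success in the designer.
                dft.ForceExecutionResult = DTSForcedExecResult.Success;

                app.SaveToXml(@"C:\SSIS\ExtractAtoB.dtsx", pkg, null);
            }
        }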

    Need an answer? No, you need a question
    My blog at https://sqlkover.com.
    MCSE Business Intelligence - Microsoft Data Platform MVP

  • You could use a Script Task to check for a valid connection before your Data Flow and then take different actions depending on the result: for example, a Success branch executing your data flow and a Failure branch redirecting to some kind of logging.

    Here's an example:

    SSIS Nugget: Verify a data source before using it

    It doesn't address the problem of altered metadata, though. I'd be thinking of some kind of error handler attached to the DFT object.
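
    Here's a minimal Script Task (C#) sketch along those lines, assuming an ADO.NET connection manager named DB_A and a Boolean package variable User::IsSourceAvailable added to the task's ReadWriteVariables; both names are illustrative. Instead of letting the Script Task fail, it records whether the source is reachable so you can branch on a variable.

        // Goes inside the Script Task's ScriptMain class (the template already
        // includes using System; and using Microsoft.SqlServer.Dts.Runtime;).
        public void Main()
        {
            bool canConnect = false;
            try
            {
                // "DB_A" is an assumed ADO.NET connection manager.
                ConnectionManager cm = Dts.Connections["DB_A"];
                object raw = cm.AcquireConnection(Dts.Transaction);
                var conn = raw as System.Data.Common.DbConnection;
                canConnect = (conn != null && conn.State == System.Data.ConnectionState.Open);
                cm.ReleaseConnection(raw);
            }
            catch (Exception ex)
            {
                // Surface the reason as a warning rather than an error.
                Dts.Events.FireWarning(0, "Source check", ex.Message, string.Empty, 0);
            }

            // Record the outcome and finish with Success either way.
            Dts.Variables["User::IsSourceAvailable"].Value = canConnect;
            Dts.TaskResult = (int)ScriptResults.Success;
        }

    A precedence constraint with the expression @[User::IsSourceAvailable] on the path to the Data Flow (and its negation on the logging path) then gives you the two branches without the task, the Sequence Container, or the package ever failing.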

    ____________
    Just my $0.02 from over here in the cheap seats of the peanut gallery - please adjust for inflation and/or your local currency.

  • If you're still looking for an answer on this, I published a blog post on this topic earlier today:

    http://www.timmitchell.net/post/2013/08/05/continue-package-execution-after-error-in-ssis/

    Tim Mitchell, Microsoft Data Platform MVP
    Data Warehouse and ETL Consultant
    TimMitchell.net | @Tim_Mitchell | Tyleris.com
    ETL Best Practices
