SSIS - Track Execution Time of a Data Flow Task

  • Hi Everyone,

    I have this requirement; can anyone give me an idea on how to implement it?

    I have a flat file of around 500 MB that is loaded daily into a SQL Server destination. This load is considered business-critical, and sometimes it hangs because other people are using the same table.

    My requirement is to send an email to specified people when the task is in progress and has taken more than 10 minutes, so that everyone stops using that table.

  • Maybe you can add a Send Email Task before and after the data flow task?


  • Hi Koen,

    thanks for your reply.

    I need to send an email only if the data load (DFT) takes more than 10 minutes (or a specified interval) and is still executing, not every time it runs. Any thoughts?

    Maybe you can monitor the log? Or kick off a monitoring script in parallel with your data flow.

    You insert a row into a log table before and after the data flow. The monitoring script sleeps for 10 minutes, then checks whether the second row has been inserted. If not, an email is sent (see the sketch below).

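    A minimal sketch of the logging half, just to make the idea concrete. The table name dbo.LoadLog and its columns are placeholders I've made up, not anything standard:

        -- Hypothetical log table written to by two Execute SQL Tasks around the data flow.
        CREATE TABLE dbo.LoadLog
        (
            LoadDate  date         NOT NULL,
            EventType varchar(10)  NOT NULL,                      -- 'Start' or 'End'
            LoggedAt  datetime2(0) NOT NULL DEFAULT SYSDATETIME()
        );

        -- Execute SQL Task placed just before the data flow task:
        INSERT INTO dbo.LoadLog (LoadDate, EventType)
        VALUES (CAST(GETDATE() AS date), 'Start');

        -- Execute SQL Task placed just after the data flow task:
        INSERT INTO dbo.LoadLog (LoadDate, EventType)
        VALUES (CAST(GETDATE() AS date), 'End');

    The monitoring piece (sleep, check for the 'End' row, mail if it is missing) then only needs to query this table.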

  • Can you change the timing of the job, so that it runs when no one is using the table?

    I used WAITFOR to achieve the same thing instead of a script (roughly as sketched below).

    Thanks for your idea.
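
    For reference, here is roughly what that WAITFOR approach could look like as a single batch running in parallel with the load. The dbo.LoadLog table, the Database Mail profile name, and the recipients below are all placeholders/assumptions:

        -- Runs alongside the data flow, e.g. in a second Execute SQL Task or an Agent job.
        WAITFOR DELAY '00:10:00';   -- sleep for 10 minutes

        IF NOT EXISTS (SELECT 1
                       FROM dbo.LoadLog
                       WHERE LoadDate  = CAST(GETDATE() AS date)
                         AND EventType = 'End')
        BEGIN
            -- Load still running after 10 minutes: warn the users.
            EXEC msdb.dbo.sp_send_dbmail
                 @profile_name = 'DefaultProfile',           -- placeholder Database Mail profile
                 @recipients   = 'dba-team@example.com',     -- placeholder distribution list
                 @subject      = 'Daily flat file load is still running after 10 minutes',
                 @body         = 'Please stop querying the destination table until the load completes.';
        END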

  • What about kicking off a second DFT with a timer in a script? The package with the large file sets a variable on completion (or whenever your time threshold is passed); that variable would either keep the timed package from completing, bypass the email and simply do nothing, or trigger the notification to your users.

  • Koen Verbeeck (2/17/2014)


    Maybe you can monitor the log? Or kick off a monitoring script in parallel with your data flow.

    You insert a row into a log table before and after the data flow. The monitoring script sleeps for 10 minutes, then checks whether the second row has been inserted. If not, an email is sent.

    Nice idea!


  • If the table is re-loaded every day, you could consider view swapping, schema switching, table partitioning, or renaming tables: you load into a table the users can't see, and once the load is done, you swap the tables. That way you should never have to send an email.
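
    A minimal sketch of the rename-table variant (dbo.DailyLoad and dbo.DailyLoad_Staging are made-up names; the same idea works with ALTER SCHEMA ... TRANSFER or partition switching):

        -- The package loads dbo.DailyLoad_Staging while users keep reading dbo.DailyLoad.
        -- Once the load succeeds, swap the two names in one short transaction.
        BEGIN TRAN;
            EXEC sp_rename 'dbo.DailyLoad',         'DailyLoad_Old';
            EXEC sp_rename 'dbo.DailyLoad_Staging', 'DailyLoad';
            EXEC sp_rename 'dbo.DailyLoad_Old',     'DailyLoad_Staging';
        COMMIT;

    The users only ever see a fully loaded table, so the blocking (and the email) goes away.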

