Ideas welcome on SSIS jobs, Agent and kicking off another SSIS job

  • Hi.

    I'm looking for some ideas. Currently I have 3 SSIS packages which are scheduled in SQL Server Agent. They run every 10 minutes and update the DW. The first step in all 3 of them is to check whether the latest flat file is in a directory: if not, quit; else load the data into the DW. Easy.
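
    For illustration, an equivalent "quit if the file isn't there" guard in plain T-SQL could use the undocumented xp_fileexist extended procedure (the path below is made up; in my case the check lives inside the SSIS packages themselves):

      DECLARE @file_exists INT;
      -- xp_fileexist is undocumented but widely used; it returns 1 in the
      -- OUTPUT argument when the file is present.
      EXEC master.dbo.xp_fileexist N'\\fileserver\feeds\latest.dat', @file_exists OUTPUT;
      IF @file_exists = 0
          RETURN;   -- no new file yet; quit and wait for the next 10-minute run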

    All 3 packages run every 10 minutes: the first on every 10th minute, the second on the 11th, and the third on the 12th.

    The packages can execute in a few seconds for an empty file, in a few minutes for daily data, and at the start of the month they can even take a few hours. All 3 packages run in parallel.

    The requirement now is for me to provide data to a validation system which will be used when the mainframe is down. The table I'll populate is on a different server, but the table itself will require data from all 3 tables that are updated via the 3 packages.

    So I need to make sure of the following:

    1) That the 3 packages have finished successfully.

    2) That there is new data to be inserted into the validation table.

    What would be the best way to achieve this?

  • I would create a SQL Agent job whose first 3 steps run your three SSIS packages (I think the sp is sp_start_job, though note that it starts a job asynchronously; adding the packages directly as SSIS job steps keeps the chaining simple) and chain them so that the next step is kicked off only if the previous one succeeds.
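
    As a rough sketch (the job name and package paths are made up, and you'd still need sp_add_jobserver and a schedule to finish the job off):

      USE msdb;
      GO
      EXEC dbo.sp_add_job @job_name = N'DW load and validate';   -- hypothetical name

      -- One step per package. @on_success_action = 3 means "go to the next step";
      -- @on_fail_action = 2 means "quit the job reporting failure".
      EXEC dbo.sp_add_jobstep
          @job_name          = N'DW load and validate',
          @step_name         = N'Run package 1',
          @subsystem         = N'SSIS',
          @command           = N'/FILE "D:\Packages\Package1.dtsx"',   -- hypothetical path
          @on_success_action = 3,
          @on_fail_action    = 2;
      -- ...repeat for packages 2 and 3, then add the table-update step as
      -- step 4 with @on_success_action = 1 (quit reporting success).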

    Then have another step where you take data from all three tables and update the required table. Now, depending on where your destination table is (same DB, or another DB on another server), you can do that in a stored procedure or as a separate SSIS package; see the sketch after the next paragraph.

    Configure this step as the last step in your job. Again, it should be kicked off only if the first three steps have succeeded.
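
    A minimal sketch of that last step in plain T-SQL (the linked server VALSRV and all table/column names are invented); the NOT EXISTS guard also takes care of your point 2, i.e. only genuinely new rows get inserted:

      -- Final job step: push newly loaded rows to the validation table.
      INSERT INTO VALSRV.ValidationDB.dbo.ValidationTable (AccountKey, ColA, ColB, ColC)
      SELECT t1.AccountKey, t1.ColA, t2.ColB, t3.ColC
      FROM dbo.Table1 AS t1
      JOIN dbo.Table2 AS t2 ON t2.AccountKey = t1.AccountKey
      JOIN dbo.Table3 AS t3 ON t3.AccountKey = t1.AccountKey
      WHERE NOT EXISTS (         -- only rows not already in the validation table
          SELECT 1
          FROM VALSRV.ValidationDB.dbo.ValidationTable AS v
          WHERE v.AccountKey = t1.AccountKey
      );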

    This will ensure that you update/insert your destination table only once the three SSIS packages have indeed succeeded.

    Hope this helps.

  • Yes, I agree with n79799.

    Create a SQL job with four steps: the first three run the SSIS packages and the fourth runs the command to update the table. Make sure to set the option so that each step runs only when the previous one has run successfully.

    Thanks,

    Puneet

  • Thanks guys. I've implemented the packages as one big job, and each step waits for the previous one to complete. I don't have any more locks, but now I find I have a problem with the time it takes to load the data itself.

    I need to load the data and also export part of it again to downstream databases. I'm running into problems with it, but I'm going through all the loading steps and checking how to speed up the loading, i.e. disabling indexes, using bulk loads, changing updates and inserts to merges, etc.
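
    For example, two of those sketched in T-SQL with made-up object names: disabling a nonclustered index around a bulk load, and replacing an UPDATE-then-INSERT pass with one MERGE:

      -- Disable a nonclustered index during the load, rebuild it afterwards.
      ALTER INDEX IX_Fact_LoadDate ON dbo.FactLoad DISABLE;

      BULK INSERT dbo.FactLoad
      FROM 'D:\Feeds\daily.dat'
      WITH (TABLOCK, BATCHSIZE = 50000);   -- TABLOCK helps qualify for minimal logging

      ALTER INDEX IX_Fact_LoadDate ON dbo.FactLoad REBUILD;

      -- One MERGE instead of a separate UPDATE plus INSERT.
      MERGE dbo.FactLoad AS tgt
      USING dbo.Staging AS src
          ON tgt.AccountKey = src.AccountKey
      WHEN MATCHED THEN
          UPDATE SET tgt.Amount = src.Amount
      WHEN NOT MATCHED THEN
          INSERT (AccountKey, Amount) VALUES (src.AccountKey, src.Amount);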

    I guess I can't have my cake and eat it too...
