A Custom Execution Method – Level 19 of the Stairway to Integration Services

  • Comments posted to this topic are about the item A Custom Execution Method – Level 19 of the Stairway to Integration Services

    Andy Leonard, Chief Data Engineer, Enterprise Data & Analytics

  • Nice article, Andy. I have recently written a stored proc which looks very like yours.

    The differences:

    1) It lives in a separate library-type database (I want it clean and fully source-controlled)

    2) It uses sp_executeSQL to execute procs in SSISDB

    (To avoid cross-database warnings in my database project. SSISDB is, no doubt for good reason, classed as a 'user' database and cannot therefore be added as a system database reference in a database project. I don't want to add SSISDB to source control, so we came up with this workaround.)

    3) It runs the package in SYNCHRONIZED mode, then grabs the execution status from catalog.executions and THROWs an error if the status is not 'Succeeded' (7); a sketch of the pattern follows below.

    That's because I want the proc to fail if the package fails. Running the package asynchronously does not capture that.

    This is important to us because the proc will be executing multiple packages to run ETL jobs.
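
    Roughly, the pattern looks like this. The proc and parameter names are illustrative, not my production code, but the catalog calls (create_execution, set_execution_parameter_value, start_execution) are the standard SSISDB API:

        CREATE PROCEDURE dbo.ExecuteCatalogPackage
            @FolderName  NVARCHAR(128),
            @ProjectName NVARCHAR(128),
            @PackageName NVARCHAR(260)
        AS
        BEGIN
            SET NOCOUNT ON;

            DECLARE @ExecutionId BIGINT,
                    @Status      INT;

            -- Every SSISDB call goes through sp_executeSQL, so the database
            -- project never holds a direct cross-database reference to SSISDB
            EXEC sys.sp_executesql
                N'EXEC SSISDB.catalog.create_execution
                      @folder_name     = @FolderName,
                      @project_name    = @ProjectName,
                      @package_name    = @PackageName,
                      @use32bitruntime = 0,
                      @execution_id    = @ExecutionId OUTPUT;',
                N'@FolderName NVARCHAR(128), @ProjectName NVARCHAR(128),
                  @PackageName NVARCHAR(260), @ExecutionId BIGINT OUTPUT',
                @FolderName, @ProjectName, @PackageName, @ExecutionId OUTPUT;

            -- SYNCHRONIZED = 1 makes start_execution block until the
            -- package finishes instead of returning straight away
            EXEC sys.sp_executesql
                N'EXEC SSISDB.catalog.set_execution_parameter_value
                      @execution_id    = @ExecutionId,
                      @object_type     = 50,
                      @parameter_name  = N''SYNCHRONIZED'',
                      @parameter_value = 1;
                  EXEC SSISDB.catalog.start_execution @execution_id = @ExecutionId;',
                N'@ExecutionId BIGINT',
                @ExecutionId;

            -- 7 = Succeeded in catalog.executions
            EXEC sys.sp_executesql
                N'SELECT @Status = [status]
                  FROM SSISDB.catalog.executions
                  WHERE execution_id = @ExecutionId;',
                N'@ExecutionId BIGINT, @Status INT OUTPUT',
                @ExecutionId, @Status OUTPUT;

            IF @Status <> 7
                THROW 50001, N'SSIS package execution did not succeed.', 1;
        END;

    Without SYNCHRONIZED, start_execution returns immediately, so the proc would report success no matter what the package went on to do.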


  • Phil, that sounds like an interesting workaround. It may be possible to edit the SSISDB files so that it shows as a system database, but since it normally appears as a "User" database, I don't know that this would be ideal. I like the idea of having a separate project to store these, and I can see the usefulness of running in Synchronized mode for much of this.

    I appreciate the original article as well - much food for thought there and some good ideas for future development efforts.

  • Good article, Andy. My only nitpick is that while you can't use foreign keys to enforce referential integrity when using an additional database, technically you could create triggers in the additional database to partially enforce RI. You couldn't fully enforce RI with triggers without also adding triggers to the SSISDB database. I'm not recommending you do this, just making the comment.
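
    For illustration only, a trigger along these lines (the table and column names are made up) would enforce the child side of the relationship:

        -- Hypothetical names: dbo.PackageRunLog is a logging table in the
        -- additional database whose ExecutionId should match
        -- SSISDB.catalog.executions.execution_id
        CREATE TRIGGER dbo.trg_PackageRunLog_CheckExecution
        ON dbo.PackageRunLog
        AFTER INSERT, UPDATE
        AS
        BEGIN
            SET NOCOUNT ON;

            -- Reject any new or changed row that points at a
            -- nonexistent execution
            IF EXISTS (SELECT 1
                       FROM inserted AS i
                       WHERE NOT EXISTS (SELECT 1
                                         FROM SSISDB.catalog.executions AS e
                                         WHERE e.execution_id = i.ExecutionId))
                THROW 50002, N'ExecutionId not found in SSISDB.catalog.executions.', 1;
        END;

    That only polices the child side, though; a retention cleanup that deletes rows from catalog.executions would still orphan the log rows, and closing that gap would mean putting triggers in SSISDB itself, which is exactly the part I would avoid.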

  • Hi Andy,

    I too have chafed at the restriction against executing a package from another project; like everyone, I have a small number of truly universal tasks in my framework, and packages that perform those tasks: logging, compression, encryption, etc.

    My solution is a little different from yours: these packages don't exist in a project or the catalog at all. Instead, they sit in the file system, in a child folder of the Resources location that holds other supporting files such as binaries.

    Sure, the Execute Package Task requires a little more configuration, and the external packages themselves require a little extra work, primarily to pick up passed parameters properly, but this approach has worked extremely well for me.

    It doesn’t add anything to the complexity of scripted deployment since, again, there are always other external dependencies that are deployed to the file system.

    My framework projects have a configurable value that points them to the overall Resources location, but each external resource package is allowed its own override, whether at the folder level or right down to the package name. This is nice because, although the external tasks change very rarely, each can be versioned on its own and, if necessary, the projects can upgrade to new resources gradually. (I have never actually needed that much granular upgrading, but always thought I might, so I built it in.)

    What do you think?

    >L<
