SSIS for Multiple Environments

  • Comments posted to this topic are about the item SSIS for Multiple Environments

    Cheers
    http://twitter.com/widba
    http://widba.blogspot.com/

  • This topic definitely needs more attention. It wasn't (and isn't) a pleasant process, so great job on laying down some ground rules.

    Being able to promote smoothly through environments (four, in our case) was high on the list for our newest project. I can't say it was pretty; in the end we went with XML config files with relative paths. What's good is that it works, and we never touch anything but the configurations once a package starts moving through the promotion process.

    I can really see the attraction of database configurations, but I can never get comfortable with environment variables... maybe it's just me.

    What I have yet to see is a good way (hack or not) of securely maintaining credentials, etc., for connection managers that don't authenticate with Active Directory... e.g., Oracle.

  • Interesting article. I have actually implemented something similar in my environment with SSIS 2005.

    In my case, my enterprise job scheduler executes my NT batch files, which use the DTSRUN command to call my SSIS packages. Instead of using a configuration database as the article explained, I used the DTSConfig file exclusively.

    Another way to reference the DTSConfig regardless of the environment is via an environment variable. That variable can be created directly in your batch file and will remain in scope for the duration of the SSIS package's execution. This is a clean way to have your batch script accept a parameter (like your PROD, QA, or DEV DTSConfig file), pass it to the environment variable, and have it directly change connection strings, SSIS variables, etc.
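    A minimal sketch of that pattern, assuming the package uses an indirect XML configuration that reads a (hypothetical) SSIS_CONFIG environment variable; the variable name and all paths here are made up for illustration:

        @echo off
        rem %1 selects the environment: DEV, QA, or PROD.
        rem SSIS_CONFIG is assumed to be the environment variable the package's
        rem indirect XML configuration reads to locate its .dtsConfig file.
        set SSIS_CONFIG=D:\SSIS\Config\%1\LoadWarehouse.dtsConfig

        rem The variable exists only in this cmd session, so it stays in scope
        rem for the package run without changing the machine's settings.
        dtexec /File "D:\SSIS\Packages\LoadWarehouse.dtsx"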

    Please contact me if you want an example.

  • Hi,

    You say if I get promoted I can claim the idea as my own, but what do I do if it gets me fired?

    I'm joking of course, I like the approach a lot.

    Thanks

    Olly

  • Something that is not very clear for me on a first read:

    Do you have one configuration table for all the environments, or does each environment have its own config table?

    If it is the latter, why use a .dtsconfig file at all?

    In my set-up (each environment has its own config table), the environment variable points directly to the config table.

    Another question: why don't you store the packages in MSDB? Do you have specific reasons for that?

    And to end my reply: I think a nice addition to the article would be an explanation of how to configure your packages using the configuration table, e.g., using the package configuration wizard in BIDS (of which you have one screenshot).

    Need an answer? No, you need a question
    My blog at https://sqlkover.com.
    MCSE Business Intelligence - Microsoft Data Platform MVP

  • We have found that DTSConfig files work great for storing the database parameters: you don't move your application from one database server to another very often, except during upgrades and lease refreshes, so that file remains fairly static per environment.

    We use the SSIS Configurations table in SQL Server to store all other "variable" data, such as flat-file paths, which gives us the ability to manage the values programmatically, or via SSIS, to move changes up into test and prod from dev. Plus, it's very easy to write an update query to change your paths with a few well-placed CHARINDEX and SUBSTRING or REPLACE statements (see the sketch below).
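    A minimal sketch of that kind of update, assuming the default [SSIS Configurations] table generated by the package configuration wizard (the UNC paths are hypothetical):

        -- Repoint every flat-file path from the dev share to the test share.
        -- Table and column names are the package-configuration defaults;
        -- the paths themselves are made up for illustration.
        UPDATE dbo.[SSIS Configurations]
        SET    ConfiguredValue = REPLACE(ConfiguredValue,
                                         '\\devserver\etl\files\',
                                         '\\testserver\etl\files\')
        WHERE  ConfiguredValue LIKE '\\devserver\etl\files\%';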

    We also call all our jobs via a production scheduler, using dtexec and passing the config file on the command line.
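    A minimal sketch of such a scheduler command line (both paths are hypothetical):

        rem The .dtsx file is what gets promoted between environments;
        rem the per-environment .dtsConfig stays put on each server.
        dtexec /File "E:\ETL\Packages\NightlyLoad.dtsx" /ConfigFile "E:\ETL\Config\NightlyLoad.dtsConfig"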

    This also means you only need control over dev and test; when we need something promoted to production, we can just ask the DBA team to copy the .dtsx file from test to prod and leave the dtsconfig file alone.

  • "do you have one configuration table for all the environments or has each environment its own config table?"

    We maintain one configuration table per ETL server (because each physical machine can have only one environment variable - that has the config information in it)

    "If it is the latter, why use a .dtsconfig file at all?"

    That portion is mostly useful for developers to easily repoint their configurations to a different table or server for testing if need be.

    "Another question: why don't you store the packages in MSDB? Do you have specific reasons for that?"

    My preference is the file system - it just lays out nicely for the process we use.

    This article probably moved me to this method - http://bi-polar23.blogspot.com/2008/06/ssis-and-sql-server-agent-choosing.html

    "And to end my reply: I think a nice addition to the article would be an explanation of how to configure your packages using the configuration table, e.g., using the package configuration wizard in BIDS (of which you have one screenshot)."

    (Noted - my thinking was that this was covered pretty well elsewhere, but maybe not.)

    Cheers
    http://twitter.com/widba
    http://widba.blogspot.com/

  • Alright. Thanks for the reply and the link!

    Need an answer? No, you need a question
    My blog at https://sqlkover.com.
    MCSE Business Intelligence - Microsoft Data Platform MVP

  • This is a really interesting article that demonstrates the power of thinking ahead and planning before jumping into the work. I expect I'll read it a few times to make sure I get it completely, and then set up something quite similar in my environment.

    Thanks.

  • I do like the approach of setting an environment variable in the scope of the batch. Our relative-path method for configs and logs works great for a one-time file copy/FTP to a folder; no other intervention is necessary beyond setting up the job.

    What I also find interesting is how many people are using DTSRun in a CmdExec. Like WI-DBA, the same bi-polar article helped in making the case. One issue I have with DTSRun is that there is no way to send it a kill signal remotely - anyone have any experience with that?

  • We use dtexec, which, when run in the context of a scheduler, will respond to a kill statement.

    If we kill the job in the scheduler, it kills the SSIS package at whatever point it was processing within about 3-4 seconds.

  • Why store the SSIS packages at the file system level instead of on the server? Storing packages in the file system is a security problem. A better way would be to store the packages on the server as well, and keep all metadata-related information in a table; this would give you a more stable package to maintain.

  • Actually, I reviewed my NT script and I do use DTEXEC instead of DTSRUN (I must have had DTS packages on my mind!).

    Another great thing about using an NT batch script is that you can evaluate the ERRORLEVEL code from the DTEXEC command. This is a great way to know reliably whether your package succeeded, and it also lets you pass the exit code along to your job scheduler (e.g., the IBM Tivoli job scheduler) so it knows whether the job was successful.
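    A minimal sketch of that ERRORLEVEL check (the package path is hypothetical):

        @echo off
        dtexec /File "D:\SSIS\Packages\LoadWarehouse.dtsx"

        rem dtexec returns 0 on success; any non-zero code indicates a problem.
        rem Re-raise the code so the job scheduler sees the real result.
        if %ERRORLEVEL% neq 0 (
            echo Package failed with exit code %ERRORLEVEL%
            exit /b %ERRORLEVEL%
        )
        echo Package succeeded
        exit /b 0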

    You can find out all about the DTEXEC command in BOL.

  • <ryan>

    Interesting, that's how we'd like ours to behave.

    We use SQL Agent as our job scheduler. Stopping the job step leaves a dtexec thread running the package, and it won't stop until killed by PID. What job scheduler do you use?

    </ryan>

    As for storing packages in the file system: 1) it eases our deployments, and 2) you can't use relative paths in MSDB. Security needs to be taken care of with folder permissions, etc. One disadvantage is that the packages aren't backed up as part of the msdb backups; we rely instead on our storage backups in the data center, which we are OK with.

  • zarathustra >>> We use CA AutoSys.

