Portable DTS Packages

  • Comments posted to this topic are about the content posted at http://www.sqlservercentral.com/columnists/cherring/portabledtspackages.asp


    ------------------------------
    The Users are always right - when I'm not wrong!

  • The example given works fine. However, I have experienced issues with using the Dynamic Properties task to set database connections in nested DTS packages. Sometimes it worked, sometimes it didn't. In a complex environment there is also a good chance of missing a connection setting, which could cause large problems. I have found that storing UDLs locally on each machine worked more consistently for connections.

  • In my opinion, a DTS package is not portable unless the server it is moving from can be taken completely out of the picture. This approach actually creates more overhead, because now you have a whole database to copy over to the new server when you move a DTS package, and you have to manually edit the .ini file to remove the old server entirely.

    By his own admission, the author lists among the requirements for his process to be successful: "Be executed from a client machine as well as on the server" and "Does not reference any local files or connections." Yet his whole process relies on the existence of a "Local Connection" within the DTS package that points to a locally mapped drive for the .ini file. So there are two local aspects in his equation, and it therefore does not satisfy his own requirements.


    My blog: SQL Soldier
    SQL Server Best Practices
    Twitter: @SQLSoldier
    My book: Pro SQL Server 2008 Mirroring
    Microsoft Certified Master: SQL Server, Data Platform MVP
    Database Engineer at BlueMountain Capital Management

  • We wrap our packages in what we lovingly call core scripts. They are VBScript wrappers that handle all the variables. We can literally port a package to any server at any time, and the "core script" is smart enough to know where it's being executed from and to change all variables accordingly.

  • I like that much better!! If you execute DTS packages via DTSRun.exe, it is simple to pass in global variables as parameters that are set at run time within the VBS file.
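
    For illustration, a minimal sketch of that approach, assuming a hypothetical package "LoadWarehouse" with string globals gServerName and gDatabase (DTSRun's /A switch overrides a global variable at run time; type ID 8 is a string):

        ' Minimal sketch: shell out to DTSRun and override globals at run time.
        Option Explicit
        Dim sh, cmd
        Set sh = CreateObject("WScript.Shell")
        cmd = "DTSRun /S DEVSERVER /E /N ""LoadWarehouse""" & _
              " /A ""gServerName"":""8""=""PRODSQL01""" & _
              " /A ""gDatabase"":""8""=""Sales"""
        sh.Run cmd, 1, True    ' 1 = normal window, True = wait for completion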


    My blog: SQL Soldier
    SQL Server Best Practices
    Twitter: @SQLSoldier
    My book: Pro SQL Server 2008 Mirroring
    Microsoft Certified Master: SQL Server, Data Platform MVP
    Database Engineer at BlueMountain Capital Management

  • Take a look at the series of posts I'm putting together on my blog about a very similar technique.

     

    --------------------
    Colt 45 - the original point and click interface

  • Maybe I am missing something, but I don't think anything can be called portable unless you can actually move it around. For example, a DTS package can "port" data from one server to another, but the package itself cannot be moved from one server to another (at least not in my experience).

  • The package is moved from one server to another all the time by generating a script from dev, executing it in QA, then executing it in prod. The prod package still has all the settings from dev, but by using VBScript and DTSRun the variables in the package are overwritten at runtime by environment variables (created and passed from the VBScript). So, if you execute the DTS package I create in prod from Enterprise Manager or the SQL scheduler, it will use the defaults from dev and won't be successful. But using Zeke and an app server to schedule and run the VBScript works every time.
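
    For illustration, a bare-bones version of such a core script, with hypothetical package and variable names (256 is the DTSSQLStgFlag_UseTrustedConnection flag):

        ' Bare-bones "core script": load the package from whichever server
        ' we are on, overwrite the dev defaults, then execute it.
        Option Explicit
        Dim net, pkg
        Set net = CreateObject("WScript.Network")
        Set pkg = CreateObject("DTS.Package")

        ' 256 = DTSSQLStgFlag_UseTrustedConnection
        pkg.LoadFromSQLServer net.ComputerName, "", "", 256, "", "", "", "LoadWarehouse"

        ' The script knows where it is running and sets the variables to match.
        pkg.GlobalVariables("gServerName").Value = net.ComputerName
        pkg.GlobalVariables("gDatabase").Value = "Sales"

        pkg.Execute
        pkg.UnInitialize
        Set pkg = Nothing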

  • We run all of our jobs in several environments and still use SQL Server Agent or EM to run them. The key is synchronizing the DTS package IDs across the environments. We have a DTS package that promotes changes by archiving the previous version and then moving the records that store the packages from the msdb database to the target machine. This way the package ID remains the same for all environments and does not affect scheduled jobs or execution of packages from within EM. Instead of managing VBScript, we manage the UDLs and .ini files locally on each machine.

  • Sure it can. I moved more than 100 DTS packages from one production server to another simply by moving the msdb database and then making a few edits to the sysdtspackages table.


    My blog: SQL Soldier
    SQL Server Best Practices
    Twitter: @SQLSoldier
    My book: Pro SQL Server 2008 Mirroring
    Microsoft Certified Master: SQL Server, Data Platform MVP
    Database Engineer at BlueMountain Capital Management

  • Paul, I wish you were my DBA. The VBScripts are required here because our DBAs refuse to touch a package or its .ini files once it's tested and signed off. I appreciate that it removes risk on their part, but it is extra overhead for me.

  • Instead of using a .ini file to hold the server name, I use a .udl file to define the server connection, then pass the path to the .udl file as a parameter when running the DTS package. That way I don't need to mess around with environment variables, which I may not have access to set on the production server. If you use SQL Server security, the user name and password are also stored in the .udl file, so you don't need to worry about them in DTS.
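
    For reference, a .udl file is just a small text file holding an OLE DB connection string; a sketch with made-up server and database names:

        [oledb]
        ; Everything after this line is an OLE DB initstring
        Provider=SQLOLEDB.1;Data Source=PRODSQL01;Initial Catalog=Sales;Integrated Security=SSPI

    With SQL Server security, the initstring would carry User ID, Password, and Persist Security Info=True in place of Integrated Security=SSPI.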

    Also, in an article like this you should have mentioned that table names in Data Pump tasks are recorded as [database].[owner].[tablename]. If you want those tasks to be database-independent you need to use a Dynamic Properties task to change them to [owner].[tablename].
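
    For illustration, the same change can also be made programmatically through the DTS object model rather than a Dynamic Properties task; a sketch assuming a package object pkg loaded as in the core-script example above:

        ' Sketch: strip the [database]. prefix from every Data Pump task's
        ' destination so the package becomes database-independent.
        ' (Assumes no dots inside the bracketed names themselves.)
        Dim task, pump, dest
        For Each task In pkg.Tasks
            If task.CustomTaskID = "DTSDataPumpTask" Then
                Set pump = task.CustomTask
                dest = pump.DestinationObjectName          ' [db].[owner].[table]
                pump.DestinationObjectName = Mid(dest, InStr(dest, ".") + 1)
            End If
        Next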

    Don

  • We had problems with the .ini file. It wouldn't show all the parameters in the .ini file, only the top 15. Does anyone else have the same problem? We ended up putting the parameters in a config database. We also use a UDL file stored on the local machine, so the production logins can be hidden from the dev people.

  • Richard writes:

    We had problems with the .ini file. It wouldn't show all the parameters in the .ini file, only the top 15.

     

    That's good to know. We have a VB script that uses an INI file assumed to have the same name as the package. For deployment, we set only a gINIpath variable (a UNC path) to tell the script where to find the INI. All settings are then in the INI, with no parameters. Typical variables at the moment are the source/target UDL paths, production source, production server, test/prod folder paths, and start/end dates. The packages all determine whether the source UDL connection is "production" and will export to production folders if it is. We store the package and INI in VSS as they should look in production, so we don't have to change anything for a production deployment. It works pretty well. Dynamic Properties make me cringe.
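
    For illustration, a bare-bones reader of the sort such a script needs, since VBScript has no built-in INI support (path, section, and key names are made up):

        ' Bare-bones INI reader: return the value for sKey in sSection.
        Function ReadIni(sPath, sSection, sKey)
            Dim fso, ts, line, inSection
            ReadIni = ""
            inSection = False
            Set fso = CreateObject("Scripting.FileSystemObject")
            Set ts = fso.OpenTextFile(sPath, 1)    ' 1 = ForReading
            Do Until ts.AtEndOfStream
                line = Trim(ts.ReadLine)
                If Left(line, 1) = "[" Then
                    inSection = (LCase(line) = "[" & LCase(sSection) & "]")
                ElseIf inSection And InStr(line, "=") > 0 Then
                    If LCase(Trim(Left(line, InStr(line, "=") - 1))) = LCase(sKey) Then
                        ReadIni = Trim(Mid(line, InStr(line, "=") + 1))
                        Exit Do
                    End If
                End If
            Loop
            ts.Close
        End Function

        ' e.g. sourceUDL = ReadIni(gINIpath & "\LoadWarehouse.ini", "Settings", "SourceUDL")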

    We don't have more than 15 parameters in the INI, but I'll bring it up as a potential risk.

    One would do well to realize that DTS doesn't need to "run on a server". I'll be making a push soon to store our packages as .dts files and run them from a production folder path. That comes right after obliterating our dependency on our Exchange server.
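
    Running a package saved as a structured storage file is then a one-liner with DTSRun's /F switch (the path here is hypothetical):

        DTSRun /F "\\prodserver\dts\LoadWarehouse.dts"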

  • Well, I'm a bit late to the party here, but I have to say "portable DTS packages" is pretty much an oxymoron.

    I'm a former OO developer; we used DTS only to import/export data. The packages were invoked from VB code, and run-time properties were set via the COM interface.

    I'm now doing data warehousing using DTS only. We're not allowed to touch the production server, so we have to develop the packages on a dev box and turn them over to the DBA. Until I came on board and suggested Dynamic Properties, he was practically rewriting the fool things. While the DPs have helped a lot, I still find them a PITA to use, especially through the DTS Designer interface. That's why I typically save the package to a .BAS module, then load it into an Access DB where I add the Dynamic Properties task via a VB proc I wrote.

    Also, the only sure way to test the property mapping is to execute the task and then go look at your objects/properties. It's extremely annoying that when you view the DP task, it doesn't show you which object/property each DP is mapped to (we use .INI files for connection info). And the hard-coded path to the .INI files means we have to map our drives exactly as on the production server so the package can find them. That's why I'm looking at replacing the DP/.INI combination with an ActiveX script that reads an XML file from whatever the DTS file path is and populates global variables.
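
    For illustration, a sketch of that ActiveX Script task, assuming a flat XML file of <name>value</name> elements and a gPackagePath global set at launch (all names made up):

        ' ActiveX Script task: read settings.xml from the package's folder
        ' and copy each element into the matching global variable.
        Function Main()
            Dim xml, node, path
            path = DTSGlobalVariables("gPackagePath").Value & "\settings.xml"

            Set xml = CreateObject("MSXML2.DOMDocument")
            xml.async = False
            If Not xml.Load(path) Then
                Main = DTSTaskExecResult_Failure
                Exit Function
            End If

            ' e.g. <settings><gServerName>PRODSQL01</gServerName>...</settings>
            For Each node In xml.documentElement.childNodes
                If node.nodeType = 1 Then    ' elements only
                    DTSGlobalVariables(node.nodeName).Value = node.text
                End If
            Next
            Main = DTSTaskExecResult_Success
        End Function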

    At least in SQL 2000, I find DTS is way too server- and object-centric to be portable or scalable. That's why in all of my past uses the packages were kept dumb as a rock: no business rules at all were allowed to reside in them. Just firehose the data in and out, let the VB code and stored procedures do the brain work, and use .UDL files for all DB connections.

     
