TSQL vs. ActiveX

  • I have been using SQL Server for a number of years, since SQL Server 6.5.  I have passed two of my SQL Server tests and three Windows tests, and have to take one more networking test to get my MCDBA.

    Having said that, my experience with DTS packages has always been using stored procedures in my steps rather than ActiveX scripts.

    I'm now at a new company, and A LOT of their processing is done in ActiveX scripting (getting variables, sending mail, parsing data values, etc.).  I've always used T-SQL in stored procedures.

    Could someone share a view on building DTS packages with heavy logic in ActiveX scripting vs. heavy logic in stored procedures?  Is there a resource to help compare the two methodologies?  Is there a performance impact to using one instead of the other?  We are using SQL Server 2000.

    Thanks,

     

    Tony

    Things will work out.  Get back up, change some parameters and recode.

  • The answer, as in many other cases, is "It depends."  In general, if you can find a set-based way to do things in T-SQL, it will probably be the best approach.  On the other hand, if the task consists of a lot of procedural logic that is difficult to express in T-SQL, then ActiveX may be the best technique.

    There is a lot of overhead to running an ActiveX script in a DTS package.  For an ActiveX Script task that runs once and does a lot of work this is insignificant, but transformation scripts in a data pump task must be instantiated over and over, once for every row.  If you have simple column validation scripts in a data flow that could be replaced by ISNULL, CASE, or other functions in the source SQL, then get rid of as many of the scripts as you can.  If the scripts have very complicated behavior, such as multiphase data pump scripting, it could be very difficult to duplicate in T-SQL.
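
    For example, a data pump source query along these lines (the table and column names here are made up purely for illustration) can default NULLs and decode a status flag with no per-row script at all:

        -- Hypothetical staging table; the logic runs once in the engine
        -- instead of once per row in the script host.
        SELECT  CustomerID,
                ISNULL(Region, 'Unknown') AS Region,
                CASE Status
                    WHEN 'A' THEN 'Active'
                    WHEN 'I' THEN 'Inactive'
                    ELSE 'Unknown'
                END AS StatusDescription
        FROM    dbo.CustomerStage

    Anything the transformation script was doing column by column that can be expressed this way is one less ActiveX instantiation per row.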

    Stored procedures will run on the server where the data is usually located (ignoring linked servers for the moment), while ActiveX scripts execute wherever the package is run.  If the DTS package is executed on a client rather than the server there will be extra network traffic to move the data.  Usually this is a bad thing, but if there is really "heavy" logic involved, to the point that it bogs down the server, it can be an advantage to have it running on the client.

    There could be deployment issues.  A DTS package full of ActiveX scripts is self-contained and can easily be moved around by changing connections, but if it relies on stored procedures then extra effort is required to keep the package and procs in sync during moves.  This is less of an issue if the T-SQL logic is all in the package instead of using procedures.

    ActiveX scripts in DTS packages can't be migrated to SSIS when you move to SQL 2005.  The old packages can still be run and maintained on SQL 2005 for now, but at some point they will become dead ends.  Execute SQL tasks in DTS packages, on the other hand, migrate very easily to SSIS.

    There could be institutional culture issues.  Is there a disparity in the skillset of the developers?  Are they mostly VB experts who are weak in T-SQL and would find it hard to understand or maintain the stored procedures if you rewrote some of the packages?  Is there a bias towards ActiveX because they haven't had a good T-SQL evangelist show them the light, or does someone have an active dislike for T-SQL?  Do they make a conscious decision to use ActiveX every time, or are their packages all descended from an uber-package once written with ActiveX that they keep extending?  If they think ActiveX is easier to use, how much faster would a T-SQL solution have to be to tempt them to change?

    I'm sorry I can't give you a simpler answer but hopefully this gives you something to work with.  Not knowing your environment makes it difficult to be more precise.  I think I personally would try to use T-SQL wherever I could and resort to ActiveX only when necessary.  If you expect a lot of resistance, choose your battles and substitute T-SQL in places where you think it will be an obvious winner.

  • Agreed, "it depends".

    I maintain a system that is heavy on ActiveX scripting & DTS packages, developed by consultants who didn't know T-SQL and built the system with a beginner's DTS book open at all times.

    Based on that experience, I built & maintain a data warehouse using DTS & ActiveX as little as possible.

    Issues with DTS/ActiveX:

    - Version control. How do you version a DTS package, and how do you "diff" a DTS package to determine what changed between versions? With ETL in T-SQL procedures, the T-SQL source can be checked into VSS and versions can be easily tracked & diff'ed.

    - Deployment. Extreme caution must be taken with DTS to ensure that every path, filename, and server connection property is set dynamically at runtime (self-modifying packages). The DTS system I maintained was initially built in such a way that the first attempted move from a DEV server to a QA server was disastrous, with hard-coded connections, hard-coded pathnames on drives that didn't exist on the QA server, etc.

    - Performance/locking. DTS data pumps are row-by-row, and those rows are handled by interpreted ActiveX script. They suck compared to T-SQL set-based performance. For non-SQL data sources, I will pull the data into a SQL staging table as quickly as possible, usually with a linked server or OpenRowset(), and use T-SQL downstream (a sketch of that pattern follows below). I've also seen some really boneheaded DTS ActiveX data pump solutions, where the data pump blocks itself by referencing the same table on both the source & destination sides, and the consultants weren't aware that the data pump opens 2 physically separate SQL connections that can block each other.
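
    Something like this, for example, for an Excel file coming in through the Jet provider (the provider string, file path, sheet name, and table names here are only placeholders, not anything from a real package):

        -- Pull the external data straight into a staging table
        -- (SELECT INTO creates dbo.Staging_Orders on the fly).
        SELECT  *
        INTO    dbo.Staging_Orders
        FROM    OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                           'Excel 8.0;Database=C:\ImportFiles\Orders.xls',
                           'SELECT * FROM [Orders$]')

        -- Everything downstream is then plain set-based T-SQL,
        -- assuming the sheet carries these columns.
        INSERT INTO dbo.Orders (OrderID, CustomerID, OrderDate)
        SELECT  OrderID, CustomerID, OrderDate
        FROM    dbo.Staging_Orders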

    My experience to date is that a developer who trends towards using a DTS ActiveX transform to move data between 2 SQL tables is probably coming from a procedural programming background with no grasp of SQL set-based operations, i.e. the same mind-set that writes T-SQL cursors to loop through recordsets because the concept of joins hasn't been grasped.
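
    To make the contrast concrete, here is a contrived example of the two mind-sets side by side (the tables and columns are invented):

        -- Procedural mind-set: a cursor touching one row at a time
        DECLARE @OrderID int, @CustomerID int
        DECLARE order_cur CURSOR FOR
            SELECT OrderID, CustomerID FROM dbo.Orders
        OPEN order_cur
        FETCH NEXT FROM order_cur INTO @OrderID, @CustomerID
        WHILE @@FETCH_STATUS = 0
        BEGIN
            UPDATE dbo.Orders
            SET    Region = (SELECT Region FROM dbo.Customers
                             WHERE CustomerID = @CustomerID)
            WHERE  OrderID = @OrderID
            FETCH NEXT FROM order_cur INTO @OrderID, @CustomerID
        END
        CLOSE order_cur
        DEALLOCATE order_cur

        -- Set-based mind-set: the same result in one join
        UPDATE o
        SET    o.Region = c.Region
        FROM   dbo.Orders o
        JOIN   dbo.Customers c ON c.CustomerID = o.CustomerID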

     

  • "referencing the same table on both the source & destination sides"

    As they say on http://www.thedailywtf.com, "My eyes!  The goggles, they do nothing!".  I guess I hadn't considered the possibility of that level of incompetence, and assumed there was some reason for the existence of these DTS packages in the first place.

    If you can easily replace their DTS packages (or at least the data pump tasks) with UPDATE or INSERT/SELECT operations, you owe it to mankind to do so.
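
    For instance, a data pump whose only job is to append new rows from a staging table into a destination table on the same server collapses to a single statement on one connection (hypothetical table names), so there is nothing left to block itself:

        -- Copy rows from the staging table that aren't already in the
        -- destination, in one set-based statement.
        INSERT INTO dbo.Sales (SaleID, CustomerID, Amount)
        SELECT  s.SaleID, s.CustomerID, s.Amount
        FROM    dbo.SalesStage AS s
        WHERE   NOT EXISTS (SELECT 1 FROM dbo.Sales AS t
                            WHERE t.SaleID = s.SaleID)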

  • Scott & PW,

    Thanks for the replies.  That is what I was looking for.  It seems to me that, where practical, you use T-SQL.  I can think of times where ActiveX would be OK.  I like Scott's point about the DTS package being self-contained because it includes the ActiveX scripting, as opposed to the stored procedures being separate.

    Based on your responses and my experience, I will continue to program and put logic in stored procedures when I can, using ActiveX scripting only to supplement my stored procedures.

    Thanks again for the replies. 

     

    Tony

     

    Things will work out.  Get back up, change some parameters and recode.
