DTS weirdness

  • I ran across a weird problem today (like many other days)...

    We have an ETL process written in DTS which is fairly complicated and has evolved into a nice example of spaghetti code. In one of the sub packages (a package called by a package called by a package, etc.) there were two tasks (sub packages) that would not get executed. The preceding steps were ActiveX scripts that evaluated to failure, and the workflow from them was "on failure", but the packages never got executed (when running the main package manually, you can see the steps' status is "Not run"). I changed the scripts to evaluate to success and changed the workflow to "on success". Still didn't work. I changed the workflow to "on completion". Still didn't work. Finally, I deleted the two tasks and re-added them EXACTLY THE SAME AS BEFORE (same task name, executable, etc.). Now they work. Go figure.

    Anyone have any clue what might have been happening behind the scenes?
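    If it recurs, one way to see what the package actually has stored for its workflow is to walk it with the DTS object model from a small VBScript (run with cscript). A minimal diagnostic sketch; the server and package names below are placeholders:

        Option Explicit

        Dim oPkg, oStep, oPC
        Set oPkg = CreateObject("DTS.Package")

        ' 256 = DTSSQLStgFlag_UseTrustedConnection; names are placeholders
        oPkg.LoadFromSQLServer "MYSERVER", "", "", 256, "", "", "", "MainPackage"

        ' Dump each step and the precedence constraints it waits on,
        ' to compare against what the Designer shows.
        For Each oStep In oPkg.Steps
            WScript.Echo "Step: " & oStep.Name
            For Each oPC In oStep.PrecedenceConstraints
                ' PrecedenceBasis: 0 = ExecStatus, 1 = ExecResult
                WScript.Echo "  waits on " & oPC.StepName & _
                             " (basis=" & oPC.PrecedenceBasis & _
                             ", value=" & oPC.Value & ")"
            Next
        Next

        oPkg.UnInitialize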

  • Is it possible that DTS packages are cached somewhere? The behavior you experienced is disconcerting, to say the least. Yesterday I had a somewhat similar experience in a DTS package: all workflows are set to "on success"; one of the steps failed; the next five completed successfully even though they shouldn't have run at all; and the last two failed because of precedence failure, even though according to DTS manager the precedent steps completed successfully. This is in a package that has run without error faithfully since I created it... go figure!

    Michael Weiss

  • Could be cached, but it seems unlikely. We have had a problem with some status emails never getting sent out (sending them is the next step after these tasks). This has been in production for over a year, so the server has been rebooted many, many times, which would have cleared up any caching issues.

    I'm rapidly getting more and more disgruntled with SQL Server. DTS isn't reliable. AS only works when it wants to. SQL Mail in SQL2K has been a disaster. I guess the database is OK, but then again I am so busy fixing problems with AS, DTS, and SQL Mail that I wouldn't know.

  • I hear you. I am getting to the point where I am almost afraid to do anything with AS...and documentation is virtually non-existent!

    Michael Weiss

  • Yeah, the doc is really lacking. I love <heavy sarcasm> the description of the AS property settings. Really tells you a lot. We had a consultant come in and tweak our memory and processor settings based on what he had learned from MS. Everything was way overcommitted, to the point of clashing with each other. In the following week I tweaked the settings back down to reasonable values, and you know what: there was not one iota of performance difference.

    I took the 2074 OLAP course recently. It was good for the theory and practical knowledge of how to set up your cubes but the instructor was clueless about anything relating to the actual operation of AS. You could tell he had never seen it outside of a lab course. I find this is where a pool of knowledge is lacking. I post so many odd problems with AS that I think that I'm on the bleeding edge here. I find it hard to believe that my project is all that big or complex.

    Besides here, what other resources do you use or find useful?

  • I have found a problem with packages nested within packages to load AS cubes. I ran all the ETL processes to load my star schema tables, and after that the main package kicked off another package to process the cubes. For some reason the cube package never would execute in a scheduled run; however, if I ran it by itself it ran fine. RESOLUTION: I removed the AS package from the main package and scheduled it to run separately, about an hour after the main package normally finishes, in the event the main package runs a little longer than normal.
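    For what it's worth, rather than guessing at an hour's gap, the cube package could also be kicked off on its own from a few lines of VBScript (e.g. run with cscript as the last step of the same Agent job, so it only starts once the ETL step actually finishes). A minimal sketch; the server and package names are placeholders:

        Option Explicit

        Dim oPkg
        Set oPkg = CreateObject("DTS.Package")

        ' Load the cube-processing package by itself and run it
        ' (256 = DTSSQLStgFlag_UseTrustedConnection; names are placeholders)
        oPkg.LoadFromSQLServer "MYSERVER", "", "", 256, "", "", "", "ProcessCubes"
        oPkg.Execute

        oPkg.UnInitialize
        Set oPkg = Nothing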

    Skip Gallagher
    Quest Industries, Inc.
    (479) 685-4546

  • For Skip,

    I've had problems with cube-building DTS packages called from other scheduled packages, and overcame them by ensuring that all the steps of the cube package (and in fact any ActiveX script) always execute on the main package thread. The Analysis Services Process Cube task is written in VB, and VB components are not free-threaded, so you have to make sure the tasks in your package are set to execute on the main package thread. To do this, right-click each task, select Workflow, then Workflow Properties, go to the Options tab, and under Execution make sure "Execute on main package thread" is checked.
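    If clicking through every task is tedious, the same flag can be set in bulk through the DTS object model. A sketch, assuming a trusted connection; the server and package names are placeholders:

        Option Explicit

        Dim oPkg, oStep
        Set oPkg = CreateObject("DTS.Package")

        ' 256 = DTSSQLStgFlag_UseTrustedConnection; names are placeholders
        oPkg.LoadFromSQLServer "MYSERVER", "", "", 256, "", "", "", "CubeLoadPackage"

        ' Force every step onto the main package thread so tasks built on
        ' apartment-threaded VB components don't run on worker threads.
        For Each oStep In oPkg.Steps
            oStep.ExecuteInMainThread = True
        Next

        ' Save the change back (this creates a new package version).
        oPkg.SaveToSQLServer "MYSERVER", "", "", 256

        oPkg.UnInitialize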

  • I think the more nested your packages get, the greater the possibility of error, AS or not. Basically I am doing the same thing:

    Job A:

    Step1 - Run ETL package

    Step2 - Run Cube reprocessing (full process)

    Step3 - Restore cubes from previous backup (in the event that Step2 fails, which is often)

    One thing, no, TWO things kill me here: 1) Cube processing sporadically fails for no good reason. It will run fine for weeks, then decide to fail. Since everything is nested inside a job and a DTS package, there is no relevant info to go on to determine the reason (see the logging sketch at the end of this post). If I process the cubes manually the next day with AS Manager, no problem. Bouncing the server usually works too. Sounds to me like DTS is full of bugs and/or memory leaks.

    2) Cube processing times waffle around like crazy. We have 3 main cubes, and their processing times range from 23 to 68, 110 to 169, and 111 to 133 minutes respectively. The data grows daily, but the processing times waffle around randomly. How can you effectively manage a critical overnight batch stream with such inconsistencies? It's a struggle to ensure everything is done before the backups kick off, and often it's not done, making the current backup useless!

    End of rant.
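    On the "no relevant info" point: a DTS package can at least be told to log step activity to a file and to note its completion status in the NT event log, which leaves something to read after an overnight failure. A minimal sketch; the server, package name, and path are placeholders:

        Option Explicit

        Dim oPkg
        Set oPkg = CreateObject("DTS.Package")

        ' 256 = DTSSQLStgFlag_UseTrustedConnection; names are placeholders
        oPkg.LoadFromSQLServer "MYSERVER", "", "", 256, "", "", "", "CubeJobPackage"

        ' Step start/finish times and error text go to the file;
        ' overall completion status goes to the NT event log.
        oPkg.LogFileName = "D:\Logs\CubeJobPackage.log"
        oPkg.WriteCompletionStatusToNTEventLog = True

        oPkg.SaveToSQLServer "MYSERVER", "", "", 256
        oPkg.UnInitialize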

  • I've had similar problems when the package has been transferred from another server, like it just didn't see that bit of the package.

  • Last week I ran into a similar scenario. It seemed as if the packages were not being evaluated as a whole and run in the correct order; although there were only 8 steps throughout, DTS did not seem able to figure out where to begin. I ended up deleting all of the packages and re-entering the critical path of the job, then added secondary jobs and alternate paths.

    It now runs beginning at step 1 and continues on through. This is not based on any knowledge on my part; it was just a long shot.

  • Not sure if this is the case, but any DTS task without a precedence constraint will start immediately, so it's entirely possible for multiple tasks to run concurrently. Now, if they were all daisy-chained along a series of precedence constraints and still ran immediately, that's a serious problem (as was mine, with jobs never executing).

    I'm not a big fan of multitasking (or concurrent processing) in DTS jobs; I prefer to make the DTS job run everything sequentially, so I line the tasks up with a series of "on success" or "on completion" workflows. That makes the package easier to follow and debug. I also like to put an ActiveX script named START at the beginning with just a

    Main = DTSTaskExecResult_Success

    to kick off the stream. Makes it easy to spot where the DTS stream starts.
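    The complete START script is just the standard ActiveX shell around that one line:

        Function Main()
            ' No real work here; return success so the
            ' "on success" chain kicks off at this step.
            Main = DTSTaskExecResult_Success
        End Function

    Every DTS ActiveX script task follows this Function Main() shape, and the return value is what drives the workflow.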
