Case Study: Importing New Data Without Emptying Existing Tables

  • Comments posted to this topic are about the content posted at http://www.sqlservercentral.com/columnists/rmarda/importingnewdatawithoutemptyingexistingtables.asp

    Robert W. Marda
    Billing and OSS Specialist - SQL Programmer
    MCL Systems

  • As I see it, some of your choices are not the most straightforward ones. What was your reasoning for choosing this implementation?

    Why are you using separate 'dummy' tasks to select the type of update to run (full or daily)?

    Would a package 'Global Variable' not be a simpler alternative? In a script task you set the variable to the correct value and return success / failure depending on the type. Using the workflow, you start the appropriate tasks on success / on failure. (You could even build a cascaded check, to have more control over real error handling.)
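    Roughly, the script task I have in mind would look something like this (untested; the 'FullUpdate' variable name and the run-full-on-Sunday rule are only placeholders for however you actually decide between a full and a daily run):

        ' ActiveX Script task (VBScript) that could replace the 'dummy' tasks.
        ' 'FullUpdate' is a package global variable defined on the package's
        ' Global Variables tab.
        Function Main()
            If Weekday(Date) = vbSunday Then
                DTSGlobalVariables("FullUpdate").Value = True
                ' On Success workflow lines start the full-update branch
                Main = DTSTaskExecResult_Success
            Else
                DTSGlobalVariables("FullUpdate").Value = False
                ' On Failure workflow lines start the daily-update branch
                Main = DTSTaskExecResult_Failure
            End If
        End Function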

    Why did you build a separate table with start and end times of the job?

    For reporting a real failure, I would just use the 'Send Mail' task on failure. If you build more than one task, you could even include the type of failure in your mail.

    Of course, that table can be used as a kind of history on the job for reporting.

  • I'm not sure I agree with all the choices made either, but it was nicely written and it appears that it works well for you to solve the problem.

    I worked in a similar situation where we received tons of daily feeds from different vendors and had to import them. We too used a separate table to log the actions, mainly because we could then track an import as it occurred. Some of these imports would copy data from the temp tables to real tables (massaging it along the way). Doing millions of rows could be slow, especially in a transaction, so we ran groups of 100 or so. By using the separate table, we had a log of start and finish times as well as the current count of rows. I think that's a great idea.
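    Something along these lines, heavily simplified (the table names, columns, and connection string below are invented for illustration):

        ' ActiveX Script task (VBScript): copy rows from the temp table to the
        ' real table in batches of 100, updating a log table as it goes so the
        ' import can be watched while it runs.
        Function Main()
            Dim conn, lRows, lTotal
            Set conn = CreateObject("ADODB.Connection")
            ' example connection string only -- point it at your own server
            conn.Open "Provider=SQLOLEDB;Data Source=(local);" & _
                      "Initial Catalog=Imports;Integrated Security=SSPI"

            ' record the start of this import in the log table
            conn.Execute "INSERT INTO ImportLog (TaskName, StartTime, RowsCopied) " & _
                         "VALUES ('CopyDailyFeed', GETDATE(), 0)"

            lTotal = 0
            Do
                ' move the next 100 rows that are not already in the real table
                conn.Execute "INSERT INTO RealTable (CustomerID, Amount) " & _
                             "SELECT TOP 100 t.CustomerID, t.Amount FROM TempTable t " & _
                             "WHERE NOT EXISTS (SELECT 1 FROM RealTable r " & _
                             "WHERE r.CustomerID = t.CustomerID)", lRows
                lTotal = lTotal + lRows
                ' keep the running row count current so progress is visible
                conn.Execute "UPDATE ImportLog SET RowsCopied = " & lTotal & _
                             " WHERE TaskName = 'CopyDailyFeed' AND EndTime IS NULL"
            Loop While lRows > 0

            ' close out the log entry with the finish time
            conn.Execute "UPDATE ImportLog SET EndTime = GETDATE() " & _
                         "WHERE TaskName = 'CopyDailyFeed' AND EndTime IS NULL"
            conn.Close

            Main = DTSTaskExecResult_Success
        End Function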

    Steve Jones

    sjones@sqlservercentral.com

    http://www.sqlservercentral.com/columnists/sjones

    http://www.dkranch.net

  • This is the way this DTS package evolved. I have recently learned that I can use a VBScript before each data pump executes to determine whether or not the data pump will run. However, this technique would cause the package to show all the data pumps as succeeded, which I don't want.
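    For anyone curious, the technique I am referring to is an ActiveX script on the data pump step's Workflow Properties, something like this (I am borrowing the 'FullUpdate' global variable idea from above purely as an example):

        ' Workflow ActiveX script (VBScript) attached to a data pump step.
        ' It runs before the step and decides whether the data pump executes.
        Function Main()
            If DTSGlobalVariables("FullUpdate").Value = True Then
                Main = DTSStepScriptResult_ExecuteTask
            Else
                ' skip this data pump on a daily run
                Main = DTSStepScriptResult_DontExecuteTask
            End If
        End Function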

    If the way you are talking about using global variables to avoid the dummy tasks works differently from what I described above, I would love to hear more. I'd love to have the package execute without errors and without logging a success for the data pumps that don't actually pump data.

    As Steve Jones mentioned, this is one of the reasons I created the separate table: to log the start and finish of each of the tasks. The other reason is that when I first started working with DTS packages I didn't know where to find the task names, so the error logs were meaningless to me.

    I did not mean to imply that the way I solved the problem described in the case study was the best. It was the best I could do at the time I developed each task. As I learn new techniques I do my best to implement them; however, I don't always have time to put them in place. Unless a DTS package actually stops working or needs to be changed, I often don't have time to improve it.

    None of what I said is meant as an excuse; it is just the reason why the DTS package is the way it is, and thus the way it was described in the case study.

    Robert W. Marda

    SQL Programmer

    bigdough.com

    The world’s leading capital markets contact database and software platform.

    Robert W. Marda
    Billing and OSS Specialist - SQL Programmer
    MCL Systems

  • I know how these things grow, and I was just wondering if there are any issues with the DTS packages that made you choose this solution, so I can learn something more today.

    I don't have a readily available solution for avoiding failed tasks / data pump tasks that report success but did nothing when using the global variable. I'll have to think about that...
