Case Study: Importing New Data Without Emptying Existing Tables
Posted Wednesday, January 8, 2003 12:00 AM
SSC Eights!
Comments posted to this topic are about the content posted at http://www.sqlservercentral.com/columnists/rmarda/importingnewdatawithoutemptyingexistingtables.asp



Robert W. Marda
SQL Programmer
Ipreo
Post #9178
Posted Thursday, January 9, 2003 10:41 AM
SSCrazy
As I see it, some of your choices are not the most straightforward ones. What was your reasoning for this implementation?

Why are you using separate 'dummy' tasks to select the type of update to run (full or daily)?
Would a package 'Global Variable' not be a simpler alternative? In a script task you set the variable to the correct value and return success or failure depending on the type. Using the workflow, you then start the appropriate tasks on success / on failure. (You could even build a cascaded check, to gain more control over real error handling.)
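As a sketch of that approach (the variable name UpdateType and its values are my own, not from the article), an ActiveX Script task can read a package global variable and return a success or failure result, so the on-success / on-failure workflow lines start the right branch:

```vbscript
' ActiveX Script task: branch the workflow on a package global variable.
' "UpdateType" is a hypothetical global variable set earlier in the package.
Function Main()
    If DTSGlobalVariables("UpdateType").Value = "Full" Then
        ' On Success workflow lines start the full-update tasks
        Main = DTSTaskExecResult_Success
    Else
        ' On Failure workflow lines start the daily-update tasks
        Main = DTSTaskExecResult_Failure
    End If
End Function
```

Note that this deliberately "fails" the task to drive the workflow, which is also why it shows up as a failure in the package log.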

Why did you build a separate table with the start and end times of the job?
For reporting a real failure, I would just use the 'Send Mail' task on failure. If you build more than one such task, you could even include the type of failure in your mail.
Of course, that table can also serve as a history of the job for reporting.
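For reference, a minimal T-SQL sketch of such a job-history table and the bracketing insert/update is below; all object and column names are illustrative, not taken from the article:

```sql
-- Hypothetical job-history table (names are made up for illustration)
CREATE TABLE dbo.ImportJobLog (
    LogID     int IDENTITY(1, 1) PRIMARY KEY,
    TaskName  varchar(128) NOT NULL,
    StartTime datetime NOT NULL DEFAULT (GETDATE()),
    EndTime   datetime NULL
)

-- At the start of a task:
DECLARE @LogID int
INSERT INTO dbo.ImportJobLog (TaskName) VALUES ('DailyImport')
SET @LogID = SCOPE_IDENTITY()

-- At the end of the same task:
UPDATE dbo.ImportJobLog
SET EndTime = GETDATE()
WHERE LogID = @LogID
```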



Post #51165
Posted Thursday, January 9, 2003 1:00 PM


SSC-Dedicated
I'm not sure I agree with all the choices made either, but it was nicely written and it appears to solve the problem well for you.

I worked in a similar situation where we received tons of daily feeds from different vendors and had to import them. We too used a separate table to log the actions, mainly because we could then track an import as it occurred. Some of these imports would copy data from the temp tables to real tables (massaging it along the way). Doing millions of rows could be slow, especially in a transaction, so we ran groups of 100 or so. By using the separate table, we had a log of start and finish times as well as the current count of rows. I think that's a great idea.
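The batched copy described above can be sketched in T-SQL along these lines (a sketch only; every table and column name here is hypothetical, and the original batch size was "groups of 100 or so"):

```sql
-- Move staged rows into the real table in batches of 100 so that each
-- transaction stays short. All object and column names are hypothetical.
SET ROWCOUNT 100            -- SQL Server 2000 style batch limit

DECLARE @Moved int
SET @Moved = 1

WHILE @Moved > 0
BEGIN
    BEGIN TRANSACTION

    INSERT INTO dbo.Prices (SymbolID, PriceDate, ClosePrice)
    SELECT s.SymbolID, s.PriceDate, s.ClosePrice
    FROM dbo.PricesStaging s
    WHERE NOT EXISTS (SELECT 1
                      FROM dbo.Prices p
                      WHERE p.SymbolID = s.SymbolID
                        AND p.PriceDate = s.PriceDate)

    SET @Moved = @@ROWCOUNT

    COMMIT TRANSACTION
END

SET ROWCOUNT 0              -- restore the default
```

Each pass inserts at most 100 not-yet-copied rows and commits, so the loop ends when the staging table has been fully drained.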

Steve Jones
sjones@sqlservercentral.com
http://www.sqlservercentral.com/columnists/sjones
www.dkranch.net







Follow me on Twitter: @way0utwest

Forum Etiquette: How to post data/code on a forum to get the best help
Post #51166
Posted Thursday, January 9, 2003 1:17 PM
SSC Eights!
This is the way this DTS package evolved. I have recently learned that I can use VBScript before each data pump executes to determine whether it will run or not. However, this technique would cause the package to show all the data pumps as succeeded, which I don't want.

If the global variable approach you describe for avoiding the dummy tasks works differently from what I outlined above, I would love to hear more. I'd love to have the package execute without errors and without logging a success for the data pumps that don't actually pump data.
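One possible avenue (a sketch only, not tested against this package): DTS also allows an ActiveX workflow script on a step, and its return value can tell DTS not to execute the step at all, rather than running it and recording a success. Again, "UpdateType" is a hypothetical global variable:

```vbscript
' Workflow ActiveX script attached to the data pump's step
' (Workflow Properties), not to the task itself.
Function Main()
    If DTSGlobalVariables("UpdateType").Value = "Full" Then
        Main = DTSStepScriptResult_ExecuteTask
    Else
        ' The step is skipped entirely, so no success is recorded for it
        Main = DTSStepScriptResult_DontExecuteTask
    End If
End Function
```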

As Steve Jones mentioned, this is one of the reasons I created the separate table: to log the start and finish of each of the tasks. The other reason is that when I first started working with DTS packages I didn't know where to find the task name, so the error logs were meaningless to me.

I did not mean to imply that the way I solved the problem described in the case study was the best. It was the best I could do at the time I developed each task. As I learn new techniques I do my best to implement them; however, I don't always have time to put them in place. Unless the DTS package actually stops working or needs to be changed, I often don't have time to improve it.

None of what I said is meant as an excuse, just as the reason why the DTS package is the way it is, and thus the way it was described in the case study.

Robert W. Marda
SQL Programmer
bigdough.com
The world’s leading capital markets contact database and software platform.




Post #51167
Posted Thursday, January 9, 2003 4:04 PM
SSCrazy
I know how these things grow, and I was just wondering whether there were any issues with DTS packages that made you choose this solution, so I could learn something more today.

I don't have a readily available solution for avoiding failed tasks / successful data pump tasks that did nothing when using the global variable. I'll have to think about that...



Post #51168