SQLServerCentral is supported by Redgate
Case Study: Importing New Data Without Emptying Existing Tables



Robert W Marda
SSCertifiable

Group: General Forum Members
Points: 5607 Visits: 140
Comments posted to this topic are about the content posted at http://www.sqlservercentral.com/columnists/rmarda/importingnewdatawithoutemptyingexistingtables.asp

Robert W. Marda
Billing and OSS Specialist - SQL Programmer
MCL Systems
NPeeters
SSCertifiable

Group: General Forum Members
Points: 5507 Visits: 36
As I see it, some of your choices are not the most straightforward ones. What was your reasoning for choosing this implementation?

Why are you using separate 'dummy' tasks to select the type of update to run (full or daily)?
Would a package 'Global Variable' not be a simpler alternative? In a script task you set the variable to the correct value and return success or failure depending on the type. Using the workflow, you then start the appropriate tasks on success / on failure. (You could even build a cascaded check, to have more control over real error handling.)

Why did you build a separate table with the start and end times of the job?
For reporting a real failure, I would just use the 'Send Mail' task on failure. If you build more than one such task, you could even include the type of failure in your mail.
Of course, that table can also serve as a history of the job for reporting.
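A minimal sketch of this suggestion, assuming a package global variable named "UpdateType" (an invented name, not from the case study) has been declared on the DTS package, might look roughly like this in an ActiveX Script (VBScript) task:

```vbscript
' ActiveX Script task (VBScript) - a hedged sketch, not the actual package.
' Assumes a package global variable named "UpdateType" already exists.
Function Main()
    ' Decide which kind of update to run; here, a full update on Sundays.
    If Weekday(Date) = vbSunday Then
        DTSGlobalVariables("UpdateType").Value = "FULL"
        Main = DTSTaskExecResult_Success   ' On Success workflow -> full-update tasks
    Else
        DTSGlobalVariables("UpdateType").Value = "DAILY"
        Main = DTSTaskExecResult_Failure   ' On Failure workflow -> daily-update tasks
    End If
End Function
```

Note that the success/failure result here only steers the On Success / On Failure precedence constraints, as described above; the "failure" branch is not a real error.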



Steve Jones
SSC Guru

Group: Administrators
Points: 284283 Visits: 19914
I'm not sure I agree with all the choices either, but it was nicely written and it appears to solve the problem well for you.

I worked in a similar situation where we received tons of daily feeds from different vendors and had to import them. We too used a separate table to log the actions, mainly because we could then track an import as it occurred. Some of these imports would copy data from the temp tables to the real tables (massaging it along the way). Copying millions of rows could be slow, especially in a transaction, so we ran groups of 100 or so. By using the separate table, we had a log of start and finish times as well as a running count of rows. I think that's a great idea.
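As an illustration of that pattern (all table and column names below are invented, not from Steve's actual system), a log table plus a batched copy in SQL 2000-era T-SQL could be sketched like this:

```sql
-- Hypothetical log table; all names here are illustrative only.
CREATE TABLE ImportLog (
    ImportID   int IDENTITY(1,1) PRIMARY KEY,
    FeedName   varchar(50) NOT NULL,
    StartTime  datetime    NOT NULL DEFAULT (GETDATE()),
    EndTime    datetime    NULL,
    RowsCopied int         NOT NULL DEFAULT (0)
)
GO

DECLARE @ImportID int, @rc int

INSERT INTO ImportLog (FeedName) VALUES ('VendorFeedA')
SET @ImportID = @@IDENTITY

-- Copy in groups of 100 or so, so that no single transaction
-- holds locks while moving millions of rows.
SET ROWCOUNT 100
SET @rc = 1
WHILE @rc > 0
BEGIN
    INSERT INTO RealTable (KeyCol, DataCol)
    SELECT t.KeyCol, t.DataCol
    FROM TempTable t
    WHERE NOT EXISTS (SELECT 1 FROM RealTable r WHERE r.KeyCol = t.KeyCol)

    SET @rc = @@ROWCOUNT

    -- Running count lets you watch the import as it occurs.
    UPDATE ImportLog SET RowsCopied = RowsCopied + @rc WHERE ImportID = @ImportID
END
SET ROWCOUNT 0

UPDATE ImportLog SET EndTime = GETDATE() WHERE ImportID = @ImportID
```

Because progress is written to a separate table, another connection can query ImportLog mid-run to see start time and rows copied so far, which is the tracking benefit described above.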

Steve Jones
sjones@sqlservercentral.com
http://www.sqlservercentral.com/columnists/sjones
www.dkranch.net

Follow me on Twitter: @way0utwest
Forum Etiquette: How to post data/code on a forum to get the best help
My Blog: www.voiceofthedba.com
Robert W Marda
SSCertifiable

Group: General Forum Members
Points: 5607 Visits: 140
This is the way this DTS package evolved. I have recently learned that I can use a VBScript before each data pump executes to determine whether it will run or not. However, this technique causes the package to show all the data pumps as succeeded, which I don't want.

If the global variable approach you describe works differently from what I described above, I would love to hear more. I'd love to have the package execute without errors and without logging a success for the data pumps that don't actually pump data.

As Steve Jones mentioned, this is one of the reasons I created the separate table: to log the start and finish of each of the tasks. The other reason is that when I first started working with DTS packages I didn't know where to find the task name, so the error logs were meaningless to me.

I did not mean to imply that the way I solved the problem described in the case study was the best. It was the best I could do at the time I developed each task. As I learn new techniques I do my best to implement them; however, I don't always have time to put them in place. Unless the DTS package actually stops working or needs to be changed, I often don't have time to improve it.

None of this is meant as an excuse; it is simply the reason the DTS package is the way it is, and that is how it was described in the case study.
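For reference, the per-step technique Robert mentions (a script run before each data pump to decide whether it executes) is the Workflow ActiveX script on a step's workflow properties. A hedged sketch, again assuming an invented "UpdateType" global variable, would be:

```vbscript
' Workflow ActiveX script attached to a data pump step's workflow properties.
' Hedged sketch; "UpdateType" is an assumed package global variable.
Function Main()
    If DTSGlobalVariables("UpdateType").Value = "FULL" Then
        Main = DTSStepScriptResult_ExecuteTask      ' run this data pump
    Else
        Main = DTSStepScriptResult_DontExecuteTask  ' skip it on a daily run
    End If
End Function
```

How a skipped step appears in the package's execution log (the "shows as succeeded" objection above) may depend on the package logging settings, so that behavior is worth testing rather than assuming.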

Robert W. Marda
SQL Programmer
bigdough.com
The world’s leading capital markets contact database and software platform.

NPeeters
SSCertifiable

Group: General Forum Members
Points: 5507 Visits: 36
I know how these things grow; I was just wondering whether any issues with DTS packages made you choose this solution, so I could learn something new today.

I don't yet have a solution that, using a global variable, avoids both failed tasks and data pump tasks that report success while doing nothing. I'll have to think about that...


