SQLServerCentral is supported by Red Gate Software Ltd.
 
load five sources into five destinations
Posted Monday, December 22, 2008 9:55 AM
Forum Newbie


Group: General Forum Members
Last Login: Thursday, November 11, 2010 4:10 AM
Points: 8, Visits: 14
Hi SSIS Gurus

I have five flat file sources in a folder, and I have to load those sources into five destinations. The condition is that when the source data is updated, the destination data should be updated automatically, and previously loaded data should not be added again. How can we do this?

Thanks
Venki
Post #624062
Posted Monday, December 22, 2008 10:06 AM


SSCrazy


Group: General Forum Members
Last Login: Sunday, January 26, 2014 11:13 AM
Points: 2,242, Visits: 3,641
Do you mean you have five flat files corresponding to five tables into which you want to load the data?
If yes, do you also mean that you're constantly adding records to the flat files and you want only the incremental data to be uploaded?

Pradeep Singh
Post #624067
Posted Monday, December 22, 2008 11:44 PM


SSCarpal Tunnel


Group: General Forum Members
Last Login: Today @ 6:57 AM
Points: 4,832, Visits: 11,199
So the sources are being updated continuously? Or just from time to time? You may encounter some locking problems.

One approach could be to move the source file to another folder, import it and then archive it - at least then you know that nothing will write to it while it is being processed, and that you will never process the same thing more than once. Hopefully, whatever is creating the source data will just recreate the file the next time it needs to write data.

You then need a way of monitoring the source folder for new data - in the example above, that just means new files. You could either just schedule the SSIS package to run frequently (exiting gracefully if there are no files to process) or write a Windows service to monitor the folder and run the package in (almost) real time when new files are created.
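The move/import/archive pattern above is generic enough to sketch outside SSIS. A minimal Python illustration follows; the folder names and the load_file callback are hypothetical stand-ins, not part of any SSIS API:

```python
import shutil
from pathlib import Path

INBOX = Path("inbox")            # where source files arrive
PROCESSING = Path("processing")  # isolated while being loaded
ARCHIVE = Path("archive")        # processed files, kept for audit

def process_new_files(load_file):
    """Move each waiting file out of the inbox, load it, then archive it.

    Moving first guarantees the writer cannot touch the file mid-load,
    and archiving guarantees the same file is never loaded twice.
    Exits gracefully (returns an empty list) if there is nothing to do.
    """
    for folder in (PROCESSING, ARCHIVE):
        folder.mkdir(exist_ok=True)
    processed = []
    for src in sorted(INBOX.glob("*.txt")):
        work = PROCESSING / src.name
        shutil.move(str(src), str(work))                 # 1. claim the file
        load_file(work)                                  # 2. import it (stub)
        shutil.move(str(work), str(ARCHIVE / src.name))  # 3. archive it
        processed.append(src.name)
    return processed
```

In SSIS itself the same three steps would typically be File System Tasks wrapped around the data flow inside a Foreach Loop container.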

Phil



Help us to help you. For better, quicker and more-focused answers to your questions, consider following the advice in this link.

When you ask a question (and please do ask a question: "My T-SQL does not work" just doesn't cut it), please provide enough information for us to understand its context.
Post #624479
Posted Tuesday, December 23, 2008 5:56 AM


Ten Centuries


Group: General Forum Members
Last Login: Yesterday @ 8:51 PM
Points: 1,114, Visits: 2,195
Phil Parkin (12/22/2008)
You then need a way of monitoring the source folder for new data [...]


You can monitor a folder for changes using the File Watcher Task.


---
SSIS Tasks Components Scripts Services | http://www.cozyroc.com/

Post #624636
Posted Tuesday, December 23, 2008 6:16 AM


SSCarpal Tunnel


Group: General Forum Members
Last Login: Today @ 6:57 AM
Points: 4,832, Visits: 11,199
Great tip - I did not know about that ... looks perfect for this.


Post #624652
Posted Tuesday, December 23, 2008 6:59 AM


SSC-Addicted


Group: General Forum Members
Last Login: Thursday, April 10, 2014 4:26 PM
Points: 400, Visits: 946
I like the idea of moving the files, and we currently use the File Watcher in one of our ETLs; it works great. If you can't move the files, here are a couple of options; both require either a timestamp or a unique key.

If a key is available, you can re-read the file each time and only add records where the key does not exist in the destination. Do this by staging the flat file to a temp table, then use a Merge Join set up as a left join.
Join the new data (left side of the join) to the existing data (right side of the join), and include just one piece of key data from the right side to check for NULL. Use a Conditional Split and only output rows where that right-side key value is NULL.
If the data is timestamped, then do as above and stage the entire flat file. Check the destination data for the greatest timestamp value and select all staged records with a timestamp greater than that maximum. This may be prone to errors depending on the precision of the timestamp and the frequency with which records are added to the flat file. A way around this is to select records from the staged data using a range: the low end is greater than the destination's maximum, and the high end is the maximum of the staged data less one unit of the smallest time increment. So if you are dealing with seconds, take the max of the stage less one second; that way you know that all entries with that time value are already in the stage table. Hope that made sense.
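A sketch of the timestamp variant, again with SQLite standing in for the staging and destination tables (names and timestamps are illustrative). The upper bound stops one second short of the stage's maximum, per the caveat above, so a second that may still be receiving rows waits for the next run:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE destination (ts TEXT, payload TEXT);
    CREATE TABLE stage       (ts TEXT, payload TEXT);
    INSERT INTO destination VALUES ('2008-12-23 10:00:00', 'a');
    -- staged file re-read in full; newest second may be incomplete
    INSERT INTO stage VALUES
        ('2008-12-23 10:00:00', 'a'),
        ('2008-12-23 10:00:05', 'b'),
        ('2008-12-23 10:00:09', 'c'),
        ('2008-12-23 10:00:09', 'd');
""")

# Low end: strictly greater than the destination's max timestamp.
(dest_max,) = conn.execute("SELECT MAX(ts) FROM destination").fetchone()
# High end: the stage's max timestamp less one second.
(stage_cap,) = conn.execute(
    "SELECT datetime(MAX(ts), '-1 second') FROM stage").fetchone()

# Only 'b' qualifies: 'c' and 'd' share the possibly-incomplete
# newest second and will be picked up on the next run.
conn.execute("""
    INSERT INTO destination (ts, payload)
    SELECT ts, payload FROM stage
    WHERE ts > ? AND ts <= ?
""", (dest_max, stage_cap))
rows = conn.execute(
    "SELECT ts, payload FROM destination ORDER BY ts, payload").fetchall()
```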
Post #624661