FTP Task to import files based on last processed file variable
Posted Tuesday, February 16, 2010 9:13 PM
SSC Veteran

Group: General Forum Members
Last Login: Wednesday, October 23, 2013 5:50 PM
Points: 257, Visits: 601
Hi All,

I have to import files into a database from a data provider through FTP. Each day the vendor provides 3,000 files that I need to import. The idea is that the package will run every 30 minutes, and on each run I should grab only the files with an ID greater than the last one processed.

For example, 20100217 is the folder, and it contains up to 3,000 files named 1.txt, 2.txt, 3.txt, 4.txt, and so on.

If in my previous run I processed files 1, 2, and 3, I should copy files starting from 4.txt only.
I see that the FTP Task has the option to define the remote path as a variable, but how can I achieve the "greater than last processed" filter?

Maybe I can set the FTP path variable to
/20100217/*.txt, but that will fetch all the files; I do not want to pull down 3,000 files on every run and then discard the ones already processed. Is there an efficient way of doing this?

If the FTP Task has limitations, then I will probably have to write a batch file to do the same, and I will have to explore the possibilities.

Your help in this regard is appreciated.
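For illustration, the filter I have in mind would look something like this in C# (the file list and the last-processed ID are placeholders; in the package they would come from the FTP listing and a variable):

using System;
using System.IO;
using System.Linq;

class NewFileFilter
{
    // Keep only files whose numeric name exceeds the last processed ID,
    // so lastProcessedId = 3 keeps 4.txt onward.
    public static string[] FilterNewFiles(string[] remoteFileNames, int lastProcessedId)
    {
        return remoteFileNames
            .Where(f => ParseId(f) > lastProcessedId)
            .OrderBy(f => ParseId(f))
            .ToArray();
    }

    // "4.txt" -> 4; non-numeric names return -1 and are treated as already handled.
    static int ParseId(string fileName)
    {
        int id;
        return int.TryParse(Path.GetFileNameWithoutExtension(fileName), out id) ? id : -1;
    }

    static void Main()
    {
        string[] files = { "1.txt", "2.txt", "3.txt", "4.txt", "10.txt" };
        Console.WriteLine(string.Join(", ", FilterNewFiles(files, 3))); // 4.txt, 10.txt
    }
}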
Post #866712
Posted Wednesday, February 17, 2010 12:34 AM


SSCertifiable

Group: General Forum Members
Last Login: Today @ 2:15 AM
Points: 5,317, Visits: 12,354
It is common practice to move a file to an archive folder of some sort after processing it - perhaps adding a 'processed date' to the file name along the way.

If you adopt this practice, your requirement simplifies to: process and move all available files. A Foreach Loop container will help you do this quite easily.
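A rough sketch of that move-and-stamp step as it might run in a Script Task (the folder paths here are made up; substitute your own working and archive folders):

using System;
using System.IO;

class ArchiveMove
{
    static void Main()
    {
        // Hypothetical paths; adjust to your environment.
        string workDir = @"C:\Feed\Working";
        string archiveDir = @"C:\Feed\Archive";

        foreach (string file in Directory.GetFiles(workDir, "*.txt"))
        {
            // Stamp the processed date into the name, e.g. 4.txt -> 4_20100217.txt
            string stamped = Path.GetFileNameWithoutExtension(file)
                             + "_" + DateTime.Now.ToString("yyyyMMdd")
                             + Path.GetExtension(file);
            File.Move(file, Path.Combine(archiveDir, stamped));
        }
    }
}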



Help us to help you. For better, quicker and more-focused answers to your questions, consider following the advice in this link.

When you ask a question (and please do ask a question: "My T-SQL does not work" just doesn't cut it), please provide enough information for us to understand its context.

It is better to keep your mouth shut and appear stupid than to open it and remove all doubt. (Mark Twain)
Post #866789
Posted Wednesday, February 17, 2010 11:19 PM
SSC Veteran

Group: General Forum Members
Last Login: Wednesday, October 23, 2013 5:50 PM
Points: 257, Visits: 601
Yes, I process the files and move them to an archive directory once I have them in my working directory and have processed them.

The problem was with the FTP step into my working directory (I cannot move files off the FTP server; I am allowed to copy only). On a given day there are always files numbered from 1 to 3000 (the file names arrive in no particular order, and the files are sometimes very large), and if I have already processed files 1-50, I do not want to copy files 1-3000; all I want is 51-3000.

For anybody who runs into this problem, here is what I have done. There may be better ways of doing this, and I would like to hear others' suggestions.

I got a directory listing of the FTP site through a Script Task, similar to the dir command in FTP (say it listed files 1-150 at a given instant).
I saved the file names in an array and imported them into a table. I matched them against what I had already processed and, on the next run, looped through only the unprocessed files and copied them (51-150).
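In case it helps anyone, a minimal sketch of that Script Task logic is below. It assumes an FTP connection manager named "FTP", a local working folder of C:\Feed\Working, and a hypothetical GetProcessedFileNames() helper standing in for the lookup against the tracking table; the FtpClientConnection calls are the ones SSIS exposes from Microsoft.SqlServer.Dts.Runtime.

using System.Collections.Generic;
using System.Linq;
using Microsoft.SqlServer.Dts.Runtime;

// Inside the Script Task's ScriptMain class:
public void Main()
{
    // "FTP" is an assumed connection manager name.
    ConnectionManager cm = Dts.Connections["FTP"];
    FtpClientConnection ftp = new FtpClientConnection(cm.AcquireConnection(null));

    ftp.Connect();
    ftp.SetWorkingDirectory("/20100217");

    string[] folderNames, fileNames;
    ftp.GetListing(out folderNames, out fileNames);  // like DIR: every file currently on the site

    // Hypothetical helper: returns the names already recorded in the tracking table.
    HashSet<string> processed = new HashSet<string>(GetProcessedFileNames());
    string[] newFiles = fileNames.Where(f => !processed.Contains(f)).ToArray();

    if (newFiles.Length > 0)
    {
        // Copy (never move) only the unprocessed files into the working directory.
        ftp.ReceiveFiles(newFiles, @"C:\Feed\Working", true, false);
    }

    ftp.Close();
    Dts.TaskResult = (int)ScriptResults.Success;
}

After a successful load, the new names get inserted into the same tracking table so the next 30-minute run skips them.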
Post #867753