• if you need to process each remote file once and only once, then a workflow like this would make sense to me:

    1. create a table in one of your databases that has columns "server name", "remote file path", "remote file name", "last modified date", "date downloaded"

    2. create two interface stored procedures to access the new table (a T-SQL sketch of the table and both procs follows this list):

       - proc 1 selects a row based on "server name", "remote file path" and "remote file name"

       - proc 2 inserts a new row

    3. perform a directory listing on the remote FTP server

    4. iterate over the list of remote files and check for the existence of a row in your table using your select-proc

    5. if the row exists in your table and the "last modified date" is the same, it means you already processed the file. if the row exists but the "last modified date" is different, it means a different file was posted under the same name; what to do at that point depends on your business rules. if the row does not exist then you haven't processed it yet.

    6. once you successfully download the file, add a row to your table using the insert interface proc you wrote
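
    a rough sketch of the table and the two interface procs from steps 1 and 2 might look like this (object and column names are just placeholders, adjust to your own naming conventions):

        -- tracking table: one row per remote file you have downloaded
        CREATE TABLE dbo.RemoteFileLog
        (
            ServerName       varchar(128) NOT NULL,
            RemoteFilePath   varchar(512) NOT NULL,
            RemoteFileName   varchar(256) NOT NULL,
            LastModifiedDate datetime     NOT NULL,
            DateDownloaded   datetime     NOT NULL DEFAULT (GETDATE()),
            CONSTRAINT PK_RemoteFileLog
                PRIMARY KEY (ServerName, RemoteFilePath, RemoteFileName)
        );
        GO

        -- proc 1: select a row by server name, remote file path and remote file name
        CREATE PROCEDURE dbo.RemoteFileLog_Select
            @ServerName     varchar(128),
            @RemoteFilePath varchar(512),
            @RemoteFileName varchar(256)
        AS
        BEGIN
            SET NOCOUNT ON;
            SELECT ServerName, RemoteFilePath, RemoteFileName,
                   LastModifiedDate, DateDownloaded
            FROM dbo.RemoteFileLog
            WHERE ServerName     = @ServerName
              AND RemoteFilePath = @RemoteFilePath
              AND RemoteFileName = @RemoteFileName;
        END
        GO

        -- proc 2: insert a new row once a file has been downloaded
        CREATE PROCEDURE dbo.RemoteFileLog_Insert
            @ServerName       varchar(128),
            @RemoteFilePath   varchar(512),
            @RemoteFileName   varchar(256),
            @LastModifiedDate datetime
        AS
        BEGIN
            SET NOCOUNT ON;
            INSERT INTO dbo.RemoteFileLog
                (ServerName, RemoteFilePath, RemoteFileName, LastModifiedDate)
            VALUES
                (@ServerName, @RemoteFilePath, @RemoteFileName, @LastModifiedDate);
        END
        GO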

    a program like this would be easier to implement outside SQL Server, for example as a .NET Console Application. I would recommend leveraging a tool like WinSCP for the FTP or SFTP logic. the developer provides a very solid implementation and a nice .NET assembly for driving the WinSCP executable from managed code. commercial tools from GlobalScape are fantastic too but cost a bit of money.
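
    to give an idea of the shape of the console app, here is a minimal sketch using the WinSCP .NET assembly (host, credentials, paths and the two helper methods that call the procs above are placeholders, and error handling is omitted):

        using System;
        using WinSCP; // WinSCP .NET assembly (winscp NuGet package)

        class FileGetter
        {
            static void Main()
            {
                var options = new SessionOptions
                {
                    Protocol = Protocol.Sftp,        // or Protocol.Ftp
                    HostName = "ftp.example.com",    // placeholder
                    UserName = "user",               // placeholder
                    Password = "password",           // placeholder
                    SshHostKeyFingerprint = "..."    // required for SFTP
                };

                using (var session = new Session())
                {
                    session.Open(options);

                    // step 3: directory listing on the remote server
                    RemoteDirectoryInfo listing = session.ListDirectory("/outbound");

                    // step 4: iterate over the remote files
                    foreach (RemoteFileInfo file in listing.Files)
                    {
                        if (file.IsDirectory) continue;

                        // step 5: check the tracking table via the select proc
                        if (AlreadyProcessed(options.HostName, "/outbound",
                                             file.Name, file.LastWriteTime))
                            continue;

                        // step 6: download, then record the row via the insert proc
                        session.GetFiles(file.FullName, @"C:\inbound\" + file.Name).Check();
                        RecordDownload(options.HostName, "/outbound",
                                       file.Name, file.LastWriteTime);
                    }
                }
            }

            // hypothetical helpers that call dbo.RemoteFileLog_Select and
            // dbo.RemoteFileLog_Insert via ADO.NET
            static bool AlreadyProcessed(string server, string path,
                                         string name, DateTime modified)
            { /* ... */ return false; }

            static void RecordDownload(string server, string path,
                                       string name, DateTime modified)
            { /* ... */ }
        }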

    that's all I would do in terms of obtaining the files. I would keep the "file getting" piece separate from the ETL layer.
