Method for collecting files and storing in Azure storage containers

  • Hi,

    The desired solution
    We would like the final solution to offer a generic way of taking files from a source and copying them to our Azure Storage containers for consumption later on by other processes. We would have a configuration table in a database which said the source, the destination and the copy schedule. A process would then loop over this data, setting the source and destination accordingly and moving the files. The source would mainly be https and ftp, perhaps with some API requests returning JSON.
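    To make the idea concrete, here is a minimal Python sketch of that loop. The table rows, column names and fetcher names are all hypothetical illustrations of the proposed configuration table, not an existing schema; the real fetchers would use something like urllib.request for HTTPS and ftplib for FTP, then push the bytes to the Azure Storage container (for example with the azure-storage-blob SDK).

```python
from urllib.parse import urlparse

# Hypothetical rows mirroring the proposed configuration table:
# source URL, destination path inside the container, copy schedule.
CONFIG = [
    {"source": "https://example.com/gps/assets.csv",
     "destination": "landing/gps/assets.csv",
     "schedule": "daily"},
    {"source": "ftp://example.com/weather/latest.xml",
     "destination": "landing/weather/latest.xml",
     "schedule": "hourly"},
]

def plan_copy(row):
    """Pick a fetch strategy for one config row based on its URL scheme.

    Returns (fetcher_name, source, destination). In a real process the
    fetcher would download the file and upload it unchanged to Azure
    Blob Storage; no transformation happens at this stage.
    """
    scheme = urlparse(row["source"]).scheme
    if scheme == "https":
        fetcher = "https_download"
    elif scheme == "ftp":
        fetcher = "ftp_download"
    else:
        raise ValueError(f"Unsupported scheme: {scheme}")
    return fetcher, row["source"], row["destination"]

for row in CONFIG:
    print(plan_copy(row))
```

    Adding a new file to collect would then be a new row in the table rather than a new pipeline or package.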

    This would give us a kind of dumping ground for files (such as GPS data for our assets and weather data). The process is purely copying the files from the source; it's not doing any sort of transformation or loading, as that will be done later by bespoke packages.

    The proposed ideas (setting aside API requests for now).
    The initial idea was to use Azure Data Factory, but it presents us with a problem: the source has to be set when the pipeline is built, meaning we would have to build a pipeline for each file. This could leave us with 50+ pipelines.
    The next idea is to use SSIS, but I'm not sure whether it can download files from HTTPS sources, or whether we would have to write some code to do this and embed it in the package.
    Then I came across Microsoft AzCopy, which I could either embed in an SSIS package or possibly even run via xp_cmdshell, but I've not tried it yet so I'll have to see how it performs.
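    One thing worth noting about the AzCopy route: AzCopy copies local files (or supported cloud sources) into a storage container, so an SSIS or xp_cmdshell approach would typically download the file to a staging folder first, then shell out to AzCopy. A hedged sketch of building that command from one config row follows; the storage account, container and SAS token are placeholders, not real values.

```python
import shlex

def build_azcopy_command(local_path, account, container, blob_path, sas_token):
    """Assemble an AzCopy v10 'copy' command for one staged file.

    AzCopy v10 syntax is: azcopy copy <source> <destination-url>.
    The account, container and SAS token here are hypothetical; a real
    process would read them from configuration or a secret store.
    """
    dest = (f"https://{account}.blob.core.windows.net/"
            f"{container}/{blob_path}?{sas_token}")
    return ["azcopy", "copy", local_path, dest]

cmd = build_azcopy_command(
    "C:/staging/assets.csv",     # file already downloaded from the source
    "mystorageacct",             # placeholder storage account
    "landing",                   # placeholder container
    "gps/assets.csv",            # destination blob path
    "sv=placeholder-sas-token")  # placeholder SAS token
print(shlex.join(cmd))
```

    From SSIS this could run in an Execute Process Task; via xp_cmdshell it would be the same command line, with the usual caveats about enabling xp_cmdshell on the server.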

    My question is really: how might you go about achieving this? Would you use one of the above methods, or am I missing something else that could do it?

    Thanks,

    Nic
