In Azure Data Factory, how does the Event trigger really work?

  • I'm planning a solution where an upload to a container triggers a pipeline that copies the files into SQL tables and then deletes them. I'd prefer to run this pipeline from an event trigger, since if the data isn't in place for whatever reason I can still accommodate late data. My worry is this: I'm expecting around 10 files in my container (all uploaded roughly simultaneously), and I want to wait for all of them before running my pipeline. If the pipeline has a file-uploaded trigger and 10 files arrive "at once", will it run 10 times?

  • The trigger will fire 10 times, since there are 10 'blob created' events.

    How do you know when the set of files is ready? Is there a particular file name for the last file?

    You'll need some logic that checks the number of files in the folder, or the file names, or similar, and only continues if the complete set is present.
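    One common way to express that check in Data Factory itself is a Get Metadata activity (reading the folder's `childItems`) followed by an If Condition activity that gates the copy/delete work. A rough sketch of the If Condition's definition, assuming a preceding Get Metadata activity named `GetFileList` (a hypothetical name) and an expected count of 10 files:

    ```json
    {
      "name": "If_AllFilesPresent",
      "type": "IfCondition",
      "dependsOn": [
        { "activity": "GetFileList", "dependencyConditions": [ "Succeeded" ] }
      ],
      "typeProperties": {
        "expression": {
          "value": "@greaterOrEquals(length(activity('GetFileList').output.childItems), 10)",
          "type": "Expression"
        },
        "ifTrueActivities": []
      }
    }
    ```

    The Copy-to-SQL and Delete activities would go inside `ifTrueActivities`; when fewer than 10 files are present the pipeline simply ends, so nine of the ten triggered runs become cheap no-ops and only the run that sees the full set does the work.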
