September 11, 2018 at 5:49 pm
I am basically new to the Azure platform and the whole paradigm of cloud servers, so I need help figuring out my workflow.
I have this project that I have been putting together in pieces. I run a series of procedures to process data that I need to write to several tabs within an Excel file.
I have it all running locally; the problem is that I need to do all of this in Azure, running the packages in Data Factory and saving the file to Box.
The way I do it locally is:
1. I run my package with a date variable for the report date.
2. I keep an Excel template file in a local folder, make a copy of it, and rename the copy using the date variable.
3. I run three Data Flow tasks, each with a stored procedure that takes the date variable as a parameter, and each task writes its data to a different tab through an Excel Destination component that points to the new Excel file created in step 2.
4. After the three tabs are written and saved, I upload the file to Box using FTP, and everything finishes nicely.
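Expressed in Python purely for illustration (the real steps live inside the SSIS package, and the paths, Box host, and credentials below are placeholders), the copy-and-upload parts of that flow amount to roughly this:

import shutil
from datetime import date
from ftplib import FTP_TLS
from pathlib import Path

# Placeholder paths and credentials -- substitute your own.
TEMPLATE = Path(r"C:\Reports\ReportTemplate.xlsx")
OUTPUT_DIR = Path(r"C:\Reports\Output")
BOX_FTP_HOST = "ftp.box.com"
BOX_USER = "user@example.com"
BOX_PASSWORD = "********"

def copy_template(report_date: date) -> Path:
    """Copy the Excel template and rename the copy with the report date."""
    target = OUTPUT_DIR / f"Report_{report_date:%Y%m%d}.xlsx"
    shutil.copyfile(TEMPLATE, target)
    return target

def upload_to_box(local_file: Path) -> None:
    """Upload the finished workbook to Box over FTPS."""
    with FTP_TLS(BOX_FTP_HOST) as ftps:
        ftps.login(BOX_USER, BOX_PASSWORD)
        ftps.prot_p()  # encrypt the data channel, not just the login
        with local_file.open("rb") as f:
            ftps.storbinary(f"STOR {local_file.name}", f)

if __name__ == "__main__":
    workbook = copy_template(date.today())
    # ... the three Data Flow tasks populate the tabs here ...
    upload_to_box(workbook)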
Now I need to accomplish this using Data Factory, and I am not sure what the best strategy would be. Is it even possible? I'm thinking the template could reside in a Blob container: I would copy it, rename it, write the data into it, and then send it to Box. But that is probably too simplistic, and I am not sure it can be done. Before I research further, I would like to know, if you can share your knowledge, what my options are and what the best strategy is to accomplish this whole task.
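To show what I have in mind, here is that template step sketched with the azure-storage-blob Python SDK; the connection string, container, and blob names are made up, since nothing is provisioned yet:

from azure.storage.blob import BlobServiceClient

# Placeholder connection string and names -- none of this is set up yet.
CONN_STR = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;"
CONTAINER = "report-templates"

def clone_template(report_date_str: str) -> str:
    """Copy the template blob to a date-stamped blob in the same container."""
    service = BlobServiceClient.from_connection_string(CONN_STR)
    container = service.get_container_client(CONTAINER)
    source = container.get_blob_client("ReportTemplate.xlsx")
    target = container.get_blob_client(f"Report_{report_date_str}.xlsx")
    # Server-side copy; works within one storage account with shared-key auth.
    # A cross-account source would need a SAS token appended to source.url.
    target.start_copy_from_url(source.url)
    return target.blob_name

Whether something like this would run in an Azure Function called from the pipeline, or whether a Data Factory Copy activity could handle the copy-and-rename on its own, is part of what I'm asking.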
I appreciate in advance any help you can provide.