The requirement is to transfer data from a SQL Server table to an Excel spreadsheet. The caveat is that it is not a one-time activity: every time we dump data into the spreadsheet, the previous data should be overwritten. Before going into the solution, here is everything I tried with no luck:
- Tried the BCP utility. It worked for transporting the data; however, on opening the Excel sheet you get a formatting error.
- Using an SSIS package, I was not able to use the Excel destination because it only supports versions up to Office 2007.
- So I had to use the OLE DB destination, changing its configuration so that it points to the destination Excel file.
- I was not able to use a "DELETE FROM [Sheet$]" command (we need to get rid of the old data before the new data comes in) because DELETE is not supported over an Excel OLE DB connection.
- Something like "UPDATE [Sheet$] SET field = NULL" could be a workaround for DELETE. However, the new data then starts populating from the row after the old data's last row. For example, if the Excel file initially had 100 rows and we used the UPDATE command to blank the data, the next data load will start from row number 101. (Whether this behaviour is 100 percent consistent still needs to be confirmed.)
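The row-101 behaviour described above (which, as noted, is unconfirmed) is consistent with the provider appending below the sheet's "used range" while UPDATE blanks cells without shrinking that range. A toy Python model of that bookkeeping, just to illustrate the logic, not Excel itself:

```python
class SheetModel:
    """Toy model of an Excel sheet's used range as the OLE DB provider
    appears to treat it: inserts append below the last used row, and
    blanking cells does not shrink the used range."""
    def __init__(self):
        self.rows = []       # one entry per used row
        self.used_rows = 0   # high-water mark of the used range

    def insert(self, values):
        # INSERT appends below the current used range
        self.rows.extend(values)
        self.used_rows = len(self.rows)

    def update_all_to_null(self):
        # "UPDATE [Sheet$] SET field = NULL": values go blank,
        # but the used range keeps its old size
        self.rows = [None] * self.used_rows

sheet = SheetModel()
sheet.insert(range(100))     # initial load: rows 1..100
sheet.update_all_to_null()   # blank everything out
sheet.insert(["new"])        # next load lands below the old range
print(sheet.used_rows)       # -> 101
```

This is why blanking the cells does not reset where the next load begins, and why replacing the file outright (the approach below) avoids the problem entirely.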
The Solution: Not a perfect one, but it does the job. I used the File System Task to archive the Excel file once the data load is done and to create a new copy of the original file from a predefined template:
TargetFile - the Excel file to which the data needs to be uploaded
TargetFolder - the location of the target Excel file
ArchiveFolder - the folder holding the target file's template (an exact copy of the target file, but empty)
- Copy the existing file to the archive folder with today's date stamp.
- Delete the target file, which has all the old data in it.
- The archive folder holds a template of the original file, meaning it has the same columns and its file name is exactly the same as the target Excel file's.
- Copy this template file into the target folder.
- Use a Data Flow Task to load the data.
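The archive-and-replace steps above can be sketched in Python; in the actual package each step is a File System Task, and the file name and datestamp format here are assumptions for illustration:

```python
import shutil
from datetime import date
from pathlib import Path

def refresh_target_file(target_folder: Path, archive_folder: Path,
                        file_name: str = "TargetFile.xlsx") -> Path:
    """Archive the populated workbook, then restore an empty template copy."""
    target = target_folder / file_name
    template = archive_folder / file_name  # empty copy with the same columns

    # 1. Copy the existing file to the archive folder with today's date stamp.
    stamped = archive_folder / f"{target.stem}_{date.today():%Y%m%d}{target.suffix}"
    shutil.copy2(target, stamped)

    # 2. Delete the target file that has all the old data in it.
    target.unlink()

    # 3. Copy the template back into the target folder under the same name.
    shutil.copy2(template, target)
    return stamped

# 4. The SSIS Data Flow then loads the fresh data into the restored file.
```

Because the restored file is a byte-for-byte copy of the empty template, the used-range problem from the UPDATE workaround never arises: every load starts from row 1.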
Let me know your comments: how would you have approached the task?