• Phil Parkin - Wednesday, August 23, 2017 10:16 AM

    BI_Dev - Wednesday, August 23, 2017 9:56 AM

    I'm designing a new package where, as the very first step, I need to copy a flat file from a share and place it in an official share folder where our packages pick up other flat files for other purposes. This flat file gets overwritten during every copy, and the package has to run once daily. My question is: for run time and performance optimization, which is the recommended approach? A file copy task to copy/overwrite the flat file, or a Data Flow Task to import the flat file, capture the inserted row count into a variable for auditing purposes, and then write the data out to a flat file in the SSIS-bound folder?

    I'm leaning file copy task, but I'd like to hear from the experts.

    File Copy, using a File System Task or Script Task, gets my vote.
    Your alternative suggestion is not something I've ever encountered 'in the wild' before and it would puzzle me if I ever did ("Why did they do it that way ...?").

    I agree. I think the reason a DFT was used is that the total row count had to be passed to a variable, which in turn gets inserted into a SQL audit table. How would I be able to capture that count if I use a file copy task instead?
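    One option is to do both steps in a single Script Task: copy/overwrite the file, then count its lines for the audit variable, avoiding a full Data Flow just for the count. As a minimal sketch of that logic (shown in Python rather than an SSIS C# Script Task; the paths, function name, and header assumption are illustrative, not from the thread):

    ```python
    import shutil

    def copy_and_count(source_path, dest_path, has_header=True):
        """Copy source_path over dest_path (overwriting it, as the daily
        job does) and return the number of data rows for auditing."""
        shutil.copyfile(source_path, dest_path)  # overwrites an existing file
        with open(dest_path, "r") as f:
            rows = sum(1 for _ in f)  # stream line-by-line; no full load into memory
        # Exclude the header row from the audit count, if the file has one.
        return rows - 1 if has_header and rows > 0 else rows
    ```

    In SSIS the returned count would be assigned to a package variable (e.g. via `Dts.Variables` in a C# Script Task), and an Execute SQL Task could then insert it into the audit table.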