• Thank you, Erland. I will take a look at the approach you recommended. My concern, as you mention in your article, is performance. Some of the files I load are quite large, and when I tried something similar, performance was lacking for those large files.

    I am currently developing a process that loads each file's records into a single field. (This gets the file into SQL without any fuss.)
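    A minimal sketch of that staging step (the table name, file path, and terminators are placeholders, not the actual objects): each line of the file lands as one row in a single wide column.

    ```sql
    -- Hypothetical staging table: one raw line per row.
    CREATE TABLE dbo.RawStage
    (
        RawLine nvarchar(max) NULL
    );

    -- A field terminator that never occurs in the data keeps each
    -- line intact in the single RawLine column.
    BULK INSERT dbo.RawStage
    FROM '\\fileserver\inbound\sample.txt'   -- placeholder UNC path
    WITH (ROWTERMINATOR = '\n', FIELDTERMINATOR = '\0');
    ```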

    Then, based on a user-defined set of layouts (configured through a Web UI I developed), the process parses that field into multiple fields in another table. A poor man's ETL, so to speak.
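    The layout-driven parse could be sketched like this (all table and column names here are hypothetical illustrations of the idea, not my actual schema): a layout table holds the offsets the Web UI captures, and `SUBSTRING` cuts each staged line accordingly.

    ```sql
    -- Hypothetical layout table: one row per field the user defined.
    CREATE TABLE dbo.Layout
    (
        LayoutId  int     NOT NULL,
        FieldName sysname NOT NULL,
        StartPos  int     NOT NULL,   -- 1-based offset within the line
        FieldLen  int     NOT NULL
    );

    -- Slice each staged line into named fields by joining to the layout.
    SELECT  l.FieldName,
            SUBSTRING(s.RawLine, l.StartPos, l.FieldLen) AS FieldValue
    FROM    dbo.RawStage AS s
    JOIN    dbo.Layout   AS l ON l.LayoutId = 1;   -- hypothetical layout id
    ```

    Because the offsets live in a table rather than in code, adding a new file format is a data change the end user can make, not a package redeploy.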

    Obviously I can write SSIS packages to do the same thing, but the method I have developed is:

    1. More dynamic, and requires less knowledge on the part of the end user.

    2. Faster than SSIS, based on what I have read and experienced.

    3. More portable, since all my code lives in stored procedures instead of SSIS packages.

    SQL Server 2012's FileTable functionality lets me drop files into the UNC directory from an FTP process, use T-SQL to loop through the files in that directory (the FileTable), and kick off a bulk load for each file, all without publishing a single SSIS package.
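    The loop-and-load step might look roughly like this (the FileTable name `dbo.InboundFiles` and the staging table are assumptions for illustration): build each file's UNC path from the FileTable metadata, then run a dynamic `BULK INSERT` per file.

    ```sql
    -- Walk the FileTable rows and bulk load each file by its UNC path.
    DECLARE @path nvarchar(4000), @sql nvarchar(max);

    DECLARE file_cur CURSOR LOCAL FAST_FORWARD FOR
        SELECT FileTableRootPath(N'dbo.InboundFiles')
               + file_stream.GetFileNamespacePath()   -- relative path within the FileTable
        FROM   dbo.InboundFiles                       -- hypothetical FileTable
        WHERE  is_directory = 0;

    OPEN file_cur;
    FETCH NEXT FROM file_cur INTO @path;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        SET @sql = N'BULK INSERT dbo.RawStage FROM ' + QUOTENAME(@path, '''')
                 + N' WITH (ROWTERMINATOR = ''\n'');';
        EXEC sys.sp_executesql @sql;
        FETCH NEXT FROM file_cur INTO @path;
    END;
    CLOSE file_cur;
    DEALLOCATE file_cur;
    ```

    Note that `BULK INSERT` reads the path under the SQL Server service account (or the caller's delegated credentials), which is exactly where UNC-path permission trouble tends to show up.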

    However, I ran into issues when trying to use the UNC path.

    I have a workaround in place at the moment, but using the UNC path directly would be much cleaner.