We have the following scenario: We receive CSV files every month for which SSIS packages were built to process the data. The following problems occur from time to time:
1. The structure of the CSV file changed (e.g. column added or removed)
2. There were no footers in the data, but now footers started to appear
3. Date format changed (e.g. used to be mm/dd/yyyy, but became mm.dd.yyyy)
4. Number format changed (e.g. from 2000 to 2,000)
Currently we have a person who manually opens each file and, using our "validation document", checks that none of these or similar problems occur. We would like to move away from this manual process if possible and are looking for suggestions.
I understand that items 3 and 4 could be caught by loading the data into a staging table with VARCHAR columns and performing validation before moving it any further.
Item 2 is less predictable: depending on the footer's size and shape, the SSIS load may or may not fail.
Item 1, however, will always fail an SSIS package that loads the file directly into a table.
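For illustration, a minimal pre-load check covering items 1, 2, and 4 could be sketched in Python (the column positions, the "no thousands separators" rule, and the file layout here are assumptions for the example, not part of any real package):

```python
import csv
import re

# Numbers must be plain digits with an optional decimal part:
# "2000" passes, "2,000" does not.
NUMBER_RE = re.compile(r"-?\d+(\.\d+)?")

def validate_file(path, expected_cols, numeric_cols):
    """Row-by-row checks: column-count drift (catches added/removed
    columns and many footer rows) and comma-grouped numbers.
    Returns a list of error strings; an empty list means the file passed."""
    errors = []
    with open(path, newline="") as f:
        for lineno, row in enumerate(csv.reader(f), start=1):
            if len(row) != expected_cols:
                errors.append(f"line {lineno}: {len(row)} columns, expected {expected_cols}")
                continue
            if lineno == 1:
                continue  # skip the header row
            for i in numeric_cols:
                if not NUMBER_RE.fullmatch(row[i]):
                    errors.append(f"line {lineno}: bad number {row[i]!r} in column {i}")
    return errors
```

One caveat: a footer that happens to have the expected column count would slip past the count check, so a real script would also need a rule for recognizing footer rows (e.g. a leading "TOTAL" token).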
Thus I feel the two possible options are:
1. Create a custom script that runs through the file, row by row, applies all the necessary validations, and reports an error, or continues if everything checks out
2. Use some 3rd party tool to validate the files (semi-manually) before kicking off the SSIS processing.
My questions are:
1. If you've encountered a similar problem, how did you resolve it? If you built a custom script, could you share it, or do you know of a framework that could be used more or less plug-and-play?
2. Does anyone know of good 3rd party tool(s) to assist in this process?
Thanks in advance!
I have had a similar process. The CSV files were in the multiple-gigabyte range, so the stakes were high: we needed to know the file was in the correct format before trying to load it, so we didn't waste a bunch of processing time only to find out the file was no good.
I wrote a PowerShell script to validate the file. In my case the usual change was that the sender would add a new column, often somewhere in the middle of the column list. I would read line one of the file, which contained the column headers, and save it off to a new file. Then I would compare the header list saved from the previous file against this file's header list to see whether anything new had been introduced. If it had, I would stop processing.
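I can't share the original PowerShell, but the header-comparison idea looks roughly like this in Python (the file paths and the comma delimiter are assumptions for the sketch):

```python
def headers_changed(csv_path, baseline_path):
    """Compare the first line (the column headers) of a new file
    against the header line saved from the previous good file.
    Returns True if anything was added, removed, or reordered."""
    with open(csv_path) as f:
        current = f.readline().strip().split(",")
    with open(baseline_path) as f:
        baseline = f.readline().strip().split(",")
    return current != baseline
```

After a successful load, the current file's header line becomes the new baseline for next month's comparison.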
I did not have date formats changing, luckily, but if I did I would probably validate them in SSIS as the data was being loaded into the initial staging table. If I were worried, I could fix them up using a Derived Column, or a Script Component set up as a Transformation, since .NET has some good date-parsing functions built in and is a little easier to debug than SSIS Expressions are.
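As a stand-in for that .NET-style strict parsing, here is a Python sketch; the expected format string is an assumption (the mm/dd/yyyy from the question):

```python
from datetime import datetime

def valid_date(value, fmt="%m/%d/%Y"):
    """True only if value matches the agreed format exactly, so a
    switch to mm.dd.yyyy is flagged rather than silently accepted."""
    try:
        datetime.strptime(value, fmt)
        return True
    except ValueError:
        return False
```

The point of parsing against one exact format, rather than a lenient parser, is that a format change in the source file surfaces as a validation failure instead of a misread date.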
--------
There are no special teachers of virtue, because virtue is taught by the whole community. --Plato