March 7, 2018 at 9:59 am
I have an integration that works great and took a while to get set up properly. I now need to modify the Foreach Loop Container to point to different files that have different columns. I modified the Foreach Loop to point to the new files (e.g. NewFiles*.csv). The Flat File Source in the Data Flow Task within the Foreach Loop container is not reflecting the new columns. There is no option to refresh the columns.
How do I refresh the source when the foreach loop changes? I'd really like to avoid having to redo everything.
March 7, 2018 at 10:11 am
If you're asking whether SSIS will automatically update the source definition when the file format changes: it won't.
March 7, 2018 at 10:45 am
Luv SQL - Wednesday, March 7, 2018 9:59 AM
Column names and properties are set at design time only, so you will have to do this in VS and not at run time.
Even if SSIS could read the modified source columns, it would not know how to process them or where to map them.
March 7, 2018 at 10:57 am
I'm doing the changes in VS. I don't see any option to update the column definitions. If I weren't using the Foreach Loop, it wouldn't be an issue.
March 7, 2018 at 12:32 pm
Luv SQL - Wednesday, March 7, 2018 10:57 AM
Are you using a File Connection? Did you update that?
March 7, 2018 at 1:04 pm
I've deleted both my Flat File Source and OLE DB Destination and re-added them. Since I'm using a Foreach Loop, the Flat File Source references the flat file connection manager and not an individual file. When I run it now after the changes, it no longer runs. Grrr. The data flow is inside the Foreach Loop, so I'm not sure why it doesn't work anymore.
Started: 2:49:31 PM
Error: 2018-03-07 14:49:32.13, Code: 0xC020200E, Source: Data Flow Task Flat File Source [2], Description: Cannot open the datafile "". End Error
Error: 2018-03-07 14:49:32.13, Code: 0xC004701A, Source: Data Flow Task SSIS.Pipeline, Description: Flat File Source failed the pre-execute phase and returned error code 0xC020200E. End Error
DTExec: The package execution returned DTSER_FAILURE (1). Started: 2:49:31 PM. Finished: 2:49:32 PM. Elapsed: 0.187 seconds. The package execution failed. The step failed.
March 7, 2018 at 1:12 pm
Luv SQL - Wednesday, March 7, 2018 1:04 PM
You've got that the wrong way round. The source component in your data flow references the flat file connection manager. The only way that you can process multiple files in a Foreach Loop like this is if the files all have the same format (column names, number of columns, data types, etc.).
March 7, 2018 at 1:30 pm
They do all have the same column names. I simply changed it from a set of *Summary*.csv files to *Detailed*.csv files (the destination has already been changed to reflect the new column names).
All of the Summary and Detail files are identical, with the same columns, same number of columns, and same data types. It worked yesterday with my summary files and now won't work with the detail files. Even with a SINGLE csv file (so there is no chance of any differences between the files) it does not run anymore.
March 7, 2018 at 1:35 pm
Luv SQL - Wednesday, March 7, 2018 1:30 PM
Can you confirm that you are using separate flat file connection managers for the Summary and Detailed files, and that you have one Foreach Loop for each (two in total)? Also, that you have filtering in place on your Foreach Loops to avoid attempting to process files of the wrong type?
March 7, 2018 at 1:36 pm
Fixed it. I forgot to add the expression to the source connection manager for my variable. D'OH!
March 7, 2018 at 1:41 pm
Luv SQL - Wednesday, March 7, 2018 1:36 PM
Good stuff. That's quite important!
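For anyone else who hits the `Cannot open the datafile ""` error: the usual setup (a sketch only; the variable name `User::CurrentFileName` is just an example, not from the original package) is to map the Foreach Loop's enumerated file path into a package variable and then put a property expression on the flat file connection manager so its ConnectionString follows the variable on each iteration. If the expression is missing, the connection string is empty at pre-execute, which is exactly the error above.

```
Foreach Loop Editor → Variable Mappings:
    Variable: User::CurrentFileName    Index: 0

Flat File Connection Manager → Properties → Expressions:
    Property:   ConnectionString
    Expression: @[User::CurrentFileName]
```

Giving the variable a design-time default that points at a real sample file also keeps validation happy when the package opens in Visual Studio.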
March 7, 2018 at 1:44 pm
Very lol