Viewing 15 posts - 211 through 225 (of 681 total)
You might be able to use a conditional split transformation for this purpose. This object will take an input stream and will split it out to two or more...
October 15, 2009 at 11:12 pm
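To illustrate the idea behind the Conditional Split transformation, here is a minimal Python sketch (the rows, condition names, and thresholds are hypothetical, not from the original post): each input row is routed to the first output whose condition it satisfies, and anything that matches no condition goes to a default output, just as in SSIS.

```python
# Hypothetical input rows standing in for an SSIS data flow buffer.
rows = [
    {"id": 1, "amount": 250},
    {"id": 2, "amount": 40},
    {"id": 3, "amount": -5},
]

# One list per output; "default" catches rows matching no condition.
outputs = {"invalid": [], "large": [], "default": []}

for row in rows:
    # Conditions are evaluated in order; the first match wins,
    # mirroring the ordered condition list in a Conditional Split.
    if row["amount"] < 0:
        outputs["invalid"].append(row)
    elif row["amount"] >= 100:
        outputs["large"].append(row)
    else:
        outputs["default"].append(row)

print({name: [r["id"] for r in out] for name, out in outputs.items()})
```

Each output list would then feed its own downstream path in the data flow, so one source can drive several independent destinations.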
If your package runs for several hours even in your local development environment, I assume that there's a lot of transformation or a huge number of records being affected. ...
October 15, 2009 at 4:47 pm
You'll need to provide more details than simply "it's not working". What is the expected output?
October 14, 2009 at 9:11 pm
You can set the data types in the flat file connection manager under the Advanced tab of that control. Clicking each column will allow you to view or change...
October 13, 2009 at 9:47 pm
If your remote environment is the live environment, there may be an issue of resource contention. If you have a lot of users accessing the system while your SSIS...
October 13, 2009 at 9:29 pm
When you run the package locally, are you accessing a different database than when you try to execute it in your remote environment?
October 13, 2009 at 2:51 pm
Steve, I did receive your e-mail. I'll dig into it later today and see what I can find.
Thanks,
Tim
October 13, 2009 at 1:10 pm
If the format of your input files is static and consistent, you could change the data types in your flat file connection rather than explicitly convert them later in the...
October 13, 2009 at 12:13 pm
Sure - it's tdmitch [at] gmail [dot] com. The forum software won't allow you to upload a DTSX file, but I think it lets you load a zip file.
October 13, 2009 at 10:45 am
Interesting... I haven't encountered this behavior before. There's no requirement to explicitly release the file after processing in the data flow, so it's not clear to me why...
October 12, 2009 at 7:30 pm
It looks like you've got the data flow objects set up correctly, but the behavior you describe sounds like both of the file system tasks are trying to run at...
October 12, 2009 at 2:53 pm
Take a look at the attached and see if it might do what you need. Each quality check is done in turn, and the ones that fail the check...
October 12, 2009 at 2:21 pm
You didn't mention what you're doing with the Derived Column transform for each data file. What is it that this transform is doing? How long does it take...
October 12, 2009 at 1:25 pm
Can you attach the package, or a screenshot of your control flow?
October 12, 2009 at 12:58 pm
A CSV file is, by definition, a file that separates distinct values with a comma. So any comma not enclosed in a text qualifier marks the beginning of the next field.
So, for your expected...
October 12, 2009 at 12:52 pm
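The qualifier behavior described above can be demonstrated with a short Python sketch (the sample line is hypothetical): a naive split on commas breaks a qualified field apart, while a CSV-aware parser keeps the quoted value intact.

```python
import csv
from io import StringIO

# Hypothetical record: the second field is text-qualified with double
# quotes and contains an embedded comma.
raw = 'Smith, "Dallas, TX", 42\n'

# Naive split: every comma starts a new field, so the qualified
# value "Dallas, TX" is incorrectly broken into two pieces.
naive_fields = raw.split(",")
print(len(naive_fields))  # 4 pieces instead of 3

# CSV-aware parse: commas inside the qualifier are treated as data.
reader = csv.reader(StringIO(raw), skipinitialspace=True)
row = next(reader)
print(row)  # ['Smith', 'Dallas, TX', '42']
```

This is the same distinction the flat file connection manager makes when you set a text qualifier on the connection.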