What kind of value tests are you looking to run against the rows? As mentioned above, you can use Lookup transformations to test for foreign key violations mid-stream, and you can use Derived Columns and Conditional Splits (in tandem, if necessary) to cover most of your tests, splitting the stream into success/review/fail outputs as needed.
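If it helps to picture it, a Conditional Split is just an ordered list of named outputs, each with a boolean expression; rows go to the first output whose expression is true. A rough sketch in SSIS expression syntax (the column and output names here are made up for illustration):

```
Fail   : ISNULL(CustomerID) || LTRIM(RTRIM(CustomerName)) == ""
Review : OrderAmount < 0 || OrderDate > GETDATE()
(the default output then carries the Success rows)
```

Order matters: a row failing the first condition never reaches the second, so put the hard rejects first.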
Additionally, there's always the Script Component transformation for more complex checks during the datastream, which lets you directly choose which output to send each inbound row to. You can even keep it synchronous by putting the outputs in the same ExclusionGroup; an asynchronous component works too, and it's not much harder than the synchronous kind.
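For illustration only, here's roughly what that script logic looks like. Everything here is hypothetical: the buffer class and the DirectRowTo... methods are generated by the designer from whatever inputs/outputs you actually define, so this won't compile outside a real Script Component.

```csharp
// Hypothetical synchronous Script Component with two outputs
// ("GoodRows" and "BadRows") assigned to the same ExclusionGroup.
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    // A check that would be painful as a Conditional Split expression
    bool looksValid = !Row.Email_IsNull && Row.Email.Contains("@");

    if (looksValid)
        Row.DirectRowToGoodRows();   // designer-generated, one per output
    else
        Row.DirectRowToBadRows();
}
```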
The thing you'll want to avoid during this process, however, is anything that requires the stream to 'pause', as that'll kill any optimization you get out of using the datastream instead of dumping to a staging table and running bulk T-SQL against it. Sorts and aggregates are the usual culprits for something like this; Fuzzy Grouping is another. Basically, anything that needs to grab a group of rows before it can do something. Avoid those like the plague; they're usually best performed in T-SQL, unless you're multicasting for final counts or similar.
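As a sketch of that staging-table alternative (table and column names invented for the example), a blocking operation like de-duplicating on a key is often just one set-based statement once the rows have landed:

```sql
-- Hypothetical staging table; the dedupe (a blocking Sort/Aggregate
-- in the data flow) is done set-based in T-SQL instead.
WITH ranked AS
(
    SELECT CustomerID,
           CustomerName,
           ROW_NUMBER() OVER (PARTITION BY CustomerID
                              ORDER BY LoadDate DESC) AS rn
    FROM dbo.StagingCustomers
)
INSERT INTO dbo.Customers (CustomerID, CustomerName)
SELECT CustomerID, CustomerName
FROM ranked
WHERE rn = 1;
```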
On a side note, Jeff wasn't being sarcastic or trying to imply anything other than what he directly said. He's incredibly efficient in T-SQL, and try as I might I can't convince him of the value of SSIS in datastream optimizations... mostly because he can usually counter with an equivalent-speed method in T-SQL. :w00t: I do occasionally get him, though, but it comes down to 'It Depends' scenarios.
- Craig Farrell
Never stop learning, even if it hurts. Ego bruises are practically mandatory as you learn unless you've never risked enough to make a mistake.
For better assistance in answering your questions | Forum Netiquette
For index/tuning help, follow these directions. | Tally Tables