Massive Data Flow object too big to work with? Almost 5000 errors swamping the real-time checker?

  • I've got a large Data Flow object with dozens (over 100 each?) of Conditional Splits, destination objects, and Derived Columns, all sourced from a single custom SELECT statement. Whenever I need to break an "early" link in order to adjust the flow, it incurs tens of minutes of "Visual Studio is busy" time while the designer figures out how many errors there actually are, now that all of the downstream objects have no source definition.

    In fact, it's still thinking about things as I type this out.

    There's something fundamentally sour about waiting for a computer to do something. CPU is at 4%. Memory is fine. Minimal disk activity. How can I tell Visual Studio "Hey pal, I know I'm about to incur some undefined unpleasantries... don't give me too much guff about it," so that I might keep working?

  • I'm sure you already tried this, but just in case you didn't, does setting DelayValidation to true on each step help at all? And maybe ValidateExternalMetadata to false as well?
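
    Flipping those properties one object at a time in the Properties pane is tedious with 100+ objects. A minimal sketch of doing it in bulk instead, by editing the package XML directly: it assumes the SSIS 2012+ `.dtsx` format, where DelayValidation is stored as a `DTS:DelayValidation` attribute on each `DTS:Executable` element; the function name is made up for illustration. Back up the package first.

    ```python
    # Sketch: set DelayValidation="True" on the package root and every
    # child executable in a .dtsx file, instead of clicking through each
    # object in the designer. Assumes the SSIS 2012+ package XML format.
    import xml.etree.ElementTree as ET

    DTS = "www.microsoft.com/SqlServer/Dts"  # the dtsx namespace
    ET.register_namespace("DTS", DTS)

    def delay_validation(xml_text: str) -> str:
        """Return the package XML with DelayValidation forced on everywhere."""
        root = ET.fromstring(xml_text)
        # The package root itself is a DTS:Executable, plus each nested task.
        for exe in [root] + root.findall(".//{%s}Executable" % DTS):
            exe.set("{%s}DelayValidation" % DTS, "True")
        return ET.tostring(root, encoding="unicode")
    ```

    Run it over a copy of the `.dtsx`, reopen the package, and validation should be deferred until execution rather than firing on every edit.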

  • I am more than a little unsure about the methodology you are employing in your package, but it sounds like it is nearly impossible to maintain.

    As part of your initial SELECT, can you do any pre-processing to cut down on the conditionals and derived columns? I know it isn't always possible, but whenever I can, I let the DB handle some of those items.
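
    To make that concrete: a CASE expression in the source query can pre-compute the column that a Derived Column transformation would otherwise add, and the values it produces can drive routing that would otherwise need a Conditional Split. A minimal sketch, with a made-up `orders` table and `sqlite3` standing in for the real source database, since we don't know the poster's actual schema:

    ```python
    # Sketch of the "let the DB handle it" suggestion: the CASE expression
    # below does the work of a Derived Column in the source query itself,
    # so the data flow receives a pre-computed routing column.
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
    con.executemany("INSERT INTO orders VALUES (?, ?)",
                    [(1, 50.0), (2, 500.0), (3, 5000.0)])

    # One source SELECT replaces a Derived Column + Conditional Split pair.
    rows = con.execute("""
        SELECT id,
               amount,
               CASE
                   WHEN amount >= 1000 THEN 'large'
                   WHEN amount >= 100  THEN 'medium'
                   ELSE 'small'
               END AS size_band
        FROM orders
    """).fetchall()
    ```

    Fewer transformation objects in the Data Flow also means fewer objects for the designer to re-validate every time a path is broken, which bears directly on the original complaint.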

