How to handle "big data" or "large data" with SSIS?

  • I am running dynamic queries that often return HUGE (300 MB - 1 GB) result sets initially. Later they should not be this big (though I'm not certain) because I will be using delta loading. These result sets are loaded into a C# DataTable; a script loops over the rows and generates a query (stored in an SSIS variable) to load them into the appropriate destination columns (determined by other scripts).

    For small result sets, my package runs properly, but for big ones it simply fails with an out-of-memory error.

    How do I resolve this problem? Can you suggest some strategies? I guess I could fetch smaller parts of the data at a time and then load them into the target, but I am not sure how to go about it. Is there a recipe for this?
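    The "fetch smaller parts of the data at a time" idea is usually implemented as keyset (seek) pagination: repeatedly fetch a fixed-size batch of rows keyed on an ever-increasing column, load it, and remember the last key, so only one batch is ever in memory. Here is a minimal sketch of that loop, in Python for illustration (an SSIS script task would express the same loop in C#, reading with a forward-only data reader instead of a DataTable); the SOURCE table and the fetch_batch and load_in_batches names are hypothetical stand-ins, not SSIS APIs.

```python
# Stand-in for the source table; a real source would be a SQL query like
# "SELECT ... WHERE id > @last_id ORDER BY id FETCH FIRST @n ROWS ONLY".
SOURCE = [{"id": i, "value": i * 10} for i in range(1, 26)]

def fetch_batch(last_id, batch_size):
    """Return up to batch_size rows with id > last_id, in id order."""
    rows = [r for r in SOURCE if r["id"] > last_id]
    return rows[:batch_size]

def load_in_batches(batch_size=10):
    loaded = []      # stands in for "bulk-insert batch into destination"
    last_id = 0
    while True:
        batch = fetch_batch(last_id, batch_size)
        if not batch:
            break                      # no more rows: done
        loaded.extend(batch)           # real code would write to the target here
        last_id = batch[-1]["id"]      # remember the key of the last row seen
    return loaded

print(len(load_in_batches()))  # all 25 rows loaded, never more than 10 in memory
```

    Seeking on the last key (rather than OFFSET) keeps each fetch cheap even deep into the table, and the per-batch memory footprint stays constant regardless of total result-set size.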

  • Duplicate post. See the following thread:

    http://www.sqlservercentral.com/Forums/Topic1509553-364-1.aspx
