October 29, 2013 at 1:39 pm
I am running dynamic queries that often return huge (300 MB to 1 GB) result sets on the initial load. Later the sets should be smaller (though I am not certain of that), because I will be using delta loading. Each result set is loaded into a C# DataTable. A script then loops over the rows and generates a query (stored in an SSIS variable) to load them into the appropriate destination columns (determined by other scripts).
For small result sets, the package runs properly. For big ones, it fails with an out-of-memory error.
How do I resolve this problem? Can you suggest some strategies? I guess I could fetch smaller chunks of the data at a time and load each chunk into the target, but I am not sure how to go about it. Is there a recipe for this?
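To make the question concrete, here is a rough sketch of the kind of chunked approach I am imagining, streaming rows with a SqlDataReader and flushing a small DataTable every N rows instead of buffering everything. The connection strings, table names, and the LoadBatch helper are placeholders for what my script task actually builds dynamically; I am not sure this is the right direction.

using System;
using System.Data;
using System.Data.SqlClient;

class ChunkedLoader
{
    // Placeholder values -- in the real package these come from SSIS variables.
    const string SourceConnStr = "Data Source=.;Initial Catalog=SourceDb;Integrated Security=SSPI;";
    const string DestConnStr   = "Data Source=.;Initial Catalog=DestDb;Integrated Security=SSPI;";
    const string Query = "SELECT * FROM dbo.SourceTable";
    const int BatchSize = 10000; // rows buffered in memory at any one time

    static void Main()
    {
        using (var conn = new SqlConnection(SourceConnStr))
        using (var cmd = new SqlCommand(Query, conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                // Build an empty DataTable matching the reader's schema.
                var batch = new DataTable();
                for (int i = 0; i < reader.FieldCount; i++)
                    batch.Columns.Add(reader.GetName(i), reader.GetFieldType(i));

                // Stream rows from the reader instead of loading them all at once.
                while (reader.Read())
                {
                    var row = batch.NewRow();
                    for (int i = 0; i < reader.FieldCount; i++)
                        row[i] = reader.GetValue(i);
                    batch.Rows.Add(row);

                    if (batch.Rows.Count >= BatchSize)
                    {
                        LoadBatch(batch);   // push this chunk to the destination
                        batch.Rows.Clear(); // release the rows before reading more
                    }
                }
                if (batch.Rows.Count > 0)
                    LoadBatch(batch);       // final partial chunk
            }
        }
    }

    // Stand-in for my dynamically generated load; shown here as a plain
    // SqlBulkCopy into a fixed destination table.
    static void LoadBatch(DataTable batch)
    {
        using (var bulk = new SqlBulkCopy(DestConnStr))
        {
            bulk.DestinationTableName = "dbo.DestTable";
            bulk.WriteToServer(batch);
        }
    }
}

Would keeping only BatchSize rows in memory at a time like this avoid the out-of-memory failure, or is there a better-established pattern for SSIS?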
October 30, 2013 at 9:14 am
Duplicate post. See the following thread:
http://www.sqlservercentral.com/Forums/Topic1509553-364-1.aspx