I am running dynamic queries that often return huge (300 MB - 1 GB) result sets on the initial load. Later loads should not be this big (though I am not sure about that) because I will be using delta loading. These result sets are loaded into a C# DataTable. A script loops over the rows and generates a query (stored in an SSIS variable) to load them into the appropriate destination columns (determined by other scripts).
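To give an idea, the script task does roughly the following (the connection string, query, and target table below are simplified placeholders, and the real column mapping is done by other scripts):

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.Text;

public static class CurrentApproach
{
    public static string BuildLoadQuery(string connectionString, string dynamicQuery)
    {
        var table = new DataTable();

        // The whole result set (300 MB - 1 GB) ends up in memory here.
        using (var connection = new SqlConnection(connectionString))
        using (var adapter = new SqlDataAdapter(dynamicQuery, connection))
        {
            adapter.Fill(table);
        }

        // Loop over the rows and build the statements that get stored
        // in the SSIS variable and executed later.
        var sql = new StringBuilder();
        foreach (DataRow row in table.Rows)
        {
            sql.AppendLine(GenerateInsertFor(row)); // placeholder for the real column-mapping logic
        }
        return sql.ToString();
    }

    private static string GenerateInsertFor(DataRow row)
    {
        // Simplified; destination columns are actually resolved by other scripts.
        return string.Format("INSERT INTO dbo.Target VALUES ('{0}')", row[0]);
    }
}
```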
For small result sets, the package runs fine, but for big ones it fails with an out-of-memory error.
How do I resolve this problem? Can you suggest some strategies? I guess I could fetch smaller chunks of the data at a time and load each chunk into the target, but I am not sure how to go about it. Is there a standard recipe for this?
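This is roughly what I picture for the chunked approach, assuming SQL Server 2012+ so that OFFSET/FETCH is available and the source has a stable key to order by; the table names, key column, and batch size are just placeholders. Is this a sensible direction, or is there a better SSIS-native way?

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

public static class ChunkedLoad
{
    public static void LoadInBatches(string sourceConnString, string destConnString)
    {
        const int batchSize = 50000; // tune so one batch fits comfortably in memory
        int offset = 0;

        while (true)
        {
            var batch = new DataTable();

            // Fetch only one page of the result set at a time.
            using (var connection = new SqlConnection(sourceConnString))
            using (var command = new SqlCommand(
                @"SELECT *
                  FROM dbo.SourceTable
                  ORDER BY Id
                  OFFSET @Offset ROWS FETCH NEXT @BatchSize ROWS ONLY", connection))
            {
                command.Parameters.AddWithValue("@Offset", offset);
                command.Parameters.AddWithValue("@BatchSize", batchSize);
                using (var adapter = new SqlDataAdapter(command))
                {
                    adapter.Fill(batch);
                }
            }

            if (batch.Rows.Count == 0)
                break; // no more rows to load

            // Push just this batch to the destination, then let it go out of scope
            // so the memory can be reclaimed before the next page is fetched.
            using (var destConnection = new SqlConnection(destConnString))
            using (var bulkCopy = new SqlBulkCopy(destConnection))
            {
                destConnection.Open();
                bulkCopy.DestinationTableName = "dbo.TargetTable";
                bulkCopy.WriteToServer(batch);
            }

            offset += batchSize;
        }
    }
}
```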