Memory management in script task?
Posted Monday, January 20, 2014 4:14 PM


Group: General Forum Members
Last Login: Thursday, May 15, 2014 8:51 PM
Points: 183, Visits: 480
I have no choice but to load multiple result sets into memory via a Script Task. I will then use the Script Task to read each of these result sets and combine the relevant columns (as per my rules) into a txt file. There is no certainty about how big the data set will be or how many tables will have to be handled. My ETL will simply see which tables to get data from and then store them in memory for processing.

The worst case is that I run out of memory. So, is there a simple way I can page a result set to the hard disk and extract it in parts? Is there a better way to do what I am trying to do?
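The general pattern being asked about here (read a result set in fixed-size chunks rather than materialising it all at once) is language-neutral. As a rough sketch only, using Python's standard-library sqlite3 module as a stand-in for the real source, and with all names (`stream_rows`, the table `t`, `out.txt`) being hypothetical:

```python
import sqlite3

def stream_rows(conn, query, chunk_size=5000):
    """Yield rows in fixed-size chunks via fetchmany(), instead of
    pulling the whole result set into memory with fetchall()."""
    cur = conn.execute(query)
    while True:
        chunk = cur.fetchmany(chunk_size)
        if not chunk:
            break
        yield chunk

# Demo against an in-memory table standing in for the real source tables.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER, val TEXT)")
conn.executemany("INSERT INTO t VALUES (?, ?)",
                 [(i, f"row{i}") for i in range(12)])

# Write the combined columns out as tab-separated text, chunk by chunk,
# so memory use is bounded by chunk_size rather than the table size.
with open("out.txt", "w") as f:
    for chunk in stream_rows(conn, "SELECT id, val FROM t", chunk_size=5):
        for row in chunk:
            f.write("\t".join(map(str, row)) + "\n")
```

In an SSIS Script Task the same idea would be an `SqlDataReader` loop in C#, which streams rows forward-only rather than buffering them; the sketch above just shows the chunking shape of the approach.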

Post #1532810
Posted Tuesday, January 21, 2014 2:09 AM


Group: General Forum Members
Last Login: Wednesday, January 27, 2016 7:06 AM
Points: 2,657, Visits: 1,657
You can use a Raw File Destination in a Data Flow Task which writes data to a .raw file on the filesystem in SQL Server native format. You can also use the same .raw file as a Raw File Source in a subsequent Data Flow Task.
There are three properties of a Data Flow task that you might want to look at:

BufferTempStoragePath - the location of files (can be multiple locations) that temporarily store buffer data that can no longer fit in memory.
DefaultBufferMaxRows and DefaultBufferSize - these two properties are related, and while they can't increase the total amount of memory available, adjusting them can help you make optimal use of the memory that you do have. It's a bit of a black art, mind! See this TechNet link for more details.
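The way those two properties interact can be sketched with a bit of arithmetic. My understanding (an assumption worth verifying against the documentation) is that SSIS caps the rows in a buffer at whichever limit it hits first: the DefaultBufferMaxRows row cap, or however many estimated-size rows fit in DefaultBufferSize bytes. The defaults below (10,000 rows, 10 MB) are the documented SSIS defaults; the function name is hypothetical:

```python
# Rough model of SSIS buffer sizing (assumption: rows per buffer is the
# smaller of the row cap and the byte budget divided by the row width).
DEFAULT_BUFFER_SIZE = 10 * 1024 * 1024   # DefaultBufferSize default: 10 MB
DEFAULT_BUFFER_MAX_ROWS = 10_000         # DefaultBufferMaxRows default

def rows_per_buffer(estimated_row_bytes,
                    buffer_size=DEFAULT_BUFFER_SIZE,
                    max_rows=DEFAULT_BUFFER_MAX_ROWS):
    return min(max_rows, buffer_size // estimated_row_bytes)

# A wide 5 KB row exhausts the 10 MB byte budget well before the row cap:
print(rows_per_buffer(5 * 1024))   # 2048 rows per buffer
# A narrow 100-byte row hits the 10,000-row cap instead, leaving most
# of the buffer's byte budget unused:
print(rows_per_buffer(100))        # 10000 rows per buffer
```

This is why tuning is per-package: for wide rows you may want a larger DefaultBufferSize, while for narrow rows raising DefaultBufferMaxRows is what actually fills the buffer.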

Post #1532909