Memory management in script task?

  • I have no choice but to load multiple result sets into memory via a Script Task. The Script Task then reads each of these result sets and combines the relevant columns (as per the rules) into a txt file. There is no certainty about how big the data sets will be or how many tables will have to be handled; my ETL simply sees which tables to get data from and then stores them in memory for processing.

    The worst case could be that I run out of memory. So, is there a simple way I can page a result set to the hard disk and extract it in parts? Is there a better way to do what I am trying to do? (A streaming sketch follows at the end of this post.)

    Thanks.
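    For illustration, here is a minimal sketch of one streaming alternative, assuming a SQL Server source; the connection string, table list, columns and output path are placeholders rather than anything taken from this thread. Instead of materialising each result set in memory, a SqlDataReader pulls one row at a time and the relevant columns are appended straight to the txt file, so memory use stays flat however large the tables are.

        // Minimal sketch: stream rows straight to the output file instead of
        // loading whole result sets into memory. Connection string, table names,
        // columns and output path are placeholders.
        using System.Data.SqlClient;
        using System.IO;

        class StreamToTextFile
        {
            static void Main()
            {
                // Placeholder connection string and table list (the real ETL would
                // discover these at run time).
                string connectionString = "Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI;";
                string[] tables = { "dbo.TableA", "dbo.TableB" };

                using (var writer = new StreamWriter(@"C:\Output\combined.txt"))
                using (var connection = new SqlConnection(connectionString))
                {
                    connection.Open();
                    foreach (string table in tables)
                    {
                        using (var command = new SqlCommand("SELECT Col1, Col2 FROM " + table, connection))
                        using (SqlDataReader reader = command.ExecuteReader())
                        {
                            while (reader.Read())
                            {
                                // Apply the column-combination rules per row; this
                                // example simply pipe-delimits two columns.
                                writer.WriteLine(reader[0] + "|" + reader[1]);
                            }
                        }
                    }
                }
            }
        }

    The same reader loop can sit inside the Script Task's Main method; nothing above ever holds more than one row in memory at a time.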

  • You can use a Raw File Destination in a Data Flow Task, which writes data to a .raw file on the file system in SSIS's native format. You can then use the same .raw file as a Raw File Source in a subsequent Data Flow Task.

    There are three properties of a Data Flow Task that you might want to look at:

    BufferTempStoragePath - the location of the files (there can be more than one location) that temporarily store buffer data that no longer fits in memory.

    DefaultBufferMaxRows and DefaultBufferSize - these two properties are related, and while they can't increase the total amount of memory available, adjusting them can help you make optimum use of the memory that you do have (a rough sizing sketch follows at the end of this post). It's a bit of a black art, mind! See this TechNet link for more details.

    Regards

    Lempster
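    To make the relationship between those two properties concrete, here is a rough, simplified calculation; the 10,000-row and 10 MB figures are the shipping defaults, the 200-byte estimated row size is purely an assumption, and the engine's internal minimum and maximum buffer sizes are ignored. The pipeline puts at most DefaultBufferMaxRows rows into a buffer, but fewer if that many rows would blow past DefaultBufferSize.

        // Simplified sizing illustration only; the real engine also enforces
        // internal minimum and maximum buffer sizes. The 200-byte estimated row
        // size is an assumption.
        using System;

        class BufferSizing
        {
            static void Main()
            {
                int defaultBufferMaxRows = 10000;          // shipping default
                int defaultBufferSize = 10 * 1024 * 1024;  // shipping default (10 MB)
                int estimatedRowSizeBytes = 200;           // assumed for illustration

                int rowsPerBuffer = Math.Min(defaultBufferMaxRows,
                                             defaultBufferSize / estimatedRowSizeBytes);
                Console.WriteLine("Approximate rows per buffer: " + rowsPerBuffer);
                // 10 MB / 200 bytes leaves room for ~52,000 rows, so the 10,000-row
                // cap wins here; much wider rows would push the figure below 10,000.
            }
        }

    The estimated row size comes from the data flow's column metadata, so trimming columns you don't need is often the simplest way to fit more rows into each buffer.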
