DefaultBufferMaxRows and DefaultBufferSize

  • Hi SSC,

    I'm having a little trouble wrapping my head around this in order to solve my OOM exception.

    Here's what I got:

    - XML DataSource; most string columns are defined as DT_WSTR(255)

    - roughly 80 Destination Tables from this XML DS

    Now there is this supposedly nice option called DefaultBufferMaxRows. Which rows exactly does it mean? Source? Destination? Both (as a sum)?

    Related to this, DefaultBufferSize: does it apply per source, per destination, or per row (as in row length * DefaultBufferMaxRows)?

    Each destination also has its own Maximum Insert Commit Size; how does that relate to DefaultBufferSize?
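    To make my question concrete, here's the back-of-envelope model I've been assuming (just a sketch in Python; the column count is a made-up placeholder, and I'm only guessing that SSIS targets estimated row size * DefaultBufferMaxRows per buffer, capped by DefaultBufferSize):

    ```python
    # Back-of-envelope model of SSIS buffer sizing (per data flow path, not
    # per package). All concrete figures are placeholders for my package.

    DEFAULT_BUFFER_MAX_ROWS = 10_000        # DefaultBufferMaxRows
    DEFAULT_BUFFER_SIZE     = 10 * 1024**2  # DefaultBufferSize, 10 MB default

    # DT_WSTR(255) is Unicode, so up to 255 * 2 bytes per column.
    string_cols   = 20                      # made-up column count
    est_row_bytes = string_cols * 255 * 2   # ~10,200 bytes per row

    # My assumption: SSIS aims for est_row_bytes * max_rows per buffer, but
    # lowers the rows-per-buffer count if that would exceed DefaultBufferSize.
    rows_per_buffer = min(DEFAULT_BUFFER_MAX_ROWS,
                          DEFAULT_BUFFER_SIZE // est_row_bytes)
    buffer_bytes = rows_per_buffer * est_row_bytes

    print(f"estimated row size: {est_row_bytes:,} bytes")
    print(f"rows per buffer   : {rows_per_buffer:,}")
    print(f"buffer size       : {buffer_bytes / 1024**2:.1f} MB")
    ```

    If that model is right, a wide Unicode row hits the 10 MB cap long before it ever reaches DefaultBufferMaxRows rows.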

    I can't say whether every row would produce an entry in every single table or not. This package was previously working just fine on SQL Server 2014, but on SQL Server 2017 I'm constantly running OOM with the same values that were in place prior to the upgrade. For fun's sake I've even reduced the max server memory limit by another 10 GB on the DEV environment to avoid truly running OOM. During execution, used memory rises from 89% to 90% (which equals 2.56 GB used by SSIS), and that's about it.

    Enabling AutoAdjustBufferSize didn't mitigate running "OOM" either. Even when it fails, it merely says it failed to allocate 35-70 MB of RAM, which is an utterly useless statement when looking at total system memory.

    Worth mentioning: it doesn't run OOM right away, but rather after something like 5 minutes of execution time.

  • Well, I've read that, but things don't add up for me somehow. In the old world I had DefaultBufferMaxRows set to 10,000 and things were fine; on 2017 the new environment is still failing even with 1,000 rows.
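    If the cap model from my first post is right, then 10,000 vs. 1,000 might not even matter for wide rows; here's a quick check using the same made-up row width as before:

    ```python
    # Quick check: under the cap model, both settings may collapse to nearly
    # the same effective rows-per-buffer. est_row_bytes is my guess again.

    est_row_bytes = 20 * 255 * 2    # same made-up wide row as in my sketch
    buffer_cap    = 10 * 1024**2    # 10 MB DefaultBufferSize

    for max_rows in (10_000, 1_000):
        effective = min(max_rows, buffer_cap // est_row_bytes)
        print(f"DefaultBufferMaxRows={max_rows:>6,} -> {effective:,} rows/buffer")
    ```

    Both land at roughly a thousand rows per buffer, which would explain why lowering the setting changed nothing.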

    I have to do a partial full load on the first run, since the first time there are more files to be processed than on subsequent runs. However, this shouldn't affect processing, because with DefaultBufferMaxRows I'm specifying how many rows of the total I want to process within a batch, right? It's not failing right away either, but after almost 3 hours of processing. There is no Script Component or anything like that involved; it's literally just: get data from the source XML, add the Import ID, and send all of it straight to the destinations. The rough math on what that means across all the destination paths is sketched below.
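    For what it's worth, here's the rough total that worries me (pure guesswork on my side: I don't know how many buffers SSIS actually keeps in flight per destination path, so I just try a few values):

    ```python
    # Rough total-memory guess across all destination paths.
    # Assumption (mine, not documented): each of the ~80 destination paths
    # buffers independently, with some number of buffers in flight at once.

    paths     = 80    # roughly 80 destination tables
    buffer_mb = 10    # per buffer, using the 10 MB DefaultBufferSize default

    for in_flight in (1, 2, 5):
        total_mb = paths * in_flight * buffer_mb
        print(f"{in_flight} buffer(s) in flight per path -> ~{total_mb:,} MB total")
    ```

    Even a few buffers per path puts the total in the same ballpark as the 2.56 GB I see SSIS using before it fails, so maybe the per-path multiplication matters more than the per-buffer settings?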
