Failed to allocate a managed memory buffer of [x] bytes -- despite plenty of memory being available.

  • It starts with the proverbial:

    [Notes - F1 [107]] Error: An error occurred with the following error message: "System.OutOfMemoryException: Insufficient memory to continue the execution of the program. (SSIS Integration Toolkit for Microsoft Dynamics 365, v10.2.0.6982 - DtsDebugHost, v13.0.1601.5)".

    But even its own diagnostics show that plenty of memory is available (yes, that's the 32GB I have on my system):

    Error: The system reports 47 percent memory load. There are 34270687232 bytes of physical memory with 18094620672 bytes free. There are 4294836224 bytes of virtual memory with 981348352 bytes free. The paging file has 34270687232 bytes with 12832284672 bytes free.

    The info messages report memory pressure:

    Information: The buffer manager failed a memory allocation call for 506870912 bytes, but was unable to swap out any buffers to relieve memory pressure. 2 buffers were considered and 2 were locked. Either not enough memory is available to the pipeline because not enough is installed, other processes are using it, or too many buffers are locked.

    I currently have the max rows set at 500, with the buffer size at 506,870,912 in this example. I've tried the maximum buffer size, but that fails instantly, and the minimum buffer size still throws errors. I've fiddled with various sizes, but it never gets anywhere close to processing the whole data set. The error I get when I set the DefaultBufferSize lower is:

    [Notes - F1 [107]] Error: An error occurred with the following error message: "KingswaySoft.IntegrationToolkit.DynamicsCrm.CrmServiceException: CRM service call returned an error: Failed to allocate a managed memory buffer of 536870912 bytes. The amount of available memory may be low. (SSIS Integration Toolkit for Microsoft Dynamics 365, v10.2.0.6982 - DtsDebugHost,
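
    For perspective, converting the figures from the diagnostics into megabytes makes the mismatch clearer. A quick sketch (the byte counts are copied verbatim from the logs above; the note about contiguity is general .NET allocation behavior, not something the log states):

    using System;

    class BufferMath
    {
        static void Main()
        {
            const long totalVirtual = 4294836224;  // "4294836224 bytes of virtual memory"
            const long freeVirtual  = 981348352;   // "981348352 bytes free"
            const long allocation   = 536870912;   // the failed allocation

            Console.WriteLine($"Virtual address space: {totalVirtual / 1024 / 1024} MB"); // ~4095 MB, the 32-bit ceiling
            Console.WriteLine($"Free virtual memory:   {freeVirtual / 1024 / 1024} MB");  // ~935 MB
            Console.WriteLine($"Requested buffer:      {allocation / 1024 / 1024} MB");   // 512 MB

            // A 512MB managed buffer needs one contiguous block. Even with ~935MB
            // nominally free, a fragmented 4GB address space often cannot supply it,
            // which is consistent with the OutOfMemoryException despite 18GB of free RAM.
        }
    }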

    I've looked for resources on how to tune this, but cannot find anything relevant to a 64-bit Windows 10 machine (as opposed to a server) that has 32GB of RAM to play with.

    For a bit more context, I'm migrating notes from one D365 CRM environment to another using KingswaySoft. The notes with attachments are the ones causing the issue. There should be about 32K rows in the source, all of which have ntext representing the file; I split the records that don't have a file into a different DTF. The vast majority of the file sizes are < 1MB, with only 880 or so being > 1MB.
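
    To get a feel for what a 500-row buffer holds, here is a back-of-the-envelope sizing sketch. Only the 32K row count and the 880 large-attachment count come from the post; both average attachment sizes are assumptions:

    using System;

    class DataSetEstimate
    {
        static void Main()
        {
            const int  totalRows     = 32000;      // "about 32K rows in the source"
            const int  largeRows     = 880;        // rows with attachments > 1MB
            const long avgSmallBytes = 300_000;    // assumed average for the < 1MB majority
            const long avgLargeBytes = 3_000_000;  // assumed average for the > 1MB rows

            long estimatedBlobBytes = (totalRows - largeRows) * avgSmallBytes
                                    + largeRows * avgLargeBytes;
            Console.WriteLine($"Estimated total blob volume: {estimatedBlobBytes / 1024 / 1024} MB");

            // With DefaultBufferMaxRows = 500, one buffer references 500 rows of
            // ntext data at once -- roughly 143MB at the assumed small-file average --
            // before SSIS starts spooling blob data to BLOBTempStoragePath.
            const int rowsPerBuffer = 500;
            Console.WriteLine($"Blob data per buffer (approx): {rowsPerBuffer * avgSmallBytes / 1024 / 1024} MB");
        }
    }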

    Here is the overall structure of the DTF, along with the source and destination component properties: (screenshots not reproduced)

  • I would try the following (a scripted version of the data flow settings is sketched after this list).

    On the data flow task properties:

    • set DefaultBufferSize to 100000000 (nearly the max)
    • set DefaultBufferMaxRows to 100
    • set BLOBTempStoragePath to a folder on a drive with enough space
    • set BufferTempStoragePath to a folder on a drive with enough space

    On the destination component:

    • change the multithread setting to 2
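
    If it is easier to script these than to click through the designer, the same data flow properties are exposed through the SSIS runtime API. A minimal sketch (the package path and task name are hypothetical; the destination component's multithread setting is KingswaySoft-specific and lives in its own editor, so it is not shown):

    using Microsoft.SqlServer.Dts.Runtime;
    using Microsoft.SqlServer.Dts.Pipeline.Wrapper;

    class TuneDataFlow
    {
        static void Main()
        {
            Application app = new Application();
            Package pkg = app.LoadPackage(@"C:\packages\MigrateNotes.dtsx", null); // hypothetical path

            // "Data Flow Task" is a placeholder for the actual task name in the package.
            TaskHost dft = (TaskHost)pkg.Executables["Data Flow Task"];
            MainPipe pipe = (MainPipe)dft.InnerObject;

            pipe.DefaultBufferSize    = 100000000;  // nearly the max
            pipe.DefaultBufferMaxRows = 100;
            pipe.BLOBTempStoragePath   = @"D:\ssis\blobtemp";   // drive with enough free space
            pipe.BufferTempStoragePath = @"D:\ssis\buffertemp";

            app.SaveToXml(@"C:\packages\MigrateNotes.dtsx", pkg, null);
        }
    }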
  • In reply to frederico_fonseca - Friday, November 2, 2018 3:19 AM (suggestions quoted above):

    Just gave that option a whirl, and it was another no-go. I'm thinking the issue revolves around 32-bit restrictions, despite being on a 64-bit machine and running in 64-bit debug mode. Notably, the diagnostic above reports only 4294836224 bytes (~4GB) of total virtual memory, which is exactly the 32-bit address-space ceiling, so the failing process may not actually be running as 64-bit.
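
    One way to verify that suspicion is to log the bitness from inside the running process (for example, from a Script Task in the same package); a minimal sketch:

    using System;

    class BitnessCheck
    {
        static void Main()
        {
            Console.WriteLine($"64-bit OS:      {Environment.Is64BitOperatingSystem}");
            Console.WriteLine($"64-bit process: {Environment.Is64BitProcess}");
            Console.WriteLine($"Pointer size:   {IntPtr.Size * 8} bits"); // 32 or 64
        }
    }

    If this reports a 32-bit process even with Run64BitRuntime set to True, the ~4GB virtual memory ceiling in the diagnostics is explained.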
