• In my case this was caused by the following properties:

    [Data Flow Task]

    DefaultBufferMaxRows

    DefaultBufferSize

    [SQL Server Destination]

    MaxInsertCommitSize

    You need to experiment with these three parameters to find the right values for your hardware/software/database configuration.

    Start with MaxInsertCommitSize. It defaults to 0, which means the entire bulk insert is committed in a single transaction; with a large dataset this can exhaust buffer memory. In my case a transfer of 11 million rows failed, and I have now set it to 250,000. DefaultBufferMaxRows is set to 10,000 and DefaultBufferSize to 104857600 bytes (100 MB). As the articles below advise, avoid letting buffers spill to disk at all costs.
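    For context on how the two buffer properties interact: SSIS caps each data-flow buffer at DefaultBufferSize bytes and at DefaultBufferMaxRows rows, whichever limit is reached first, so the useful row cap depends on your estimated row width. The sketch below is plain arithmetic (not SSIS code), and the 1 KB row width is an assumed value for illustration only; substitute the actual width of your rows.

        # Rough sizing sketch for SSIS data-flow buffers.
        # The estimated row width (1 KB) is an assumption for illustration;
        # use the real width of your row (sum of column sizes) instead.

        DEFAULT_BUFFER_SIZE = 104857600   # DefaultBufferSize: 100 MB, as set above
        DEFAULT_BUFFER_MAX_ROWS = 10000   # DefaultBufferMaxRows, as set above
        ESTIMATED_ROW_WIDTH = 1024        # bytes per row (assumed)

        # A buffer fills until it hits either the row cap or the byte cap.
        rows_per_buffer = min(DEFAULT_BUFFER_MAX_ROWS,
                              DEFAULT_BUFFER_SIZE // ESTIMATED_ROW_WIDTH)

        print(f"Rows per buffer: {rows_per_buffer}")
        print(f"Approx. memory per buffer: "
              f"{rows_per_buffer * ESTIMATED_ROW_WIDTH / 1024**2:.1f} MB")

        # If rows_per_buffer * ESTIMATED_ROW_WIDTH is far below DefaultBufferSize,
        # there is headroom to raise DefaultBufferMaxRows; if buffers spill to
        # disk, lower the values instead.

    If the calculated memory per buffer is much smaller than DefaultBufferSize, the byte limit is doing nothing and the row cap is the one worth tuning; if the package starts paging or spilling buffers, scale the values back down.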

    Read these articles for some very useful tips:

    Top 10 SQL Server Integration Services Best Practices

    Improving the Performance of the Data Flow