Use of Maximum Insert Commit Size property in fast load option of OLE DB Destination?

  • Hi,

    I want to know: what are the advantages and disadvantages of the Maximum Insert Commit Size and Rows Per Batch properties in the fast load data access mode of the OLE DB Destination?

    Please, can anybody help me?

  • There are many reasons to manage these values. For instance, if you have millions of rows being loaded by your ETL process, the time to commit everything in a single transaction may be too long. You may want to commit portions of the data in the event of a failure in the package before all of the data is loaded. You may simply want to have small transactions so your log file does not grow out of hand.
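    The idea above can be illustrated with a minimal T-SQL sketch of chunked commits (table, column, and batch-size names are hypothetical, for illustration only; this is what the destination effectively does for you when a commit size is set):

    ```sql
    -- Hypothetical sketch: commit a large load in smaller batches so a
    -- failure loses only the current batch and the log stays manageable.
    -- dbo.StagingTable and dbo.TargetTable are placeholder names.
    DECLARE @BatchSize INT = 100000;
    DECLARE @Rows INT = 1;

    WHILE @Rows > 0
    BEGIN
        BEGIN TRANSACTION;

        -- Move one batch of not-yet-loaded rows into the target.
        INSERT INTO dbo.TargetTable (Id, Payload)
        SELECT TOP (@BatchSize) s.Id, s.Payload
        FROM dbo.StagingTable AS s
        WHERE NOT EXISTS (SELECT 1 FROM dbo.TargetTable AS t WHERE t.Id = s.Id);

        SET @Rows = @@ROWCOUNT;

        COMMIT TRANSACTION;  -- each batch is durable on its own
    END;
    ```

    Each iteration commits independently, so an error part-way through leaves the earlier batches in place rather than rolling back the entire load.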

  • Hi,

    Thanks for the reply.

    But I have a package which updates/inserts around 12 million records, and I want to test the package on a small set of records first, just to confirm that it succeeds. Can I use this option in that case?

  • I don't think this is what you want to do. If you are just trying to test the entire package on a sample set, you probably want to reduce the set of data being run through the package. To do this, add a TOP clause (SELECT TOP 1000 * FROM) to your data source. That will run only a small number of records through the package.

    If your source is not a SQL database, you can do this in other ways - let me know what your data source is and I can help.

    These max commit and batch size options in the destination really just split up the chunks of data as they are inserted into your destination. The package will still move all 12 million records; it will just commit them in smaller groups. With a very large group of records like yours, you may find you do need to use this option (in fact, I would bet on it), but for testing the package, reduce the size of your source data.
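    A minimal sketch of the kind of source query to use while testing (the table name is a placeholder; substitute your real source):

    ```sql
    -- Pull only a small sample through the package while testing.
    -- dbo.SourceTable is a hypothetical name for your actual source.
    SELECT TOP 1000 *
    FROM dbo.SourceTable;
    ```

    Once the package runs cleanly against the sample, remove the TOP clause to run the full load.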

  • Thanks a lot for this info...

  • Hi,

    Do you know of any good books or sites where I can learn more about SSIS, since I am working on it?

    When I am executing a DTS package on SQL 2008, I am getting this error: Description: The Maximum insert commit size property of the OLE DB destination "OLE DB Destination" is set to 0. This property setting can cause the running package to stop responding. For more information, see the F1 Help topic for OLE DB Destination.

    Where is the OLE DB destination in Management Studio, so I can change the Maximum insert commit size property from 0 to some higher value?

    Please help

    Thanks

    Prakash Gupta

  • From MSDN:

    A value of 0 might cause the running package to stop responding if the OLE DB destination and another data flow component are updating the same source table.

    To prevent the package from stopping, set the Maximum insert commit size option to 2147483647.

    Best regards

    Helmut Krampl

  • Try taking the course from Pragmatic Works - pragmaticworks.com

  • Buy this book: Professional Microsoft SQL Server 2008 Integration Services

    You will also find some valuable information on this site: sqlis

    Regards,

    Willem
    http://wschampheleer.wordpress.com

  • I need to load a flat file of 12 million records into a table. I want it to be all or none: the load either succeeds or fails as a whole.

    So if I set Maximum insert commit size = 1,000,000 (one million), it will need to commit 12 batches to finish.

    My question is: if the package fails on the 3rd batch, does that mean my table would have 2 million rows committed, even though I have set the Data Flow's TransactionOption = 'Required'? Or would it roll back everything, leaving my table with 0 rows?

    My goal is for the package to be all or none, but I also don't want to lock up the table for too long. Any help is appreciated.
