SSIS data flow fails but inserts a few rows

  • Hello,
    This happened a few days ago: a data flow from a flat file source to a SQL table failed when it encountered a different data type in a column, but to my surprise, it didn't roll back the transaction. Several rows were left inserted into the table. Is this normal behavior? Does SSIS split a data flow into multiple transactions? The row count for this data flow is about 230,000.
    Thanks!

    IsolationLevel is set to Serializable and TransactionOption is set to Supported.
    I found this comment: "The IsolationLevel property in SSIS components only applies when distributed transactions are used (the package or another container has TransactionOption=Required)." To me, that means you have to set TransactionOption=Required on every package; otherwise you can't control what happens inside a data flow? A sketch of what I think Required amounts to is below.
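    If I understand it right, TransactionOption=Required is roughly the SSIS equivalent of wrapping the whole load in one explicit transaction. A minimal plain T-SQL sketch of the idea (the table name and file path are made up):

        -- One transaction around the whole load, so a failure leaves nothing behind.
        -- dbo.StagingDemo and the file path are hypothetical.
        BEGIN TRANSACTION;
        BEGIN TRY
            BULK INSERT dbo.StagingDemo
            FROM 'C:\loads\demo.csv'
            WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');
            COMMIT TRANSACTION;
        END TRY
        BEGIN CATCH
            ROLLBACK TRANSACTION;  -- a mid-load failure rolls back every row
        END CATCH;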

  • Yes, it's quite possible for this to happen, especially with a data error in the source.
    There is a way to control the number of records per transaction. For example, the ADO NET Destination in a Data Flow has a BatchSize property, and the OLE DB Destination has a Rows Per Batch setting in its editor (see the sketch after the link below):
    http://www.developer.com/db/top-10-methods-to-improve-etl-performance-using-ssis.html
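    Here is a rough plain T-SQL illustration of the same idea: a batch size turns one load into several independent commits (the table and file names are made up):

        -- Each 10,000-row batch commits on its own, which is roughly what the
        -- destination's batch size / commit size settings do inside a data flow.
        BULK INSERT dbo.StagingDemo
        FROM 'C:\loads\demo.csv'
        WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', BATCHSIZE = 10000);

        -- If a bad value appears in row 25,001, the first two batches
        -- (20,000 rows) are already committed; only the failing batch rolls back.
        SELECT COUNT(*) AS RowsLeftBehind FROM dbo.StagingDemo;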

  • From what I'm reading, this setting is only a hint used by the optimizer:
    ROWS_PER_BATCH = rows_per_batch
    Indicates the approximate number of rows of data in the data file.

    By default, all the data in the data file is sent to the server as a single transaction, and the number of rows in the batch is unknown to the query optimizer.

    But that's what FastLoadMaxInsertCommitSize does, and I have it set to the default (2147483647):
    "Some fast load options are stored in specific properties of the OLE DB destination. For example, FastLoadKeepIdentity specifies whether to keep identify values, FastLoadKeepNulls specifies whether to keep null values, and FastLoadMaxInsertCommitSize specifies the number of rows to commit as a batch. Other fast load options are stored in a comma-separated list in the FastLoadOptions property. If the OLE DB destination uses all the fast load options that are stored in FastLoadOptions and listed in the OLE DB Destination Editor dialog box, the value of the property is set to TABLOCK, CHECK_CONSTRAINTS, ROWS_PER_BATCH=1000. The value 1000 indicates that the destination is configured to use batches of 1000 rows. "

    So I went and checked FastLoadOptions in the Advanced Editor, and the list is empty, which to me means this shouldn't happen?
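    For what it's worth, the BULK INSERT equivalents make the distinction clearer: ROWS_PER_BATCH is only a cardinality hint, while BATCHSIZE is what actually sets the commit boundary (FastLoadMaxInsertCommitSize plays the BATCHSIZE role in the OLE DB Destination; the table, file, and values below are hypothetical):

        -- ROWS_PER_BATCH is only a hint to the optimizer; it does not change commits.
        BULK INSERT dbo.StagingDemo
        FROM 'C:\loads\demo.csv'
        WITH (ROWS_PER_BATCH = 230000);  -- still one transaction for the whole file

        -- BATCHSIZE is what actually commits every N rows.
        BULK INSERT dbo.StagingDemo
        FROM 'C:\loads\demo.csv'
        WITH (BATCHSIZE = 50000);        -- commits after every 50,000 rows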
