February 9, 2012 at 11:49 am
I have an SSIS package (2008 R2) that reads a CSV file and loads a SQL Server table through an OLE DB connection (data access mode: Table or view - fast load).
Maximum insert commit size is left at its default value, 2147483647.
The destination table has a column defined as decimal (15,4).
Each CSV file has 144 rows. If one record has bad data, I do not want any of the 144 records loaded.
Here's what happens once in a while.
Because of a piece of bad data, I get an OLEDB Error
SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft SQL Server Native Client 10.0" Hresult: 0x80004005 Description: "Invalid character value for cast specification".
There was an error with input column ""PgPgOut_S"" (253) on input "OLE DB Destination Input" (197). The column status returned was: "Conversion failed because the data value overflowed the specified type.".
The csv file has a value 429496730000 which is too big for the database column.
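As a quick sanity check: decimal(15,4) reserves 4 of its 15 digits for the fractional part, leaving only 11 integer digits, so the largest value the column can hold is 99999999999.9999. A minimal T-SQL sketch (assuming the column type above) reproduces the overflow:

```sql
-- decimal(15,4) tops out at 99999999999.9999 (11 integer digits)
DECLARE @ok  decimal(15,4) = 99999999999.9999;  -- fits
DECLARE @bad decimal(15,4) = 429496730000;      -- 12 integer digits
-- Msg 8115: Arithmetic overflow error converting numeric to data type numeric.
```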
The error happens on a row in the middle of the file, let's say row 50.
The 49 previous records still get inserted into the destination table.
Why is that?
How do I prevent any of the data from this file from being inserted if there is a problem with one of the records in the OLE DB Destination?
I thought that setting "Maximum insert commit size" so high would keep all the destination records in one transaction and prevent the commit if there was a failure.
Am I missing something here?
Please help.
Thanks
February 10, 2012 at 1:47 am
It's the maximum commit size, so it only defines a maximum, not a minimum.
Try also setting the rows per batch, and enable transactions to be sure.
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
February 10, 2012 at 11:02 am
Koen,
Thank you for your response.
After some testing, the only thing that works here is enabling the transaction on the container.
I changed the TransactionOption to Required on the ForEach Loop container.
We haven't used transactions before in SSIS.
In development, I'm running BIDS on my local workstation, and the table I am inserting into is on a remote database server.
The only way I was able to get the transaction to work was to use a local database connection to a local table; then the distributed transaction works.
- I run BIDS
- I run the package locally
- The database is local
Then the distributed transaction works, and the records are not committed to the database if there's a failure.
But what I want to be able to do is develop and run the package locally with a remote database connection.
That's the piece that is not working.
How do the two Distributed Transaction Coordinators need to be configured so that the package can use a remote database connection?
Thanks in advance.
February 13, 2012 at 12:50 am
I'm not really experienced in configuring MSDTC, but I found this article:
Troubleshooting Problems with MSDTC
What should always work is wrapping your data flow in a SQL transaction. This article explains the concept:
SSIS Nugget: RetainSameConnection property of the OLE DB Connection Manager
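A minimal sketch of that pattern (assuming RetainSameConnection is set to True on the OLE DB connection manager, so all three tasks share one session): an Execute SQL Task before the data flow opens the transaction, one on the success path commits, and one on the failure precedence constraint rolls back:

```sql
-- Execute SQL Task 1: before the data flow
BEGIN TRANSACTION;

-- Execute SQL Task 2: on the data flow's success path
COMMIT TRANSACTION;

-- Execute SQL Task 3: on the data flow's failure path
IF @@TRANCOUNT > 0
    ROLLBACK TRANSACTION;
```

This avoids MSDTC entirely, since the transaction lives inside a single SQL Server connection rather than being distributed.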
February 13, 2012 at 7:25 am
I'll work with the DBAs and server admins to get it configured correctly.
Thanks for your help!