Recently I was designing an SSIS package for loading data from an OLTP database to a DW database. As per the requirement, bad data had to be captured and its count checked. I decided to also capture the SQL exceptions that occur in the OLE DB Destination by redirecting the failed rows to an error table. But I didn't get the expected count of error records. For example, with 1000 records of which 50 are bad, the error record count after running the SSIS package exceeded 50. I was
surprised when I selected some of those error records from the error table and inserted them manually in SSMS: they inserted successfully. A second test, loading that same subset through SSIS, also succeeded. Yet when I loaded the whole 1000 records through SSIS again, the error count exceeded 50 as before. So I decided to investigate the SSIS properties further and recalled the effect of the Maximum insert commit size setting in the OLE DB Destination. This setting matters both for performance and for accurate error data capture.
The demo illustrates the above scenario.
Create SQL Objects.
create table Department (
deptid int not null primary key,
deptname varchar(50))  -- deptname column assumed; only deptid matters for the demo

-- I am using AdventureWorks DB to quickly populate the data
INSERT INTO Department
SELECT DepartmentID, Name
FROM AdventureWorks.HumanResources.Department
WHERE DepartmentID <= 10

-- Source data has 120 records, 20 of which have no reference key
-- ie. the 20 records with deptid > 10 are considered error records
create table SourceEmp (
empid int, empname varchar(50), deptid int)

INSERT INTO SourceEmp
SELECT TOP 120 [EmployeeID], 'AAAA', [DepartmentID]
FROM AdventureWorks.HumanResources.EmployeeDepartmentHistory
-- any source yielding 100 rows with deptid <= 10 and 20 rows with deptid > 10 will do

create table Employee (
empid int, empname varchar(50),
Deptid int references Department(deptid))

-- same columns as Employee, but without the foreign key
create table Error_Employee (
empid int, empname varchar(50), Deptid int)
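Before building the package, a quick sanity check of the source split can be run in SSMS. This is a sketch that assumes SourceEmp carries its department key in a deptid column (the OLE DB source below orders by the third column):

```sql
-- Expect 100 'good' rows (deptid <= 10) and 20 'bad' rows (deptid > 10)
SELECT CASE WHEN deptid <= 10 THEN 'good' ELSE 'bad' END AS bucket,
       COUNT(*) AS cnt
FROM SourceEmp
GROUP BY CASE WHEN deptid <= 10 THEN 'good' ELSE 'bad' END;
```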
Now create an SSIS package with a Data Flow task. Define an OLE DB source with the SQL command select * from SourceEmp order by 3
Drop an OLE DB destination onto the data flow and connect the source output to it with default settings. Point the destination table to Employee and map the columns.
Drop another OLE DB destination (renamed to Error Log) and connect the red arrow from the first OLE DB destination to it. In the Configure Error Output popup window, select "Redirect row" in the Error column. Point the destination table to Error_Employee and map the columns.
Now run the SSIS package and you will be surprised to see all 120 records pushed to the error table. With the default commit size, the whole data set is committed as a single batch, so one failing row causes every row in the batch to be redirected.
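You can verify in SSMS that most of the redirected rows are not genuinely bad. This is a sketch, assuming Error_Employee mirrors the Employee columns with the department key in Deptid:

```sql
-- Of the 120 redirected rows, only 20 actually violate the reference
SELECT COUNT(*) AS redirected_rows,
       SUM(CASE WHEN d.deptid IS NULL THEN 1 ELSE 0 END) AS true_error_rows
FROM Error_Employee e
LEFT JOIN Department d ON d.deptid = e.Deptid;
```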
Now right-click the OLE DB destination and select Edit. Change the Maximum insert commit size (MICS) from its default of 2147483647 to 15 and execute the package.
Now you will be surprised to see 30 records pushed to the error table. The reason is that the data is buffered in batches of 15 rows and bulk inserted into the destination. Rows 1 to 90 load successfully. But the batch covering rows 91 to 105 contains 5 error records, so the insert gets a SQL exception and the entire batch is redirected to the error log; the same happens to the last batch, rows 106 to 120, which is all error records.
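The batch arithmetic can be reproduced with a query (again a sketch, assuming a deptid column in SourceEmp): every batch that contains at least one bad row is redirected in full.

```sql
-- With a commit size of 15 and rows ordered by deptid, batches 7 and 8
-- contain bad rows, so 2 x 15 = 30 rows end up in the error table
SELECT (rn - 1) / 15 + 1 AS batch_no,
       COUNT(*) AS batch_rows,
       SUM(CASE WHEN deptid > 10 THEN 1 ELSE 0 END) AS bad_rows
FROM (SELECT deptid,
             ROW_NUMBER() OVER (ORDER BY deptid) AS rn
      FROM SourceEmp) x
GROUP BY (rn - 1) / 15 + 1
ORDER BY batch_no;
```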
Now experiment with different commit size values and check the output.
The results show that the smaller the commit size, the more accurate the error log. But a small commit size becomes a performance bottleneck, since each batch is committed separately.
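One way to keep the commit size large for performance while still isolating the genuinely bad rows is to pre-validate the reference key in a set-based check before (or instead of) relying on the SQL exception. A sketch against the tables above:

```sql
-- Identify the bad rows up front instead of waiting for the exception;
-- this could feed a separate error path while the good rows bulk load
SELECT s.*
FROM SourceEmp s
WHERE NOT EXISTS (SELECT 1 FROM Department d WHERE d.deptid = s.deptid);
```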
I would appreciate hearing if there is any better way to track the error records from an OLE DB destination.