Records Are Getting Rejected While Loading Data Into OLE DB Destinations

  • While executing an SSIS package that loads a large volume of data from an OLE DB Source into an OLE DB Destination, some of the records are getting rejected. When we execute the package a second time, the rejected records are inserted without any problem.

    I want to know why the records are being rejected. The target table has only a primary key constraint.

    Please help!

  • Between the source and destination, are you doing any lookups?

    If you haven't even tried to resolve your issue, please don't expect the hard-working volunteers here to waste their time providing links to answers which you could easily have found yourself.

  • Hi,

    While inserting into the destination table through SSIS, I sometimes get a peculiar error: a few rows are rejected for no apparent reason. When I re-execute those data flow tasks, the rejected records are inserted. The destination table has only a primary key constraint, and I have checked that the rejected records are neither duplicates nor NULL.

    Can you give some hints about the probable reason, or a solution for this situation?

    Thanks and regards,

    Debanjan

  • debanjan.ray (7/17/2012)


    Hi,

    While inserting into the destination table through SSIS, I sometimes get a peculiar error: a few rows are rejected for no apparent reason. When I re-execute those data flow tasks, the rejected records are inserted. The destination table has only a primary key constraint, and I have checked that the rejected records are neither duplicates nor NULL.

    Can you give some hints about the probable reason, or a solution for this situation?

    Thanks and regards,

    Debanjan

    You have not answered my question - please re-read my initial post.

    Please post the full text of the 'peculiar error'.

    Are the same rows rejected each time?

    There will be a reason - there always is. But sometimes that reason can be very difficult to find.

    If you haven't even tried to resolve your issue, please don't expect the hard-working volunteers here to waste their time providing links to answers which you could easily have found yourself.

  • There is always a reason. What is the error message?

    I usually see something like "violated integrity constraints", since we do not allow NULLs and sometimes one attempts to slip in.

    Compare the table's column properties against a data viewer positioned right before the insert.

  • We execute this package through SQL Server Agent. The rejected rows are redirected to a rejection table, so we cannot see a proper rejection message. When we re-execute the package in debug mode, the rejected records are inserted.

    Can you give some suggestions on how to track down or resolve this issue?

  • debanjan.ray (7/20/2012)


    Can you give some suggestion how to track or how to resolve this issue??

    Yes. Answer Phil's and herladygeekedness' questions. What is the peculiar error you were talking about?

    Explain in excruciating detail everything that you do inside the package.

    Is the source data the same on the server as in the development machine?

    Need an answer? No, you need a question
    My blog at https://sqlkover.com.
    MCSE Business Intelligence - Microsoft Data Platform MVP

  • By 'peculiar error' I mean that valid records are getting rejected and redirected to the rejection table.

    In the package we fetch data from a different database on the same server, replace the NULL values with default values, do a lookup against the destination table, and then insert the records into the destination (a rough T-SQL equivalent of the flow is sketched below).

    The data in the development environment is the same as on the server.

    I hope this explains the process.
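    For readers trying to picture that data flow, here is a minimal T-SQL sketch of the same logic; the database, table, and column names, the default value, and the join key are assumptions for illustration, not the actual package.

    ```sql
    -- Rough set-based equivalent of the described data flow (all names are hypothetical).
    -- 1. Read from the source database on the same server.
    -- 2. Replace NULLs with default values.
    -- 3. Lookup: keep only keys that do not already exist in the destination.
    -- 4. Insert the remaining rows.
    INSERT INTO TargetDb.dbo.Destination (BusinessKey, Amount)
    SELECT  s.BusinessKey,
            ISNULL(s.Amount, 0)                    -- NULL replaced with a default
    FROM    SourceDb.dbo.SourceTable AS s
    WHERE   NOT EXISTS (SELECT 1
                        FROM   TargetDb.dbo.Destination AS d
                        WHERE  d.BusinessKey = s.BusinessKey);   -- the "lookup" step
    ```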

  • Since you are using a Lookup, are you sure the rejected rows are inserted on the second run, or are they discarded by the Lookup component?

    Need an answer? No, you need a question
    My blog at https://sqlkover.com.
    MCSE Business Intelligence - Microsoft Data Platform MVP

  • No, the rows are not discarded by the Lookup, because this happened for the initial load as well. If the rows were discarded by the Lookup, they would not get inserted the second time when we run the package in debug mode.

  • Are you sure the package on the server is the same version as the one used in debug mode? Check the build numbers.

    Need an answer? No, you need a question
    My blog at https://sqlkover.com.
    MCSE Business Intelligence - Microsoft Data Platform MVP

  • Yes, both packages are the same version.

  • To which component is the rejection table connected?

    Need an answer? No, you need a question
    My blog at https://sqlkover.com.
    MCSE Business Intelligence - Microsoft Data Platform MVP

  • Kindly check the destination. The following may be the cause.

    Right-click the OLE DB Destination and open the properties window. Find AccessMode in the properties; you will see that its value is 'OpenRowset Using FastLoad'. With fast load, all of the input records are inserted as one batch, so a single bad row can cause the whole batch, including perfectly valid rows, to fail.

    Change AccessMode to 'OpenRowset'. This option forces the OLE DB Destination to insert data row by row, so only the offending rows fail (see the sketch at the end of this post).

    Note that this is not a good option when you insert a large number of records, because a fast-load insert performs much better; the point here is only how to handle failed rows in an insert.

    A detailed explanation is at http://www.experts-exchange.com/articles/3422/How-to-handle-Failed-Rows-in-a-Data-Flow.html
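    To see why fast load can reject valid rows together with a bad one, here is a rough T-SQL analogy; the table and key values are made up for the example, and an SSIS fast-load commit behaves roughly like the multi-row insert below.

    ```sql
    -- Hypothetical example: one duplicate key in a batch vs. row-by-row inserts.
    CREATE TABLE dbo.Destination (Id INT NOT NULL PRIMARY KEY);
    INSERT INTO dbo.Destination (Id) VALUES (1);          -- row already present

    -- Fast load behaves like a single set-based insert: the duplicate (1) makes the
    -- whole statement fail, so the perfectly valid rows 2 and 3 are rejected too.
    INSERT INTO dbo.Destination (Id) VALUES (2), (3), (1);

    -- Row-by-row (AccessMode = OpenRowset) behaves like separate statements:
    -- only the offending insert fails, and rows 2 and 3 go in.
    INSERT INTO dbo.Destination (Id) VALUES (2);
    INSERT INTO dbo.Destination (Id) VALUES (3);
    INSERT INTO dbo.Destination (Id) VALUES (1);          -- only this one errors
    ```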

  • When you divert records on error (with a red linking line between transformations), the last two columns available on the error output are ErrorCode and ErrorColumn. You can Google the error code for its meaning, or try MSDN. Then open the Advanced Editor on the component the red arrow starts from, go to the input and output properties, and look through each of the external and output (or input) columns to find the column ID that matches the reported ErrorColumn.
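    When the package runs unattended under SQL Server Agent, one option is to land those two columns in the rejection table alongside the redirected data, so the reason survives the run. A minimal sketch, assuming hypothetical table and column names:

    ```sql
    -- Sketch of a rejection table that keeps the SSIS ErrorCode / ErrorColumn values
    -- next to the redirected data (all names here are assumptions for illustration).
    CREATE TABLE dbo.LoadRejections
    (
        RejectedAt    DATETIME2      NOT NULL DEFAULT SYSDATETIME(),
        ErrorCode     INT            NULL,    -- mapped from the error output's ErrorCode
        ErrorColumn   INT            NULL,    -- mapped from the error output's ErrorColumn
        BusinessKey   INT            NULL,    -- plus whichever source columns you redirect
        Payload       NVARCHAR(4000) NULL
    );

    -- After a failed Agent run, summarise what actually went wrong and on which column.
    SELECT   ErrorCode, ErrorColumn, COUNT(*) AS RejectedRows
    FROM     dbo.LoadRejections
    GROUP BY ErrorCode, ErrorColumn
    ORDER BY RejectedRows DESC;
    ```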
