DTS commit error

  • Hi all. I've got a really strange error that I'm having trouble isolating.

    I have a DTS package that is run by another DTS package, which in turn runs as a scheduled job. The first DTS package inserts a few rows into a table. Then it calls the second package, which inserts even more rows into the same table. This process works just fine every day that it runs except Mondays. The last few Mondays, I have been getting an error when the second package runs. Here is the error in my log file:

    Step Error Source: Microsoft Data Transformation Services (DTS) Data Pump

    Step Error Description:The number of failing rows exceeds the maximum specified.

    (Microsoft OLE DB Provider for SQL Server (80004005): Warning: Fatal error 3624 occurred at May 23 2005  3:02AM)

    (Microsoft OLE DB Provider for SQL Server (80004005): Location:  pageref.cpp:909

    Expression:  (xdes != NULL) && !xdes->IsReadOnly ()

    SPID:   64

    Process ID:  600)

    Step Error code: 8004206A

    Step Error Help File:sqldts80.hlp

    Step Error Help Context ID:0

    Step Execution Started: 5/23/2005 3:05:04 AM

    Step Execution Completed: 5/23/2005 3:05:51 AM

    Total Step Execution Time: 47.14 seconds

    Progress count in Step: 24000

    If you notice, it says the progress count in the step is 24000. The Insert Batch Size on the data pump task is set to 25000. I have tried to run this manually, and it always fails when it hits the 25,000th row. And if I change that batch size to a different value, it fails on that row instead. So it is obviously something to do with the package trying to commit the data to the database.
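    Just to illustrate why changing the batch size moves the failing row: here's a rough sketch in plain Python of a batched data pump where the commit itself fails. This is not DTS internals — `pump`, `CommitError`, and the row counts are all made up for illustration. The point is that rows buffer up and only hit the server at each batch boundary, so a commit-time failure always lands on a multiple of the batch size, regardless of what's in the data.

    ```python
    class CommitError(Exception):
        """Stands in for a server-side failure during COMMIT (cf. fatal error 3624)."""

    def pump(rows, batch_size, commit_ok=True):
        """Minimal sketch of a batched pump: rows buffer up, and the buffer
        is committed every batch_size rows. If the commit fails, the step
        dies exactly at a batch boundary -- whatever batch_size is set to."""
        buffered = 0
        for n, _row in enumerate(rows, start=1):
            buffered += 1
            if buffered == batch_size:
                if not commit_ok:
                    raise CommitError(f"commit failed at row {n}")
                buffered = 0  # batch committed, start a fresh one
        return "ok"

    # Changing the batch size just moves the failing row to the new boundary:
    for size in (25000, 10000):
        try:
            pump(range(30000), size, commit_ok=False)
        except CommitError as e:
            print(e)
    # prints: commit failed at row 25000
    #         commit failed at row 10000
    ```

    That matches the symptom in the post: the failure tracks the configured batch size, not any particular data row, which points at the commit rather than the data being pumped.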

    Other packages run and write data to other tables in this database with no problem, so it has to be something specific to this table. I have checked the permissions on the table and the user, and none of them are set to read-only. I have run this package all by itself (independent of the first package) and it still fails. And then, of course, it will suddenly start working, and we'll have no problems when it runs Tuesday through Friday. But come Monday, we'll have the problem again.

    Anyone have any suggestions? Thanks in advance!

    Scott

     

  • As Bob Geldof put it: "Tell me why I don't like Mondays." Sorry, but could not resist that one.

    Anyhow, just a guess, but you may want to check that nothing else is happening on the server on Mondays — i.e. any maintenance jobs, log file truncation, etc.

    Another thing to look at is the actual data. Check to see if there are any fundamental differences between Monday and the rest of the week.


    ------------------------------
    The Users are always right - when I'm not wrong!

  • Amen to that!

    Thanks for the suggestions. I will check, but I don't understand why other jobs, which run both before and after this one, don't have any problems. They write to other tables, but I would think that if it were a maintenance job, it would affect those jobs too. But I will check their timing.

    And, unless I'm missing something, I don't think it's the data. I can run this package against our DEV server and it runs fine. So it's SOMETHING about the production server.

    I have looked at the table properties, and nothing shows it as read-only or anything like that. Or maybe that message is wrong and leading me astray?

    Thanks for the post.

    Scott
