Problem with DeadLock Transaction

  • Hi,

I have a procedure that does a lot of processing from an input file. It loads the file, processes it, and if an error occurs, writes the error to disk and rolls back.

The transaction is not created inside SQL Server; it is managed directly by the application, like everything else done over the SQL connection. So I have to build my code to work inside a transaction that is managed by the application: if I return an error, the application rolls back the transaction, and if no error occurs, the transaction is committed.

    So here is the pseudo code of the procedure:

Create proc Import

AS

BEGIN TRY

    -- Import the file into working tables

    -- Process the file (insert or update into the real tables)

    -- An ERROR is raised during processing (for testing purposes), so the CATCH fires

END TRY

BEGIN CATCH

    -- Write the error to a log table, along with the "Warnings" that were already
    -- written there during processing. So I cannot write only THIS error to the log
    -- table; I also need the warnings generated while processing the file.

    -- Then write the error log to disk. (This is where the deadlock happens.)
    -- bcp is a command-line utility, so it runs through xp_cmdshell:

    EXEC xp_cmdshell 'bcp "Select * from GICSPF.dbo.GICSPFPCUpdateError with (READUNCOMMITTED)" queryout c:\Send\SPF\UpdateErrorLog.log -c -T'

    -- Then re-raise the error (so the application can roll back this transaction)

    RAISERROR (N'%s',         -- Message text
               21,            -- Severity
               1,             -- State
               @ErrorMessage) WITH LOG

END CATCH

    ...

    So, here is my explanation of the problem: When I execute this procedure within a transaction:

    1. An error occurs in the processing of the file.

2. The CATCH block traps the error and inserts it into the log table.

3. When I try to use the BCP utility (I've read a lot on this), it opens a new connection and tries to access the log table, which is currently locked by my transaction. My transaction then waits for BCP to complete before releasing the lock, and BCP waits for the lock to be released. So I've created a deadlock.
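A minimal sketch of the blocking pattern (table name, file path, and bcp arguments are illustrative, not the poster's actual objects):

```sql
-- Session 1: the application-managed transaction
BEGIN TRAN;
INSERT INTO dbo.ErrorLog (Msg) VALUES (N'Processing failed');  -- takes an X lock on the row/page

-- Still inside Session 1, shell out to bcp. bcp opens a SECOND connection:
EXEC xp_cmdshell 'bcp "SELECT * FROM MyDb.dbo.ErrorLog" queryout c:\temp\err.log -c -T';
-- Session 2 (bcp) blocks behind the lock held by Session 1.
-- Session 1 cannot continue, because it is waiting for xp_cmdshell to return.
-- Neither side can ever proceed: a self-inflicted hang that SQL Server's
-- deadlock detector cannot break, since it sees only ordinary blocking.
```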

1. How can I get around this issue, given that I cannot control the transaction managed by the application?

2. Is there another way to write query results to disk that works for a complete table, like a Select * from?

3. I've tried reading the table even though it was locked, using WITH (READUNCOMMITTED) and WITH (NOLOCK), which appear to be the same thing in the Microsoft documentation. Is there something I'm missing that would let me select the data from the table?

Any answer will be greatly appreciated. I don't know how to fix this issue, and I have no choice but to make it work.

    Thanks,

    J-F


  • Ah yes, the notorious "Undetectable BCP Deadlock". The usual fix involves restructuring the transaction so that you can do your logging outside of the transaction boundaries. If you cannot do that then you do not have many options left.

Logging through Service Broker might work, but most people do not want to bite off that much initial effort. The only other thing I can think of that might work is to put BEGIN WORK .. COMMIT WORK around your "Write to Error Log table" step. I am really uncertain whether that will work, so just try it and let us know. 🙂

-- RBarryYoung, (302)375-0451
blog: MovingSQL.com, Twitter: @RBarryYoung
Proactive Performance Solutions, Inc.
"Performance is our middle name."

  • Oops, I got that wrong. Try it like this:

    BEGIN TRAN Logging

    ...(write to log table)

    COMMIT TRAN Logging

Hmm, looking at BOL, I am really doubtful that this will work...
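The doubt is well founded: in SQL Server a nested BEGIN TRAN only increments @@TRANCOUNT, and the inner COMMIT just decrements it; locks are released only when the outermost transaction commits. A quick sketch (table name illustrative):

```sql
BEGIN TRAN Outer;                 -- @@TRANCOUNT = 1
    BEGIN TRAN Logging;           -- @@TRANCOUNT = 2: NOT an independent transaction
    INSERT INTO dbo.ErrorLog (Msg) VALUES (N'oops');
    COMMIT TRAN Logging;          -- @@TRANCOUNT back to 1; nothing is actually committed,
                                  -- and the lock on dbo.ErrorLog is still held
COMMIT TRAN Outer;                -- @@TRANCOUNT = 0; only now are the locks released
```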

-- RBarryYoung

Thanks for the reply. Unfortunately, the idea does not work; the query still freezes when I add a nested transaction inside the current one. The outer transaction blocks access to the table regardless.

    Anything else you guys can think of?

    Thanks in advance,

    J-F


  • Use OPENROWSET and Bulk export the Logging data as XML?

-- RBarryYoung

Hmm. Well, now I am not sure that is possible either. BOL implies that you can use OPENROWSET like that, but every reference it makes to exporting XML just points back to BCP.

This is annoying; why aren't there any output facilities built into the database engine?

-- RBarryYoung

That would be interesting, since BULK or OPENROWSET would use the same connection, and therefore the same transaction, and could access the uncommitted rows.

Can you give me a link or a piece of code showing how to export a SELECT * to a file using bulk export?

    Thanks.

    J-F


  • At this point, the only two things that I am fairly certain would work are:

1. A SQLCLR proc that receives an XML parameter and writes it to a file (85% sure this will work), and...

2. Sending the logging info to a Service Broker queue that receives it and then BCPs it out (75% sure this will work).

Other than that, you will have to do it outside the prior transaction. That means either rolling back the transaction where you are and then writing out the data (you can then start another transaction to fool the caller's transaction management), or passing the logging info back and letting the caller handle it.
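For option 1, the T-SQL side would look roughly like this (assembly, class, and procedure names are illustrative; the C#/VB body that actually writes the file is not shown):

```sql
-- The assembly needs EXTERNAL_ACCESS to write to the file system (SAFE is not enough)
CREATE ASSEMBLY LogWriter
FROM 'c:\Deploy\LogWriter.dll'
WITH PERMISSION_SET = EXTERNAL_ACCESS;
GO
CREATE PROCEDURE dbo.WriteLogFile
    @LogXml XML,
    @Path   NVARCHAR(260)
AS EXTERNAL NAME LogWriter.[LogWriter.StoredProcedures].WriteLogFile;
GO
-- In the CATCH block, no second connection is needed, so nothing blocks:
-- DECLARE @x XML = (SELECT * FROM dbo.GICSPFPCUpdateError FOR XML AUTO);
-- EXEC dbo.WriteLogFile @x, N'c:\Send\SPF\UpdateErrorLog.log';
```

The key point is that the file write happens on the same connection, inside the same transaction, so it can see the uncommitted log rows; and since file I/O is not transactional, the file survives the subsequent rollback.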

-- RBarryYoung

  • None of this would be a problem if you first inserted the data into a staging table and validated it.
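A rough sketch of that approach (table and column names are mine, purely for illustration):

```sql
-- 1. Bulk load the raw file into a staging table, outside any long transaction.
-- 2. Validate the staging data and record problems, so nothing needs rolling back later:
INSERT INTO dbo.ImportErrors (RowId, Problem)
SELECT s.RowId, N'Missing product code'
FROM dbo.Staging AS s
WHERE s.ProductCode IS NULL;

-- 3. Only if the staging data is clean, apply it to the real tables in one short transaction:
IF NOT EXISTS (SELECT 1 FROM dbo.ImportErrors)
BEGIN
    BEGIN TRAN;
    INSERT INTO dbo.RealTable (ProductCode, Qty)
    SELECT ProductCode, Qty FROM dbo.Staging;
    COMMIT TRAN;
END
-- Otherwise dbo.ImportErrors can be bcp'd out at leisure: no locks, no rollback needed.
```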

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

It is actually a requirement to try importing the data as-is, because we want to know whether errors occurred. The update will run in 100 stores, and we need to know if there were errors so we can fix the updates we sent and send them again. That is why we need to roll back the transaction and write a file that will be sent back here, telling us to send the update again.

I'm looking into CLR integration right now; I did not know how it worked at all. I only knew you could embed VB code in a procedure, but that's it.

Anyway, I'm looking into the CLR, as it looks really interesting and it seems I can get something working with it. Thanks a lot, guys!

    J-F


Glad we could help. FYI, for CLR you will probably need the EXTERNAL_ACCESS permission set; SAFE is not enough for writing to files.

-- RBarryYoung

  • jfbergeron (9/4/2008)


It is actually a requirement to try importing the data as-is, because we want to know whether errors occurred. The update will run in 100 stores, and we need to know if there were errors so we can fix the updates we sent and send them again. That is why we need to roll back the transaction and write a file that will be sent back here, telling us to send the update again.

I'm looking into CLR integration right now; I did not know how it worked at all. I only knew you could embed VB code in a procedure, but that's it.

Anyway, I'm looking into the CLR, as it looks really interesting and it seems I can get something working with it. Thanks a lot, guys!

    J-F

My suggestion of using a staging table still works for that... and, if there are any failures, not only do you NOT have to do a rollback, you also have a handy table from which to create an errata sheet.

--Jeff Moden
