High {no} availability and UPDLOCK

  • Here's the big picture. I'm working as an outside resource for a tire distributor whose ERP is MS Dynamics NAV 2009 SP1. My job is to build and maintain a transactional website in ASP.NET (C#, VS 2008) to take orders from the clients.

    We're on SQL Server 2008 Standard 64-bit, on Windows Server 2008 64-bit.

    When NAV needs to run a posting sequence, such as posting an order, it starts by locking all the required tables with UPDLOCK hints.

    Under normal circumstances it works OK: the posting is generally done in under 1 second, even though about 400 queries are required for the process :sick:. So in the absolute worst case my queries take 2 seconds to complete rather than 250 ms.

    Now, when the other consulting company comes in to work on and debug the application, the UPDLOCKs remain in place for minutes at a time while they manually step through the code (IN PRODUCTION). This causes my web application to constantly time out because it can't access the data. I've seen as many as 150 timeouts in 60 minutes.

    I've gone as far as kicking their *** out the door (the server's door), but since I have no authority on the matter they keep coming back and causing more damage.

    I have a somewhat limited understanding of locking and transaction isolation levels. I've read the articles on SSC, but I'm still not convinced there's a way around the problem.

    Is there any way, preferably other than NOLOCK, to work around the problem?

    We're concerned about dirty reads, but at the same time a dirty read is better than no read at all for this application (assuming short windows of such reads... like 1-2 hours a month). I was thinking about switching to NOLOCK when I get a timeout so that I at least return some data to the client, which I can revalidate without NOLOCK at the next postback (a rough sketch of that fallback is below).
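
    Just to make that fallback concrete, here's the rough T-SQL I have in mind. It's untested, and the table, column and variable names are made up rather than the real NAV objects:

    DECLARE @ItemNo nvarchar(20) = 'TIRE-001';   -- hypothetical lookup value

    SET LOCK_TIMEOUT 500;   -- fail fast (in ms) instead of waiting on NAV's UPDLOCKs

    BEGIN TRY
        SELECT [No_], [Description], [Unit Price]
        FROM dbo.[Item]                          -- placeholder for the real NAV table
        WHERE [No_] = @ItemNo;
    END TRY
    BEGIN CATCH
        IF ERROR_NUMBER() = 1222                 -- "lock request time out period exceeded"
        BEGIN
            -- Last resort: dirty read, revalidated without NOLOCK at the next postback.
            SELECT [No_], [Description], [Unit Price]
            FROM dbo.[Item] WITH (NOLOCK)
            WHERE [No_] = @ItemNo;
        END
        ELSE
        BEGIN
            DECLARE @msg nvarchar(2048) = ERROR_MESSAGE();
            RAISERROR(@msg, 16, 1);              -- re-raise anything unexpected
        END
    END CATCH;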

  • My question is why are they debugging in production? If there is a problem, they should have a test environment in which to test and debug issues so they don't affect production processing.

  • You have stated that you have SQL 2008 on Windows 2008. Here is something for you to think about that may solve the issue: row-level versioning. If you enable it, it doesn't matter whether other people are updating or reading the table; your queries will simply read the last committed value (see the sketch below for how to turn it on).

    -Roy
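
    Roughly, row versioning is switched on at the database level with ALTER DATABASE. A minimal sketch (the database name is a placeholder; try it on a test copy first):

    -- Option A: allow SNAPSHOT isolation; sessions must request it explicitly.
    ALTER DATABASE [NavDb] SET ALLOW_SNAPSHOT_ISOLATION ON;

    -- Option B: make READ COMMITTED use row versions by default for every connection.
    -- This changes behaviour for all applications sharing the database (including NAV),
    -- and needs exclusive access to complete, so treat it with care:
    -- ALTER DATABASE [NavDb] SET READ_COMMITTED_SNAPSHOT ON WITH ROLLBACK IMMEDIATE;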

  • One of the snapshot isolations?

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • Lynn Pettis (5/10/2010)


    My question is why are they debugging in production? If there is a problem, they should have a test environment in which to test and debug issues so they don't affect production processing.

    I've been over that with them and my client 3 times... showed 'em how to do backup and restore to move the problem there.

    Still they don't get it. The client doesn't seem to care [much] even if they lose $10,000 in orders while this goes on (for now)... rush time is in about 5 months, when the first snowfall hits... then the cost is about $100K an hour.

  • Personally, I would disable their login until they fully understand what they're doing (see the sketch below).

    Consider it could take forever...

    -- Gianluca Sartori
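
    For reference, a one-line sketch (the login name is a placeholder; disabling blocks new connections but doesn't kill sessions that are already open):

    ALTER LOGIN [ConsultantLogin] DISABLE;
    -- and, once they've learned to love the test server:
    -- ALTER LOGIN [ConsultantLogin] ENABLE;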

  • Lynn Pettis (5/10/2010)


    My question is why are they debugging in production? If there is a problem, they should have a test environment in which to test and debug issues so they don't affect production processing.

    Any link to an article I can read?

    I meant to quote Roy, sorry for the mixup.

  • GilaMonster (5/10/2010)


    One of the snapshot isolations?

    [set beginner lvl on]

    What?

    [set beginner lvl off]

    I've read about transaction isolation levels, but I don't know what you're asking.

  • Gianluca Sartori (5/10/2010)


    Personally I would disable their login until they fully understand what they're doing.

    Consider it could take forever...

    I did shut them out (after taking screenshots of the damage and proof of who the culprit was) while they were killing the site (and the ERP app too). They got pissed and went to my boss. He asked me to reinstate the login so they could continue the work when the site is "less" busy.

    I've even pointed out that if someone did the same thing intentionally he could be sent to jail for piracy among other things... and never work again in that line of work.

  • Ninja's_RGR'us (5/10/2010)


    GilaMonster (5/10/2010)


    One of the snapshot isolations?

    [set beginner lvl on]

    What?

    [set beginner lvl off]

    I've read about transaction isolations but I don't know what you're asking.

    Snapshot isolation or read committed snapshot isolation. They're optimistic concurrency levels: readers don't block writers and vice versa, and SELECT queries don't take or require locks.

    Downside is that it impacts tempDB fairly heavily.

    It has to be explicitly enabled on the database; once that's done, you can request snapshot isolation like any other isolation level:

    SET TRANSACTION ISOLATION LEVEL SNAPSHOT

    Within that, you can query data without worrying about getting blocked. Your queries will use the appropriate row versions from TempDB.

    If read committed snapshot isolation is enabled (also at the database level), the database's default isolation level effectively becomes read committed snapshot. Much the same as above. I don't recommend this if you're sharing the DB with other apps, but it should (test carefully) be OK to enable snapshot isolation and then explicitly request it in your procedures (a short sketch follows below).

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
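
    To make that concrete, a minimal sketch of requesting snapshot isolation from the web app's side. It assumes ALLOW_SNAPSHOT_ISOLATION has already been enabled on the database, and the table/column names are placeholders:

    SET TRANSACTION ISOLATION LEVEL SNAPSHOT;

    BEGIN TRANSACTION;
        -- Reads the last committed row versions from the version store in TempDB;
        -- doesn't block on NAV's UPDLOCKs and doesn't take shared locks itself.
        SELECT [No_], [Description], [Unit Price]
        FROM dbo.[Item]
        WHERE [No_] = 'TIRE-001';
    COMMIT TRANSACTION;
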
  • Thanks Gail. I'll read up on the subject and see if I can set the isolation level for my code.

    Do you have any articles to recommend? Written one yourself perhaps?

    What impact can I expect on tempdb with a server that does about 1 M queries per hour? The ERP's code seems to do as little as possible in each query, so a lot of queries are required to do very simple tasks.

    We have only 1 DB on that server and it's around 2 GB after 3 months of use (on the low end of sales).

    98% of that data is from transaction ledgers, sales, etc.

    The server has 16 GB of RAM, 14 GB of it for SQL Server... current average CPU use is around 2-3% (a version-store check is sketched below).
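
    Here's the sort of version-store check I mean, using the counters SQL Server already exposes (on a named instance the object_name prefix differs, so the LIKE is deliberately loose):

    SELECT counter_name, cntr_value
    FROM sys.dm_os_performance_counters
    WHERE object_name LIKE '%Transactions%'
      AND counter_name IN (N'Version Store Size (KB)',
                           N'Version Generation rate (KB/s)',
                           N'Version Cleanup rate (KB/s)');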

  • Ninja's_RGR'us (5/10/2010)


    Do you have any articles to recommend? Written one yourself perhaps?

    No and no. Not an area that I've dived deeply into yet.

    What impact can I expect on tempdb with a server that does about 1 M queries per hour? The ERP's code seems to do as little as possible in each query, so a lot of queries are required to do very simple tasks.

    It depends.

    Sorry, that's the only answer I can give. Depends on the size of data that the queries affect, the length of transactions and probably the direction the pigeons are flying.

    You'll need to do some careful and comprehensive testing.

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • GilaMonster (5/10/2010)


    Ninja's_RGR'us (5/10/2010)


    Do you have any articles to recommend? Written one yourself perhaps?

    No and no. Not an area that I've dived deeply into yet.

    What impact can I expect on tempdb with a server that does about 1 M queries per hour? The ERP's code seems to do as little as possible in each query, so a lot of queries are required to do very simple tasks.

    It depends.

    Sorry, that's the only answer I can give. Depends on the size of data that the queries affect, the length of transactions and probably the direction the pigeons are flying.

    You'll need to do some careful and comprehensive testing.

    The pigeons are not flying today (snowing). :w00t:

    I don't know how relevant this is, but on average the t-log backups over the last month were about 500 MB. So unless the tempdb work is a thousandfold that, the server should be able to handle it... that being said, how can I efficiently test this?

    Is it possible to take a backup from production, then start a trace on prod, and then a couple of hours later restore to the test server and replay the trace there? (A rough backup/restore sketch is below.)
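
    The backup/restore half is plain T-SQL; a rough sketch (database name, logical file names and paths are all made up). The trace capture and replay side would be done with Profiler or a server-side trace rather than script:

    -- On production (COPY_ONLY so the existing backup chain isn't disturbed):
    BACKUP DATABASE [NavDb]
    TO DISK = N'\\backupshare\NavDb_copy.bak'
    WITH COPY_ONLY, INIT;

    -- On the test server:
    RESTORE DATABASE [NavDb_Test]
    FROM DISK = N'\\backupshare\NavDb_copy.bak'
    WITH MOVE N'NavDb_Data' TO N'D:\Data\NavDb_Test.mdf',
         MOVE N'NavDb_Log'  TO N'L:\Logs\NavDb_Test.ldf',
         REPLACE;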

  • Roy Ernest (5/10/2010)


    You have stated that you have SQL 2008 on Windows 2008. Here is something for you to think about that may solve the issue: row-level versioning. If you enable it, it doesn't matter whether other people are updating or reading the table; your queries will simply read the last committed value.

    Any links to an article (assuming this is different from what Gail is talking about)?

  • Actually, this is what Gail was talking about. Row-level versioning is set up using the snapshot isolation level (you can check what's currently enabled with the query below).

    Anyway, here is an article that I wrote here, but I can't say it's the best one... 🙂

    -Roy
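
    A quick way to see what's currently enabled on a database (the name below is a placeholder):

    SELECT name,
           snapshot_isolation_state_desc,
           is_read_committed_snapshot_on
    FROM sys.databases
    WHERE name = N'NavDb';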
