Cube processing lock conflicts SSAS 2008 R2

  • We have a cube that is updated and processed hourly. We started seeing our job die due to "locking conflicts" during the PROCESS step. A little research has taught me that "locking conflicts" is SSAS's code phrase for a deadlock, and that my job was chosen as the victim.
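
    For context, the hourly job just issues a plain Process command against the cube. A minimal XMLA sketch is below; the DatabaseID/CubeID values and the ProcessFull type are placeholders, not our actual object names:

        <Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
          <Process>
            <Object>
              <DatabaseID>OurOlapDb</DatabaseID>
              <CubeID>OurCube</CubeID>
            </Object>
            <Type>ProcessFull</Type>
          </Process>
        </Batch>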

    According to the documentation, a cube must take an exclusive lock at the very end of the processing step in order to delete the old files and replace them with the freshly processed ones. Essentially this is a Windows file operation, but the lock manager is involved because it must lock a server-level file called master.vmp.

    This particular server is dedicated to SSRS and SSAS, so there are few users anyway, and the job is consistently failing at 5:30 AM when little to no user activity is occurring. Normally, processing this cube takes only about 2 minutes, so I'm a little bewildered as to why it's still running 30 minutes later when it has NO other competition in this time frame.

    We do know that another cube build starts 30 minutes after this one, and we have captured the deadlock statistics in Profiler: the trace shows that this later build is what is causing my job to fail.

    Our CommitTimeout value is 0 and our ForceCommitTimeout value is 30000. According to everything I've read, that second cube should wait indefinitely for my job to finish, since CommitTimeout = 0 essentially means wait until the lock is released. (Right?)
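
    For reference, both of these are server-wide properties stored in msmdsrv.ini; the fragment below is how I understand our config reads (ForceCommitTimeout is in milliseconds, so 30000 = 30 seconds):

        <CommitTimeout>0</CommitTimeout>
        <ForceCommitTimeout>30000</ForceCommitTimeout>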

    Does anyone have any idea why this is NOT happening? What this is telling me is that I can't process more than one cube at a time without deadlocks occurring.
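
    In case it matters, the fallback I'm considering is chaining the two builds into a single XMLA batch with Transaction="false", so each Process runs sequentially and commits on its own, and the commits can never collide. A rough sketch, again with placeholder IDs:

        <Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine" Transaction="false">
          <Process>
            <Object>
              <DatabaseID>OurOlapDb</DatabaseID>
              <CubeID>FirstCube</CubeID>
            </Object>
            <Type>ProcessFull</Type>
          </Process>
          <Process>
            <Object>
              <DatabaseID>OurOlapDb</DatabaseID>
              <CubeID>SecondCube</CubeID>
            </Object>
            <Type>ProcessFull</Type>
          </Process>
        </Batch>

    That obviously only works if both jobs can be merged onto one schedule, which I'd rather avoid.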

    "Beliefs" get in the way of learning.
