Understanding and Improving Query Plans

  • girac127

    SSC-Addicted

    Points: 454

    Hi everyone.

    I have a process that is taking a long time to execute: 12 hours, down from 16 after adding an index and doing index maintenance.  There are about 180+ million records of financial info in this table (the last 4 years). These records cannot be archived just yet.

    Running an invoice process is taking longer and longer. There is no blocking, and CPU utilization is under 10%.  The SQL execution plan looks pretty clean, but I am noticing high costs on some operators.

    Can someone tell me how to read these plans and how to improve on them?

    Attached is one example.

    SQLExPl
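
    Operator costs in a graphical plan are only estimates, so it can also help to capture actual I/O and timing numbers for the slow statements. A generic diagnostic that works in any session (nothing here is specific to AX):

        -- Emit per-statement I/O and elapsed-time figures to the Messages tab.
        -- Run these in the same session as the slow statement(s).
        SET STATISTICS IO ON;
        SET STATISTICS TIME ON;

        -- ... run the invoicing statement(s) here ...

        SET STATISTICS IO OFF;
        SET STATISTICS TIME OFF;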
  • Jeffrey Williams

    SSC Guru

    Points: 88448

    This is almost certainly a coding problem - but to be sure, we would need to see the code and the execution plan: the actual execution plan, saved as a .sqlplan file.

    180 million rows seems like a lot - but it really isn't so large that a query should take 12+ hours.

    Jeffrey Williams
    Problems are opportunities brilliantly disguised as insurmountable obstacles.

    How to post questions to get better answers faster
    Managing Transaction Logs

  • girac127

    SSC-Addicted

    Points: 454

    This is a Microsoft Dynamics AX ERP system, and the invoicing process is run by custom code. There are roughly 80-100K transactions processed in each run, so it's not just one query.

    How can I save the .sqlPlan file?
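
    For the record: in SSMS, turn on Query > Include Actual Execution Plan (Ctrl+M) before running the statement, then right-click the plan pane and choose "Save Execution Plan As..." to produce a .sqlplan file. If the process has already run, a cached plan can often be fished out of the plan cache with a query along these lines (the LIKE filter below is a placeholder; substitute text from the actual query):

        -- Find cached plans whose text matches the invoicing query; the
        -- query_plan XML in the result can be saved as a .sqlplan file.
        -- '%InvoiceTable%' is a placeholder: replace it with text from your query.
        SELECT TOP (10)
               st.text,
               qs.total_elapsed_time,
               qp.query_plan
        FROM sys.dm_exec_query_stats AS qs
        CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
        CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS qp
        WHERE st.text LIKE N'%InvoiceTable%'
        ORDER BY qs.total_elapsed_time DESC;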

  • Mr. Brian Gale

    SSC-Insane

    Points: 22765

    If I am reading that execution plan correctly, the left-most operation is a cursor.  Cursors hurt performance because they are row-based operations. If you can remove that cursor, you will likely see a performance improvement.
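
    To illustrate the kind of rewrite that means (a sketch only; the table and column names below are invented, since the actual AX code has not been posted), a loop that posts one invoice per iteration can usually be collapsed into a single set-based statement:

        -- Row-by-row pattern (slow): one UPDATE per invoice.
        DECLARE @InvoiceId int;
        DECLARE invoice_cur CURSOR LOCAL FAST_FORWARD FOR
            SELECT InvoiceId FROM dbo.InvoiceWork WHERE Status = 'Open';
        OPEN invoice_cur;
        FETCH NEXT FROM invoice_cur INTO @InvoiceId;
        WHILE @@FETCH_STATUS = 0
        BEGIN
            UPDATE dbo.InvoiceWork
            SET Status = 'Posted', PostedDate = GETDATE()
            WHERE InvoiceId = @InvoiceId;
            FETCH NEXT FROM invoice_cur INTO @InvoiceId;
        END;
        CLOSE invoice_cur;
        DEALLOCATE invoice_cur;

        -- Set-based equivalent (fast): one statement touches all qualifying rows.
        UPDATE dbo.InvoiceWork
        SET Status = 'Posted', PostedDate = GETDATE()
        WHERE Status = 'Open';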

  • Jeffrey Williams

    SSC Guru

    Points: 88448

    I was not able to see the picture before - and since this is a cursor, it is definitely a coding problem.  If you can provide the code, someone will be able to help optimize it.

    Jeffrey Williams
    Problems are opportunities brilliantly disguised as insurmountable obstacles.

    How to post questions to get better answers faster
    Managing Transaction Logs

  • Greg Edwards-268690

    SSC-Insane

    Points: 20572

    Without the code it is just a guess, but reducing the 180 million rows to a smaller set before running through them will be key.

    There might be an index on the ledger that can be used; Date or Status come to mind. It might be filtering on the earliest relevant date that keeps the set small, or on an open status on the ledger.

    An EXISTS query is something I have used at times to tackle similar issues; a sketch follows at the end of this post.

    If you thought the indexes brought a big improvement, there may be a lot more still to come. And when they say 'row by agonizing row' about a cursor, you really can feel the pain.
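
    A sketch of that approach: stage just the ledger rows tied to the current batch before doing any per-invoice work, instead of walking the full 180M-row table. Every name here (LedgerTrans, InvoiceBatch, the date, @BatchId) is illustrative, not taken from the actual system:

        -- Hypothetical: narrow 180M ledger rows to the current run's rows first.
        DECLARE @BatchId int = 42;  -- placeholder batch identifier

        SELECT lt.*
        INTO #LedgerSubset
        FROM dbo.LedgerTrans AS lt
        WHERE lt.TransDate >= '20210101'          -- earliest date relevant to the run
          AND EXISTS (SELECT 1
                      FROM dbo.InvoiceBatch AS ib
                      WHERE ib.AccountNum = lt.AccountNum
                        AND ib.BatchId = @BatchId);  -- rows for this run only

        -- Subsequent processing then works off #LedgerSubset (thousands of rows)
        -- rather than the full ledger table.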

