• MyDoggieJessie (3/6/2014)


    "It depends"

    This would be specific to your environment, your databases, size, volume of data being moved around, etc.

    At our shop, I typically look for anything over 250,000 reads. I look closely at the procedure to determine what may be causing the reads and whether it can be optimized (typically it's a bad JOIN, implicit conversions, key lookups, etc.), but sometimes it's just a giant query needed by an application that returns 800+ columns, and there's no way to really optimize it. Then again, some queries with high reads run rather quickly (unusual, but it happens), and vice versa. The ones that also have a long duration are typically the ones I start looking deeper at.

    We have some queries that are as high as 550,000,000 reads :crazy:

    I recommend you keep baselining over time to really determine what's "bad" for your server, and then create a threshold that suits your needs.
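
    Not from the post above, but a minimal sketch of how that kind of check can be run against the plan cache (assuming SQL Server 2008+; the 250,000 figure is just the threshold quoted above, and sys.dm_exec_query_stats only reflects plans still in cache):

    -- Top cached statements by total logical reads, filtered to the threshold above
    SELECT TOP (20)
        DB_NAME(st.dbid)                            AS database_name,
        OBJECT_NAME(st.objectid, st.dbid)           AS object_name,
        qs.execution_count,
        qs.total_logical_reads,
        qs.total_logical_reads / qs.execution_count AS avg_logical_reads,
        qs.total_elapsed_time / qs.execution_count  AS avg_elapsed_time_us,
        SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
            ((CASE qs.statement_end_offset
                 WHEN -1 THEN DATALENGTH(st.text)
                 ELSE qs.statement_end_offset
              END - qs.statement_start_offset) / 2) + 1) AS statement_text
    FROM sys.dm_exec_query_stats AS qs
    CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
    WHERE qs.total_logical_reads / qs.execution_count > 250000   -- threshold mentioned above
    ORDER BY qs.total_logical_reads DESC;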

    Thank you 🙂 So if it depends on those various factors, I guess baselining over time is the best way to do it, like you say. You mention a "ballpark" IO figure per query ...

    Do you have a ballpark you'd start at for total LIO per server, or a level of percentage increase where you'd start to think "ouchie"?

    many thanks
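
    For the server-level question above, a rough sketch of one way to baseline total logical reads (an assumed approach, not from the thread): sample the cumulative "Page lookups/sec" Buffer Manager counter twice and log the rate on a schedule. The object name below assumes a default instance, and the 10-second window is arbitrary.

    DECLARE @first BIGINT, @second BIGINT;

    -- First sample of the cumulative counter
    SELECT @first = cntr_value
    FROM sys.dm_os_performance_counters
    WHERE counter_name = 'Page lookups/sec'
      AND object_name LIKE '%Buffer Manager%';

    WAITFOR DELAY '00:00:10';   -- 10-second sample window

    -- Second sample
    SELECT @second = cntr_value
    FROM sys.dm_os_performance_counters
    WHERE counter_name = 'Page lookups/sec'
      AND object_name LIKE '%Buffer Manager%';

    -- Server-wide logical reads per second; log this regularly to build the baseline
    SELECT (@second - @first) / 10 AS logical_reads_per_sec;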