• "It depends"

    This is specific to your environment: your databases, their size, the volume of data being moved around, and so on.

    At our shop, I typically look for anything over 250,000 reads. I look closely at the procedure to figure out what's driving the reads and whether it can be optimized (typically it's a bad JOIN, implicit conversions, key lookups, etc.), but sometimes it's just a giant query an application needs that returns 800+ columns, and there's no way to really optimize it. Then again, some queries with high reads run rather quickly (unusual, but it does happen), and vice versa. The ones that also have a long duration are typically the ones I start digging into.
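
    For hunting those down, here's a minimal sketch against sys.dm_exec_query_stats (SQL Server 2008+). The 250,000 figure and the TOP (25) are just the example numbers from above, not anything official, so adjust them for your own shop:

        -- Top procedures/statements by average logical reads since the plan was cached.
        -- proc_name comes back NULL for ad hoc SQL; total_elapsed_time is in microseconds.
        SELECT TOP (25)
               OBJECT_NAME(qt.objectid, qt.dbid)               AS proc_name,
               qs.execution_count,
               qs.total_logical_reads,
               qs.total_logical_reads / qs.execution_count     AS avg_logical_reads,
               qs.total_elapsed_time  / qs.execution_count     AS avg_elapsed_time_us
        FROM   sys.dm_exec_query_stats AS qs
        CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS qt
        WHERE  qs.total_logical_reads / qs.execution_count > 250000   -- example threshold only
        ORDER BY avg_logical_reads DESC;

    Keep in mind these numbers reset when a plan leaves the cache or the instance restarts, so it's a point-in-time view, not history.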

    We have some queries that are as high as 550,000,000 reads :crazy:

    I recommend you keep baselining over time to really determine what's "bad" for your server, then create a threshold that suits your needs.
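
    One rough way to do that baselining (just a sketch, with a made-up table name and columns) is to snapshot the stats DMV into your own table on a schedule, say from an Agent job, and trend the deltas over time:

        -- Hypothetical baseline table; adjust name, columns, and retention to taste.
        IF OBJECT_ID('dbo.QueryStatsBaseline') IS NULL
            CREATE TABLE dbo.QueryStatsBaseline
            (
                capture_time        datetime2  NOT NULL DEFAULT SYSDATETIME(),
                query_hash          binary(8)  NOT NULL,
                execution_count     bigint     NOT NULL,
                total_logical_reads bigint     NOT NULL,
                total_elapsed_time  bigint     NOT NULL
            );

        -- Snapshot current cumulative stats, rolled up by query_hash.
        INSERT INTO dbo.QueryStatsBaseline
               (query_hash, execution_count, total_logical_reads, total_elapsed_time)
        SELECT qs.query_hash,
               SUM(qs.execution_count),
               SUM(qs.total_logical_reads),
               SUM(qs.total_elapsed_time)
        FROM   sys.dm_exec_query_stats AS qs
        GROUP BY qs.query_hash;

    Comparing captures against each other shows you what "normal" reads look like on your box, and that's where your threshold should come from.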
