I'm attempting to make sense of a performance issue I'm seeing with one of our applications. Basically, there's a screen in the app that displays equipment records, with a search filter our users can apply. Throughout the day, several users run searches on this equipment screen. Some of these searches might return 2 records, some might return 2,002.
As the day progresses, searches on this screen that performed reasonably well earlier become very sluggish and eventually start timing out.
The underlying SQL that appears to be performing erratically is a call to the system sp_cursoropen procedure. The select statement for this equipment screen, along with the where clause containing the filter values, is passed to the procedure as the statement argument.
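For reference, the captured call looks roughly like the following. The table and column names here are placeholders, not the vendor's actual schema, and the filter values are embedded in the statement text as literals, which is what I see in the trace:

```sql
DECLARE @cursor int, @rowcount int;

-- Paraphrased example of the captured call; scrollopt/ccopt values
-- vary and the statement text below is illustrative only.
EXEC sp_cursoropen
    @cursor OUTPUT,
    N'SELECT e.EquipmentId, e.SerialNumber, e.Status
      FROM dbo.Equipment AS e
      JOIN dbo.Location AS l ON l.LocationId = e.LocationId
      WHERE e.Status = N''Active'' AND l.Region = N''West''',
    1,              -- scrollopt (cursor scroll options)
    1,              -- ccopt (concurrency options)
    @rowcount OUTPUT;
```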
I have noticed that rebuilding indexes on a couple of the tables referenced in that query can improve performance temporarily. Based on my limited SQL knowledge, my current assumption is that the issue is related to parameter sniffing and the caching of bad execution plans, which are then being invalidated by the index rebuilds.
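When things get slow, I've been inspecting the plan cache with a query along these lines (my DMV knowledge is limited, so this may not be the best approach; the `%Equipment%` filter is just a guess at how to find the statement):

```sql
-- Look for cached plans whose statement text mentions the equipment tables,
-- sorted by average elapsed time per execution.
SELECT TOP (20)
    qs.execution_count,
    qs.creation_time,
    qs.total_elapsed_time / qs.execution_count AS avg_elapsed_us,
    st.text       AS statement_text,
    qp.query_plan
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle)   AS st
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS qp
WHERE st.text LIKE N'%Equipment%'
ORDER BY avg_elapsed_us DESC;
```

The creation_time column is what made me suspect plan caching: the slow plans seem to date back to earlier in the day, before the searches degraded.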
At this point, I'm looking for any suggestions on how one might generally resolve such an issue. The application is a third-party product, so we are limited in our ability to customize it.
As a short-term band-aid, is sp_recompile, when run against a table, a preferable and less disruptive way to invalidate a bad execution plan than rebuilding indexes during work hours?
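For context, what I have in mind is simply running something like this against the affected tables during the day instead of a full rebuild (table name hypothetical):

```sql
-- Marks all cached plans that reference the table for recompilation
-- on their next execution; does not touch the indexes or data themselves.
EXEC sp_recompile N'dbo.Equipment';
```

My (possibly naive) understanding is that this only flushes the relevant plans, whereas an online index rebuild churns the data pages and generates a lot of log activity.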
Does the behavior I'm describing sound like a bad execution plan that might be attributable to parameter sniffing, or might there be other possibilities I should be looking to explore?
Do queries with a greater number of joins have a greater likelihood of ending up with a bad execution plan?
I greatly appreciate the help!