• I also had a similar problem at work.

    I found out that the procs were first being run with totally abnormal (extreme) parameters to test them. By the time the proc was actually used, the query plan had been optimized for the extreme parameters and not for the day-to-day standard values it normally received.
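
    Roughly what was happening - a minimal T-SQL sketch, with made-up table and proc names (not our real ones):

        -- Hypothetical proc: returns the orders for one customer
        CREATE PROCEDURE dbo.GetOrdersByCustomer
            @CustomerId INT
        AS
        BEGIN
            SELECT OrderId, OrderDate, Amount
            FROM dbo.Orders
            WHERE CustomerId = @CustomerId;
        END
        GO

        -- The test harness called it first with an extreme value
        -- (a customer with millions of rows), so the plan that got
        -- cached was the scan-style plan built for that value ...
        EXEC dbo.GetOrdersByCustomer @CustomerId = 1;    -- the "whale" customer

        -- ... and every normal call afterwards reused that same plan
        -- instead of the seek-style plan it would otherwise have got.
        EXEC dbo.GetOrdersByCustomer @CustomerId = 4711; -- typical customer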

    After the tests were rebuilt to execute the procs with *standard* params first, everything worked fine, except for the odd occasion where the proc actually receives real extreme data (of course the query plan is then optimized for the normal data and those queries take longer), but as that happens once in a blue moon I can live with it. I don't know if running a proc with normal params every minute by default would really help ... if the proc is THAT important, its plan shouldn't be chucked out of the cache too often anyway, should it?
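
    (If you want to see how often the plan really does get thrown out, something like this against the plan-cache DMVs shows when the current plan was compiled and how often it has run - rough sketch, needs SQL Server 2008 or later:)

        SELECT OBJECT_NAME(ps.object_id) AS proc_name,
               ps.cached_time,          -- when the current plan was compiled
               ps.last_execution_time,
               ps.execution_count
        FROM sys.dm_exec_procedure_stats AS ps
        WHERE ps.database_id = DB_ID();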

    If the difference in execution is sooo extreme, then I would rather have 2 stored procs - one for extreme data and one for normal data - so that each is optimized as well as it can be for the data it receives (rough sketch below). Of course there would have to be a switch on the front end ... but then again, if it makes the users happy.
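
    Something like this is what I mean - again just a sketch with invented names, and the front end decides which one to call. The bodies are identical; the point is simply that each proc gets its own cached plan, sniffed from the parameters it actually sees:

        -- Plan ends up tuned for the normal, highly selective case
        CREATE PROCEDURE dbo.GetOrders_Normal
            @CustomerId INT
        AS
        BEGIN
            SELECT OrderId, OrderDate, Amount
            FROM dbo.Orders
            WHERE CustomerId = @CustomerId;
        END
        GO

        -- Separate proc for the extreme customers, so its plan is
        -- compiled and cached independently of the normal one
        CREATE PROCEDURE dbo.GetOrders_Extreme
            @CustomerId INT
        AS
        BEGIN
            SELECT OrderId, OrderDate, Amount
            FROM dbo.Orders
            WHERE CustomerId = @CustomerId;
        END
        GO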

    nano