Force Parameterized Queries vs Missing Indexes

  • Working with a third party that would like me to enable the Forced Parameterization option on SQL Server 2008 R2 Standard Edition, with Max Memory set to 10 GB and 4 licensed CPU cores. Only 3 GB of memory is currently in use, and CPU utilization has peaked at 20% over the last 96 hours. They would also like us to set Max Degree of Parallelism to 1. Optimize for Ad hoc Workloads is already enabled.

    There are three other application databases on this server that have not been tested with these options.

    This is an open call for any opinions or personal experience with these settings being changed to what is listed above.

    We are being told by an application vendor that this will resolve the issues the server is having with missing indexes that are causing parameterized queries to do a full table scan on every execution. I can't find documentation that supports this claim, and they cannot provide any.

    Has anyone heard of enabling Forced Parameterization improving the performance of queries that are causing table scans?
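
    Before changing any settings, it may be worth quantifying the vendor's missing-index claim directly. A sketch using SQL Server's missing-index DMVs (stats are instance-wide and reset on restart; the TOP count and ordering formula are arbitrary choices, not an official recipe):

    ```sql
    -- What the optimizer itself has recorded as missing indexes,
    -- roughly ordered by estimated impact.
    SELECT TOP (25)
           mid.statement                     AS table_name,
           mid.equality_columns,
           mid.inequality_columns,
           mid.included_columns,
           migs.user_seeks + migs.user_scans AS times_requested,
           migs.avg_total_user_cost,
           migs.avg_user_impact              -- estimated % cost reduction
    FROM   sys.dm_db_missing_index_details AS mid
    JOIN   sys.dm_db_missing_index_groups AS mig
           ON mig.index_handle = mid.index_handle
    JOIN   sys.dm_db_missing_index_group_stats AS migs
           ON migs.group_handle = mig.index_group_handle
    ORDER BY migs.avg_total_user_cost * migs.avg_user_impact
             * (migs.user_seeks + migs.user_scans) DESC;
    ```

    If this list is full of high-impact entries for the vendor's tables, the scans are an indexing problem regardless of how the queries are parameterized.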

  • Hrm. I wouldn't think that enabling forced parameterization would fix a table scanning issue; from my understanding, that helps in cases where your workload is almost entirely ad-hoc queries that aren't in stored procedures, so you're using literal values on most of your queries. Even in those cases, it might hurt more than it helps, depending on a variety of factors.

    I'd be inclined to say that merely substituting a parameter value in place of a literal value won't fix the core issue of a missing index, as you're just using a stand-in for the literal value. It will make plan reuse easier, but if your queries don't have an index to look up specific values, they're going to have to scan the table every time to find what they need.
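
    To make that concrete: forced parameterization is actually a database-level setting, and all it does is replace literals with parameters at compile time. A sketch (the database and table names are placeholders, not from the original post):

    ```sql
    -- Forced parameterization is set per database, not server-wide:
    ALTER DATABASE MyAppDb SET PARAMETERIZATION FORCED;

    -- With the setting on, a query submitted as
    --   SELECT * FROM dbo.Orders WHERE CustomerId = 42;
    -- is compiled roughly as if it were written
    --   SELECT * FROM dbo.Orders WHERE CustomerId = @p0;
    -- The plan becomes reusable for any CustomerId value, but without
    -- an index on CustomerId, both forms still scan dbo.Orders.
    ```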

    A good dissection of how forced parameterization can be problematic can be found here.

    From that, you can see that, because of parameter sniffing, forced parameterization can utterly crush performance against indexes. Of course, if you don't have indexes, that's a minimized issue :hehe:. However, if you do have indexes, just not ones that are optimal, it could make things even worse!

    Naturally, if you can test your system on a testing environment against representative workloads first, do so; that will be the best means of finding out whether the setting will truly work out or not. Otherwise, activating the setting could be quite problematic.

    Also, setting MAXDOP to 1 server-wide is usually a rather poor suggestion as well; typically, increasing the cost threshold for parallelism is a much better idea. Setting the cost threshold to 30 as a testing base is typically a good idea, but, again, experimentation in a testing environment to determine the best fit is preferable.
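
    As a sketch of that suggestion (30 is only a testing baseline, as noted above, and should be validated against your own workload):

    ```sql
    -- Raise the cost threshold instead of forcing MAXDOP = 1.
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'cost threshold for parallelism', 30;  -- test baseline
    RECONFIGURE;
    -- Leave 'max degree of parallelism' at its default of 0 (all cores)
    -- unless testing shows parallelism itself is the problem.
    ```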

    - 😀

  • As far as I'm concerned, your vendor hasn't provided any evidence to justify making any changes on your server.

    Are you experiencing performance issues, and how do you know?

    Are users complaining about performance?

    What are the issues/symptoms that led to these suggestions from the vendor?

    Having said that and asked those questions, here's what I think about the recommendations:

    I wouldn't set forced parameterization on to fix an issue with table scans. The fact that queries aren't parameterized isn't what's causing table scans; the problem is incorrect or non-existent indexes.

    As far as MAXDOP goes, first I'd want to verify that parallelism is actually causing a problem. Then I'd start by configuring it according to this: http://support.microsoft.com/kb/2806535, and set Cost Threshold for Parallelism to something higher than the default. Brent Ozar Unlimited recommends 50 (http://www.brentozar.com/archive/2013/09/five-sql-server-settings-to-change/), but I think that's a little high. Jonathan Kehayias also has a query he uses to find parallel plans in the cache and adjust the threshold based on that: http://www.sqlskills.com/blogs/jonathan/tuning-cost-threshold-for-parallelism-from-the-plan-cache/
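
    A rough sketch of the idea behind the Kehayias approach (this is not his exact query, just the general pattern): pull parallel plans out of the plan cache with their estimated costs, then set the threshold just above the cheap queries that don't benefit from parallelism.

    ```sql
    -- Find cached parallel plans and their estimated subtree costs.
    WITH XMLNAMESPACES
      (DEFAULT 'http://schemas.microsoft.com/sqlserver/2004/07/showplan')
    SELECT TOP (20)
           qp.query_plan.value('(//StmtSimple/@StatementSubTreeCost)[1]',
                               'float') AS subtree_cost,
           cp.usecounts,
           st.text
    FROM   sys.dm_exec_cached_plans AS cp
    CROSS APPLY sys.dm_exec_query_plan(cp.plan_handle) AS qp
    CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle)   AS st
    WHERE  qp.query_plan.exist('//RelOp[@Parallel="1"]') = 1
    ORDER BY subtree_cost DESC;
    ```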

  • Andrew Kernodle (7/16/2014)


    Thanks for your reply. This is everything that I said, and everything any documentation has said, but they are telling everyone we are wrong.

  • Gah. Yep, that sucks. Our company uses vendor software too, and the vendors insist the same thing about my commentary on NOLOCK being added to everything in the code :(.

    I'm not sure if it would help your case any at all, but if you can do the testing with forced parameterization, capture the difference in performance across several different types of workloads, and present it to the vendor, you may be able to argue your case better. Or they may totally ignore you and cite something nonsensical as "proof" of the soundness of their advice, painfully enough.
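
    One simple way to capture those before/after numbers for a representative query (run in a test environment; the table and predicate here are placeholders):

    ```sql
    SET STATISTICS IO ON;
    SET STATISTICS TIME ON;

    SELECT * FROM dbo.Orders WHERE CustomerId = 42;  -- hypothetical query

    SET STATISTICS IO OFF;
    SET STATISTICS TIME OFF;
    -- Compare logical reads and CPU/elapsed time with forced
    -- parameterization OFF vs ON for the same workload.
    ```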

    Largely, it may come down to how far you're willing to tussle with the vendors (or how much leeway you have with management to do so). It's disheartening that things can be that way in some shops, but if you find that the results are going to be disastrous, try your best to argue your case. If all else fails, and the software comes crashing down later, you can point out (humbly!) that you were trying to prevent exactly that!

    - 😀

  • It may be a moot point, but has the vendor, or have you, done any index analysis? Look at Brent Ozar's sp_BlitzIndex; it's one of the best tools you can find for this. It may or may not help your argument, but it may help decipher another issue.
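
    For anyone unfamiliar with it: sp_BlitzIndex is a free stored procedure from Brent Ozar's First Responder Kit. Once it's installed on the instance, a typical call looks something like this (the database name is a placeholder):

    ```sql
    EXEC dbo.sp_BlitzIndex @DatabaseName = 'MyAppDb';
    ```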
