April 17, 2025 at 4:53 pm
Why would using variables in place of hard-coded numbers impact a query's performance? The code below is just an example of what I'm asking. In my case I have the same query twice; the only difference is that one has the values hard-coded in the criteria and the other uses variables that store those same numbers. The query with the hard-coded values runs in under 1 second, whereas the one using variables takes around 12 seconds. If I use the Execution Plan feature in SSMS, it says the query (using the variables) is missing an index and of course it makes a suggestion on what to add.
I'm just baffled by this because I have never seen the use of variables in place of hard-coded values do this. I did find a post on this site from 2016 where Gail Shaw said "It's parameter sniffing, or more specifically it's lack of parameter sniffing", but no resolution was shared in the thread. There were some suggestions to use a stored procedure with parameters, but that's it. Jeff Moden (in case you read this, Jeff) posted a few months later asking if a solution was ever found and if it could be shared, but that's it. Since it's been almost 10 years since then, maybe one of you has the answer to this?
FYI: In case it's relevant, the data type the variables are using, NUMERIC(18,0), is the same as T.Key.
Thanks in advance for any assistance you can provide
DECLARE @MinVal NUMERIC(18,0), @MaxVal NUMERIC(18,0);
SELECT @MinVal = 123456780, @MaxVal = 123456789;

-- Query using variables (~12 seconds)
SELECT T.Col1, T.Col2, T.colX
FROM MYTABLE T
WHERE T.KEY BETWEEN @MinVal AND @MaxVal;

-- Query using hard-coded values (<1 second)
SELECT T.Col1, T.Col2, T.colX
FROM MYTABLE T
WHERE T.KEY BETWEEN 123456780 AND 123456789;
Kindest Regards,
Just say No to Facebook!
April 17, 2025 at 7:49 pm
Are the execution plans different? I would look there first. And test it out on a dev instance so you can do terrible things like DBCC FREEPROCCACHE, which you should not do on a live server.
April 17, 2025 at 8:14 pm
Thanks for replying. Unfortunately we have only a live environment, and I have DB-level access only because this is a cloud-based setup; even though we have a private cloud with this vendor, we are still limited to DB-level access.
I'll have to go back and re-check the plans. We close early today and I'm headed out for Easter holiday so I'll have to look at the plans later.
Thanks again
Kindest Regards,
Just say No to Facebook!
April 20, 2025 at 3:09 pm
SQL Server maintains statistics that "tell" it how many rows fall within given ranges of key values (histogram data). For hard-coded values, SQL Server can pick a plan that matches the specific value(s) given. For a local variable, the value isn't known at compile time, so SQL Server has to use a more generic plan based on average density rather than the histogram.
For example, say the hard-coded value(s) match only 1 row: SQL Server then chooses to do a seek with row lookups. But if the variable range actually covers 100,000 rows, SQL Server may still do lookups when a scan, a join, or some other method would have been more efficient.
I think OPTION (RECOMPILE) helps with that, at the cost of recompile time, naturally. I've found it's usually worth it.
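For the query from the original post, that would look like this (a sketch using the OP's table and column names; the values are illustrative):

```sql
DECLARE @MinVal NUMERIC(18,0) = 123456780,
        @MaxVal NUMERIC(18,0) = 123456789;

SELECT T.Col1, T.Col2, T.colX
FROM MYTABLE T
WHERE T.KEY BETWEEN @MinVal AND @MaxVal
OPTION (RECOMPILE);  -- statement is recompiled at execution, so the current
                     -- variable values are visible and the histogram is used
```

The trade-off is a compile on every execution, which is negligible next to a 12-second query but can add up for something run thousands of times a minute.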
SQL DBA,SQL Server MVP(07, 08, 09) "It's a dog-eat-dog world, and I'm wearing Milk-Bone underwear." "Norm", on "Cheers". Also from "Cheers", from "Carla": "You need to know 3 things about Tortelli men: Tortelli men draw women like flies; Tortelli men treat women like flies; Tortelli men's brains are in their flies".
April 29, 2025 at 11:44 pm
"If I use the Execution Plan feature in SSMS, it says the query (using the variables) is missing an index and of course it makes a suggestion on what to add."
My question is, what index does the "hard values" query use? I would think that they would both use the same index unless there is a filtered index that's being used by the "hard values" version. I find it unlikely that you would have a filtered index for the range of 123456780 to 123456789, but maybe that key range is significant for you all and is used enough to create a filtered index for it. Filtered indexes only work with hard-coded values, not variables, as the criteria has to match what is in the filtered index exactly.
I think a filtered index is unlikely, but comparing the index used in the "hard values" version to the suggested/missing index in the variable version may shed some light on what is going on. Plus, adding the suggested/missing index, or a version of it, may resolve your performance issues. I usually compare the suggested/missing index to the existing indexes for the table to see if any of the existing indexes can be modified to serve this query's needs while still fulfilling the original needs of the existing index.
I was about to suggest checking statistics for the KEY column and its sample percentage, but that wouldn't help with variables being used as the variables aren't evaluated during the execution plan creation.
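One way to do that comparison without digging through the plan pane is to query the missing-index DMVs alongside the table's existing indexes. A sketch, assuming MYTABLE is the table name from the original post (swap in the real schema and table):

```sql
-- Existing indexes on the table, their key/included columns, and any filters
SELECT i.name AS index_name, i.has_filter, i.filter_definition,
       c.name AS column_name, ic.key_ordinal, ic.is_included_column
FROM sys.indexes i
JOIN sys.index_columns ic
     ON ic.object_id = i.object_id AND ic.index_id = i.index_id
JOIN sys.columns c
     ON c.object_id = ic.object_id AND c.column_id = ic.column_id
WHERE i.object_id = OBJECT_ID('dbo.MYTABLE')
ORDER BY i.name, ic.key_ordinal;

-- What the optimizer thinks is missing for the same table
SELECT d.equality_columns, d.inequality_columns, d.included_columns,
       s.avg_user_impact, s.user_seeks
FROM sys.dm_db_missing_index_details d
JOIN sys.dm_db_missing_index_groups g
     ON g.index_handle = d.index_handle
JOIN sys.dm_db_missing_index_group_stats s
     ON s.group_handle = g.index_group_handle
WHERE d.object_id = OBJECT_ID('dbo.MYTABLE');
```

If an existing index already leads on KEY, the missing-index suggestion is probably about included columns (covering Col1/Col2/colX) rather than the key itself.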
If index work doesn't help then ScottPletcher's advice is solid and may help your issue.
May 1, 2025 at 10:56 am
It might be parameter sniffing.
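More precisely, with local variables it's the lack of sniffing: the batch is compiled before the variables are assigned, so the optimizer never sees the values. If a stored procedure isn't an option, one workaround is to run the statement as a parameterized query via sp_executesql, where the parameter values are visible at compile time. A sketch using the names from the original post:

```sql
EXEC sys.sp_executesql
    N'SELECT T.Col1, T.Col2, T.colX
      FROM MYTABLE T
      WHERE T.KEY BETWEEN @MinVal AND @MaxVal',
    N'@MinVal NUMERIC(18,0), @MaxVal NUMERIC(18,0)',
    @MinVal = 123456780, @MaxVal = 123456789;
```

Caveat: the parameterized plan gets cached and reused, which re-introduces classic parameter sniffing; that cuts both ways if the ranges you pass vary a lot in row count.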
May 1, 2025 at 12:18 pm
I don't recall the index because I ran this test on a number of large tables and the outcome was always the same. It could be that all the tables had similarly configured indexes, but I believe the real issue is the parameter-sniffing explanation from Gail Shaw.
Kindest Regards,
Just say No to Facebook!