April 6, 2015 at 7:10 am
Hi,
I have a SQL statement that is running against a VERY large table that has an index on the "fmentityobjectid" column.
Here is my WHERE clause:
WHERE mytable.fmentityobjectid IN ( 99898, 393848, 383883, 12345, 738979, 83839, 11111111)
The IN list can have up to 200 items. I am finding my query is going parallel, causing it to run VERY slowly. I have changed my query to join to the list instead:
JOIN (SELECT *
      FROM (VALUES (99898), (393848), (383883), (12345),
                   (738979), (83839), (11111111)
           ) AS entityobjects (fmentityobjectid)
     ) AS x
  ON mytable.fmentityobjectid = x.fmentityobjectid
The query optimizer seems to pick a better plan, and the query does run faster (no parallelism). In general, is it better to join to a list of values rather than using WHERE ... IN (...)? Any feedback would be greatly appreciated.
Thanks,
Rob
April 6, 2015 at 8:45 am
I'll go one further. I'd suggest taking a look at Jeff Moden's article on Tally Tables and using that to take the comma-delimited list and turn it directly into a table.
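Something like this minimal tally-style splitter, just a sketch rather than the full code from the article; @list here is a hypothetical parameter standing in for your comma-delimited values:

-- Sketch of a tally-style splitter (not the article's full code).
-- @list is a hypothetical parameter holding the comma-delimited values.
DECLARE @list varchar(8000) = '99898,393848,383883,12345,738979,83839,11111111';

WITH E1(n) AS (SELECT 1 FROM (VALUES (1),(1),(1),(1),(1),(1),(1),(1),(1),(1)) t(n)),
     Tally(n) AS (SELECT TOP (ISNULL(DATALENGTH(@list), 0))
                         ROW_NUMBER() OVER (ORDER BY (SELECT NULL))
                  FROM E1 a CROSS JOIN E1 b CROSS JOIN E1 c CROSS JOIN E1 d)
SELECT CAST(SUBSTRING(@list, n, CHARINDEX(',', @list + ',', n) - n) AS int)
           AS fmentityobjectid
FROM Tally
WHERE SUBSTRING(',' + @list, n, 1) = ','; -- n marks the start of each element

From there you can join the result (or dump it into an indexed temp table) instead of using the long IN list.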
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
April 6, 2015 at 12:03 pm
Quick questions: can you post the actual execution plans for the two queries? Are the statistics up to date?
😎
There are quite a few things that can affect this kind of query, including the statistics, total cardinality, match/hit ratio, number of columns in the output vs. the number of columns covered by an index, etc. My default choice would be WHERE ... IN, that is, IF everything is up to date.
April 6, 2015 at 12:21 pm
Grant Fritchey (4/6/2015)
I'll go one further. I'd suggest taking a look at Jeff Moden's article on Tally Tables and using that to take the comma-delimited list and turn it directly into a table.
+1
If you can get the contents of the list into a table's indexed column, you should be able to get generally good performance.
Don Simpson
April 6, 2015 at 12:29 pm
DonlSimpson (4/6/2015)
Grant Fritchey (4/6/2015)
I'll go one further. I'd suggest taking a look at Jeff Moden's article on Tally Tables and using that to take the comma-delimited list and turn it directly into a table.
+1
If you can get the contents of the list into a table's indexed column, you should be able to get generally good performance.
Quick thought: the Constant Scan will have a pretty good cardinality estimate, where the Tally table estimate is more of a lottery. Thinking about it, maybe OPTION (RECOMPILE) is worth exploring?
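Just a sketch, using the join form and the table/column names from the first post:

SELECT mt.*
FROM mytable AS mt
JOIN (VALUES (99898), (393848), (383883), (12345),
             (738979), (83839), (11111111)
     ) AS x (fmentityobjectid)
  ON mt.fmentityobjectid = x.fmentityobjectid
OPTION (RECOMPILE); -- compile the plan for the actual list values each run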
😎
April 6, 2015 at 12:56 pm
Just to be sure you get the best lookup, insert the individual values from the IN list into a table uniquely clustered on that value. Then, again just to be sure, update the statistics on that table. Then join to the big table (sketched after the setup code below). If you need to, explicitly specify OPTION (MAXDOP 1) in the final query to prevent a parallel plan.
CREATE TABLE #in_list ( fmentityobjectid int PRIMARY KEY )
INSERT INTO #in_list
VALUES (99898), (393848), (383883), (12345), (738979), (83839), (11111111)
UPDATE STATISTICS #in_list WITH FULLSCAN
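Then the final lookup would look something like this, assuming the table and column names from the first post; the column list is just a placeholder:

SELECT mt.* -- placeholder; select only the columns you actually need
FROM mytable AS mt
INNER JOIN #in_list AS il
    ON mt.fmentityobjectid = il.fmentityobjectid
OPTION (MAXDOP 1); -- only if you still need to force a serial plan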
SQL DBA, SQL Server MVP (07, 08, 09)
"It's a dog-eat-dog world, and I'm wearing Milk-Bone underwear." - "Norm", on "Cheers".
Also from "Cheers", from "Carla": "You need to know 3 things about Tortelli men: Tortelli men draw women like flies; Tortelli men treat women like flies; Tortelli men's brains are in their flies".