Using a Variable for an IN Predicate

• Comments posted to this topic are about the item Using a Variable for an IN Predicate

• Todd, thank you for the article.

Did you try a UDF that returns a table?

Cheers

• To keep it simple with less code (but only for short in-lists):

Declare @inStr varchar(100)

Set @inStr = '1,2,4,6'

select * from testtable

where ',' + @inStr + ',' like '%,' + cast([column] as varchar(10)) + ',%'

• Nice article; it contains good examples and descriptions.

Thanks

• But if you are having to use a table anyway, would you not just do a join instead of an IN?
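For reference, once the list is in a table the two forms are interchangeable; a minimal sketch (table and column names are illustrative, not from the article):

```sql
-- Assume #list holds the split-out values (illustrative names throughout)
CREATE TABLE #list (Id INT PRIMARY KEY);
INSERT INTO #list (Id) VALUES (1), (2), (4), (6);

-- IN form
SELECT t.*
FROM testtable t
WHERE t.Id IN (SELECT l.Id FROM #list l);

-- Equivalent join form
SELECT t.*
FROM testtable t
INNER JOIN #list l ON l.Id = t.Id;
```

With a unique key on the list table, the optimizer typically produces the same plan for both.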

• You should absolutely, positively NOT use the CTE for this scenario unless you are guaranteed to have VERY few rows split out AND you do not have any significant data skew. If you have either or both of those, you will get screwed with a bad plan at least some of the time: either a nested loop index seek plan with a kajillion rows, or a scan/hash plan with a few rows. The optimizer can't have any idea how many rows are coming out of the split. Nor can the optimizer have statistics on the VALUES of the row(s) coming out of the split.

Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012

• Thanks Todd - great article. My company uses a similar function for splitting, but I simply have not had the need/opportunity to use it.

I have an upcoming task in which I have to modify a few stored procedures to accept a list for several of their arguments rather than a single value so this will work great. Very timely article for me! 😀

• Hey Todd

Re: splitting a string to a table variable "array" - give this a try:

DECLARE @values TABLE (value VARCHAR(10))

DECLARE @xml AS XML,@str AS VARCHAR(100),@delimiter AS VARCHAR(10)

SET @str = 'A,B,C,D,E'

SET @delimiter =','

SET @xml = CAST(('<X>'+REPLACE(@str,@delimiter ,'</X><X>')+'</X>') AS XML)

INSERT INTO @values (value)

SELECT N.value('.', 'varchar(10)') AS value FROM @xml.nodes('X') as T(N)

SELECT * FROM @values

Cheers,

Ken

PS I can't claim to have come up with this - I found it somewhere on the net.

• And of course those of us with SQL 2008 can pass tables as stored procedure parameters now 🙂

--Mark Tassin
MCITP - SQL Server DBA
Proud member of the Anti-RBAR alliance.

• Good article.

I still think, though, that using User-Defined Table Types is easier and faster.
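A minimal sketch of the user-defined table type approach (SQL Server 2008+; the type, procedure, and column names are illustrative):

```sql
-- Define a reusable table type
CREATE TYPE dbo.IntList AS TABLE (Id INT PRIMARY KEY);
GO
-- A procedure that accepts the type as a read-only table-valued parameter
CREATE PROCEDURE dbo.GetTestRows
    @Ids dbo.IntList READONLY
AS
BEGIN
    SELECT t.*
    FROM testtable t
    INNER JOIN @Ids i ON i.Id = t.Id;
END
GO
-- Caller fills a variable of the type and passes it in
DECLARE @Ids dbo.IntList;
INSERT INTO @Ids (Id) VALUES (1), (2), (4), (6);
EXEC dbo.GetTestRows @Ids = @Ids;
```

No string splitting is needed at all, and client libraries such as ADO.NET can populate the parameter directly.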

• irozenberg (6/20/2011)

Todd, thank you for the article.

Did you try a UDF that returns a table?

Cheers

The Delimited8KSplit is a UDF that returns a table. The last example bypasses the creation of a temp table and uses the UDF in a CTE. I could have modified the UDF to return an INT and used it directly, but I thought it would be better just to use Jeff's UDF the way it was, convert to INT and use that.
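For readers following along, the pattern Todd describes looks roughly like this (the splitter's output column name and the target table's columns are assumptions; check your copy of the function):

```sql
DECLARE @list VARCHAR(8000) = '1,2,4,6';

-- Wrap the splitter UDF in a CTE, converting each element to INT
WITH Ids AS (
    SELECT CAST(s.Item AS INT) AS Id   -- column name 'Item' assumed
    FROM dbo.Delimited8KSplit(@list, ',') s
)
SELECT t.*
FROM testtable t
INNER JOIN Ids ON Ids.Id = t.Id;
```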

Todd Fifield

• Sarus-127369 (6/21/2011)

To keep it simple with less code (but only for short in-lists):

Declare @inStr varchar(100)

Set @inStr = '1,2,4,6'

select * from testtable

where ',' + @inStr + ',' like '%,' + cast([column] as varchar(10)) + ',%'

Sarus,

I've used that type of LIKE before. Sometimes it works just fine, but it doesn't scale very well, and the examples I used all had indexes on the column. A LIKE with a leading '%' can't use an index seek.

Todd Fifield

• TheSQLGuru (6/21/2011)

You should absolutely, positively NOT use the CTE for this scenario unless you are guaranteed to have VERY few rows split out AND you do not have any significant data skew. If you have either or both of those, you will get screwed with a bad plan at least some of the time: either a nested loop index seek plan with a kajillion rows, or a scan/hash plan with a few rows. The optimizer can't have any idea how many rows are coming out of the split. Nor can the optimizer have statistics on the VALUES of the row(s) coming out of the split.

Kevin,

I thought about that before I tested the CTE version. For the small number of table elements it worked just fine. As I stated in the article, testing should be done to check performance. I realize that CTE's don't have statistics.

In a production environment where it's likely that there could be many elements, I probably wouldn't even bother to benchmark the CTE version. I just brought it up as a possibility.

Todd Fifield
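One common middle ground for Kevin's concern, sketched here with assumed names: materialize the split into a temp table first, since temp tables (unlike CTEs or table variables) carry statistics the optimizer can use.

```sql
DECLARE @list VARCHAR(8000) = '1,2,4,6';

-- Materialize the split so the optimizer gets real row counts and statistics
SELECT CAST(s.Item AS INT) AS Id       -- column name 'Item' assumed
INTO #ids
FROM dbo.Delimited8KSplit(@list, ',') s;

SELECT t.*
FROM testtable t
INNER JOIN #ids i ON i.Id = t.Id;

DROP TABLE #ids;
```

The extra INSERT costs a little up front, but the join can then be costed on actual cardinality rather than a blind guess.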

• Hey Todd

Re: splitting a string to a table variable "array" - give this a try:

DECLARE @values TABLE (value VARCHAR(10))

DECLARE @xml AS XML,@str AS VARCHAR(100),@delimiter AS VARCHAR(10)

SET @str = 'A,B,C,D,E'

SET @delimiter =','

SET @xml = CAST(('<X>'+REPLACE(@str,@delimiter ,'</X><X>')+'</X>') AS XML)

INSERT INTO @values (value)

SELECT N.value('.', 'varchar(10)') AS value FROM @xml.nodes('X') as T(N)

SELECT * FROM @values

Cheers,

Ken

PS I can't claim to have come up with this - I found it somewhere on the net.

Ken,

Thanks for your input. I'll give it a try.

Have you ever benchmarked it against the technique I'm using?

Todd Fifield

• CELKO (6/21/2011)

Some version of this misuse of SQL gets invented again every few months. It is a slow, dangerous mess. Are you familiar with the long parameter list idiom? SQL is a database language, not a tool for writing parsers.

Joe,

I'm very familiar with the fact that SQL is not a language for writing parsers. I worked on an early word processor in PDP-11 assembly language and I've written 2 text editors in C. I've also had to do parsing in COBOL. Believe me, C is the language of choice for parsing.

However, a delimited array of parameters is a very common SQL problem. The idea is to make it easy to program and still be robust both in the front end and back end. I've seen this requirement many, many times and this technique is rock solid.

This technique, in any case, is NOT a parser. All of the parsing is done at the beginning, which converts the string array into a TABLE, which is what SQL Server handles best. That's the whole point of the article: let SQL Server do what it does best, joining to tables and such.

Todd Fifield

Viewing 15 posts - 1 through 15 (of 48 total)