
The Multi-valued Parameters problem in Reporting Services
Posted Friday, August 13, 2010 1:44 PM
Forum Newbie


Group: General Forum Members
Last Login: Wednesday, May 11, 2011 12:24 PM
Points: 5, Visits: 16
I have used this code many times since I first read this article. Now, how do I create multiple variables within the same stored proc?
Post #969240
Posted Friday, August 13, 2010 1:49 PM
SSC Rookie


Group: General Forum Members
Last Login: Wednesday, September 17, 2014 5:58 PM
Points: 27, Visits: 306
Just declare as many variables as you need, convert the report parameters to delimited strings, send them to the sproc, and use the same scenario for each. Let me know if you have any problems.
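For example, a report sproc that takes two multi-valued parameters as delimited strings might look something like this (a rough sketch only; the table and column names are placeholders, and it assumes a table-valued splitter such as the StringToTable function posted below):

-- Hypothetical sproc with two multi-valued report parameters, each passed as a
-- comma-delimited string (e.g. =Join(Parameters!State.Value, ",") in SSRS).
CREATE PROCEDURE dbo.usp_SalesByStateAndYear
        @StateList nvarchar(4000),   -- e.g. 'OH,TX,CA'
        @YearList  nvarchar(4000)    -- e.g. '2009,2010'
AS
BEGIN
    SET NOCOUNT ON;

    SELECT s.StateCode, s.SalesYear, SUM(s.Amount) AS TotalAmount
      FROM dbo.Sales s
     WHERE s.StateCode IN (SELECT SplitedValue FROM dbo.StringToTable(@StateList, ','))
       AND s.SalesYear IN (SELECT CAST(SplitedValue AS int) FROM dbo.StringToTable(@YearList, ','))
     GROUP BY s.StateCode, s.SalesYear;
END
GO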
Post #969246
Posted Friday, August 13, 2010 1:52 PM
Forum Newbie


Group: General Forum Members
Last Login: Wednesday, May 11, 2011 12:24 PM
Points: 5, Visits: 16
Thanks for the quick reply. I'm having a little trouble getting the order of the code correct. I can get the DECLARE and SET statements to work, but the CTE is tripping me up.
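If the sticking point is where the CTE goes: a CTE has to be written immediately before the single statement that uses it, so the DECLARE/SET work comes first and the statement right before the WITH must end with a semicolon. A minimal standalone sketch (sample values only):

-- Declarations and assignments come first; the SET before WITH is terminated
-- with a semicolon, and the SELECT that uses the CTE follows it directly.
DECLARE @RawString nvarchar(4000),
        @Delimiter nvarchar(10);
SET @RawString = N'A,B,C,D';
SET @Delimiter = N',';

WITH CSVCte (StartPos, EndPos) AS
(
    SELECT 1, CHARINDEX(@Delimiter, @RawString + @Delimiter)
    UNION ALL
    SELECT EndPos + 1, CHARINDEX(@Delimiter, @RawString + @Delimiter, EndPos + 1)
      FROM CSVCte
     WHERE CHARINDEX(@Delimiter, @RawString + @Delimiter, EndPos + 1) <> 0
)
SELECT SUBSTRING(@RawString, StartPos, EndPos - StartPos) AS SplitedValue
  FROM CSVCte;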
Post #969248
Posted Friday, August 13, 2010 2:17 PM
SSC Rookie


Group: General Forum Members
Last Login: Wednesday, September 17, 2014 5:58 PM
Points: 27, Visits: 306
Install the function (code below) and use it to split a value.

CREATE FUNCTION [dbo].[StringToTable]
(
    @RawString nvarchar(4000),
    @Delimiter nvarchar(10)
)
RETURNS @VList TABLE (RowID int IDENTITY(1,1), SplitedValue varchar(200))
AS
BEGIN
    WITH CSVCte (StartPos, EndPos) AS
    (
        SELECT 1 AS StartPos, CHARINDEX(@Delimiter, @RawString + @Delimiter) AS EndPos
        UNION ALL
        SELECT EndPos + 1 AS StartPos, CHARINDEX(@Delimiter, @RawString + @Delimiter, EndPos + 1) AS EndPos
          FROM CSVCte
         WHERE CHARINDEX(@Delimiter, @RawString + @Delimiter, EndPos + 1) <> 0
    )
    INSERT INTO @VList (SplitedValue)
    SELECT SUBSTRING(@RawString, StartPos, EndPos - StartPos)
      FROM CSVCte

    RETURN
END
GO

-- sample call: select SplitedValue from dbo.StringToTable('A,B,C,D,E,F,G', ',')
Post #969263
Posted Saturday, August 14, 2010 11:23 AM


SSC-Dedicated


Group: General Forum Members
Last Login: Yesterday @ 10:33 PM
Points: 35,259, Visits: 31,740
Alex Grinberg-230981 (8/13/2010)
Install the function and use it to split a value. [StringToTable code snipped; see the previous post]


That's a different take on using a recursive CTE to do it. I'll have to check the performance, though, because recursive CTEs tend to be nothing more than "Hidden RBAR" with all the performance problems associated with RBAR.
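One practical limit worth noting with the recursive approach, separate from the speed question: a recursive CTE stops at 100 recursion levels by default (standard SQL Server behavior), so a parameter with more than about 100 elements will simply fail. A quick way to see it, assuming the function above is installed:

-- 150 elements: the default maximum recursion (100) is exhausted and the call fails.
-- A tally-table splitter such as DelimitedSplit8K (posted later in this thread) has no such limit.
SELECT SplitedValue
  FROM dbo.StringToTable(REPLICATE('X,', 150), ',');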


--Jeff Moden
"RBAR is pronounced "ree-bar" and is a "Modenism" for "Row-By-Agonizing-Row".

First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column."

(play on words) "Just because you CAN do something in T-SQL, doesn't mean you SHOULDN'T." --22 Aug 2013

Helpful Links:
How to post code problems
How to post performance problems
Post #969407
Posted Saturday, August 14, 2010 12:00 PM


SSC-Dedicated


Group: General Forum Members
Last Login: Yesterday @ 10:33 PM
Points: 35,259, Visits: 31,740
Ok... here we go with a test for Alex's function. I was right about the CTE still being "Hidden RBAR" even though it uses all variables inside a Table Valued Function.

Here's the data generation code again...
--=================================================================================================
-- Create a CSV test table with 10000 rows and 10 random CSV elements per row.
-- The randomness of the elements also prevents the delimiters from showing up in the same
-- position for each row. SQL Server would figure that out and cache the information making
-- some splitting techniques seem faster than they really are.
-- This section of the code takes just a couple of seconds to run because XML concatenation
-- is very fast (especially when compared to XML splitting or shredding).
-- Jeff Moden
--=================================================================================================
--===== Conditionally drop the test table to make reruns easier
IF OBJECT_ID('TempDB..#CsvTest','U') IS NOT NULL
DROP TABLE #CsvTest

--===== This creates and populates a test table on the fly containing a
     -- sequential column and a randomly generated CSV Parameter column.
 SELECT TOP (10000) --Controls the number of rows in the test table
        ISNULL(ROW_NUMBER() OVER (ORDER BY (SELECT NULL)),0) AS RowNum,
        (
         SELECT CAST(STUFF( --=== STUFF gets rid of the leading comma
                ( --=== This builds a CSV row with a leading comma
                 SELECT TOP (10) --Controls the number of CSV elements in each row
                        ','+CAST(ABS(CHECKSUM(NEWID()))%100000 AS VARCHAR(10))
                   FROM Master.sys.All_Columns ac3       --Classic cross join pseudo-cursor
                  CROSS JOIN Master.sys.All_Columns ac4  --can produce row sets of up to 16 million rows
                  WHERE ac3.Object_ID = ac1.Object_ID    --Without this, all rows would be the same
                    FOR XML PATH('')
                )
                ,1,1,'') AS VARCHAR(8000))
        ) AS CsvParameter
   INTO #CsvTest
   FROM Master.sys.All_Columns ac1       --Classic cross join pseudo-cursor
  CROSS JOIN Master.sys.All_Columns ac2  --can produce row sets of up to 16 million rows

--===== Let's add a PK just for grins. Since it's a temp table, we won't name it.
  ALTER TABLE #CsvTest
    ADD PRIMARY KEY CLUSTERED (RowNum) WITH FILLFACTOR = 100

--===== Sanity check... let's see what we've got in the test table
SELECT * FROM #CsvTest


Here's the test code comparing the method I've demo'd several times with Alex's recursive CTE...
--===== DelimitedSplit8k solution
 SELECT test.RowNum, split.ItemNumber, split.Item
   INTO #Result1
   FROM #CsvTest test
  CROSS APPLY
        (
         SELECT ItemNumber, Item
           FROM dbo.DelimitedSplit8k(test.CsvParameter,',')
        ) split
;
GO
--===== Alex's recursive CTE solution
 SELECT test.RowNum, split.RowID, split.SplitedValue
   INTO #Result2
   FROM #CsvTest test
  CROSS APPLY
        (
         SELECT RowID, SplitedValue
           FROM dbo.[StringToTable](test.CsvParameter,',')
        ) split
;
GO


And here are the results from the Profiler run. I'd strongly recommend that recursive CTE methods be avoided at all costs because they cost so much and are so very slow. This difference was on only 10,000 rows of 10 elements each (100,000 rows of total output), which can easily represent the number of hits on a busy server in a second or two.



Of course, this is one of the places where a CLR comes in handy. It'll be about twice as fast in this case.
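If you want to reproduce the comparison without Profiler, wrapping the two splits in STATISTICS TIME gives comparable duration and CPU numbers (a minimal sketch; it assumes the #CsvTest table and both splitter functions from above have already been created):

--===== Rough timing harness (not the original Profiler trace)
IF OBJECT_ID('tempdb..#Result1','U') IS NOT NULL DROP TABLE #Result1;
IF OBJECT_ID('tempdb..#Result2','U') IS NOT NULL DROP TABLE #Result2;

SET STATISTICS TIME ON;

--===== DelimitedSplit8k (tally-table) split
SELECT test.RowNum, split.ItemNumber, split.Item
  INTO #Result1
  FROM #CsvTest test
 CROSS APPLY dbo.DelimitedSplit8k(test.CsvParameter,',') split;

--===== StringToTable (recursive CTE) split
SELECT test.RowNum, split.RowID, split.SplitedValue
  INTO #Result2
  FROM #CsvTest test
 CROSS APPLY dbo.StringToTable(test.CsvParameter,',') split;

SET STATISTICS TIME OFF;
-- CPU time and elapsed time for each statement appear on the Messages tab.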




  Post Attachments 
new-2.gif (135 views, 3.28 KB)
Post #969409
Posted Saturday, August 14, 2010 12:09 PM


SSC-Dedicated


Group: General Forum Members
Last Login: Yesterday @ 10:33 PM
Points: 35,259, Visits: 31,740
Ah... sorry, my apologies. I thought I'd posted the 8k splitter code on this thread before. Guess not. So, here it is. It's mostly comments to explain the history and operation of the code. The code itself is actually very short and very, very fast...

 CREATE FUNCTION dbo.DelimitedSplit8K
/***************************************************************************************************
Purpose:
Split a given string at a given delimiter and return a list of the split elements (items).

Returns:
iTVF containing the following:
ItemNumber = Element position of Item as a BIGINT (not converted to INT to eliminate a CAST)
Item = Element value as a VARCHAR(8000)

CROSS APPLY Usage Example:
---------------------------------------------------------------------------------------------------
--===== Conditionally drop the test tables to make reruns easier for testing.
-- (this is NOT a part of the solution)
IF OBJECT_ID('tempdb..#JBMTest') IS NOT NULL
DROP TABLE #JBMTest
;
--===== Create and populate a test table on the fly (this is NOT a part of the solution).
SELECT *
INTO #JBMTest
FROM (
SELECT 1,'1,10,100,1000,10000,100000,1000000' UNION ALL
SELECT 2,'2000000,200000,20000,2000,200,20,2' UNION ALL
SELECT 3, 'This,is,a,test' UNION ALL
SELECT 4, 'and so is this' UNION ALL
SELECT 5, 'This, too (no pun intended)'
) d (SomeID,SomeValue)
;
GO
--===== Split the CSV column for the whole table using CROSS APPLY (this is the solution)
SELECT test.SomeID, split.ItemNumber, split.Item
FROM #JBMTest test
CROSS APPLY
(
SELECT ItemNumber, Item
FROM dbo.DelimitedSplit8k(test.SomeValue,',')
) split
;
---------------------------------------------------------------------------------------------------
Notes:
1. Optimized for VARCHAR(7999) or less. No testing or error reporting for truncation at 7999
characters is done.
2. Optimized for single character delimiter. Multi-character delimiters should be resolved
externally from this function.
3. Optimized for use with CROSS APPLY.
4. Does not "trim" elements just in case leading or trailing blanks are intended.
5. If you don't know how a Tally table can be used to replace loops, please see the following...
http://www.sqlservercentral.com/articles/T-SQL/62867/
6. Changing this function to use VARCHAR(MAX) will cause it to run twice as slow. It's just the
nature of VARCHAR(MAX) whether it fits in-row or not.
7. Multi-machine testing for the method of using UNPIVOT instead of 10 SELECT/UNION ALLs shows
that the UNPIVOT method is quite machine dependent and can slow things down quite a bit.
8. Performance testing shows using "TOP" for the limiting criteria of "N" is actually
slower and slightly more CPU intensive than the traditional WHERE N < LEN(@pString) + 2.
9. Performance testing shows using ORDER BY (SELECT x) where "x" is anything is actually
slower and slightly more CPU intensive than the traditional ORDER BY (SELECT N).

Credits:
This code is the product of many people's efforts including but not limited to the following:
cteTally concept originally by Itzik Ben-Gan and "decimalized" by Lynn Pettis (and others) for a
bit of extra speed and finally redacted by Jeff Moden for a different slant on readability and
compactness. Hats off to Paul White for his simple explanations of CROSS APPLY. Finally,
special thanks to Erland Sommarskog for his tireless efforts to help people understand
what you can actually do with T-SQL. I also thank whoever wrote the first article I ever saw
on "numbers tables" which is located at the following URL ...
http://sqlserver2000.databases.aspfaq.com/why-should-i-consider-using-an-auxiliary-numbers-table.html

Revision History:
Rev 00 - 20 Jan 2010 - Concept: Lynn Pettis and others.
Redaction/Implementation: Jeff Moden
- Base 10 redaction and reduction for CTE. (Total rewrite)

Rev 01 - 13 Mar 2010 - Jeff Moden
- Removed one additional concatenation and one subtraction from the SUBSTRING in the
SELECT List for that tiny bit of extra speed.

Rev 02 - 14 Apr 2010 - Jeff Moden
- No code changes. Added CROSS APPLY usage example to the header, some additional credits,
and extra documentation.

Rev 03 - 18 Apr 2010 - Jeff Moden
- No code changes. Added notes 7, 8, and 9 about certain "optimizations" that don't
actually work for this type of function.

Rev 04 - 29 Jun 2010 - Jeff Moden
- Added WITH SCHEMABINDING thanks to a note by Paul White. This prevents an unnecessary
"Table Spool" when the function is used in an UPDATE statement even though the function
makes no external references.
***************************************************************************************************/
--===== Define I/O parameters
(
@pString VARCHAR(7999),
@pDelimiter CHAR(1)
)
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN
--===== "Inline" CTE Driven "Tally Table” produces values up to
-- 10,000... enough to cover VARCHAR(8000)
WITH
E1(N) AS ( --=== Create Ten 1's
SELECT 1 UNION ALL SELECT 1 UNION ALL
SELECT 1 UNION ALL SELECT 1 UNION ALL
SELECT 1 UNION ALL SELECT 1 UNION ALL
SELECT 1 UNION ALL SELECT 1 UNION ALL
SELECT 1 UNION ALL SELECT 1 --10
),
E2(N) AS (SELECT 1 FROM E1 a, E1 b), --100
E4(N) AS (SELECT 1 FROM E2 a, E2 b), --10,000
cteTally(N) AS (SELECT ROW_NUMBER() OVER (ORDER BY (SELECT N)) FROM E4)
--===== Do the split
SELECT ROW_NUMBER() OVER (ORDER BY N) AS ItemNumber,
SUBSTRING(@pString, N, CHARINDEX(@pDelimiter, @pString + @pDelimiter, N) - N) AS Item
FROM cteTally
WHERE N <= LEN(@pString)
AND SUBSTRING(@pDelimiter + @pString, N, 1) = @pDelimiter
;
GO




Post #969411
Posted Monday, August 16, 2010 9:53 AM
Forum Newbie


Group: General Forum Members
Last Login: Wednesday, May 11, 2011 12:24 PM
Points: 5, Visits: 16
Thanks, Jeff, for your help! The code is working pretty well. The only issue I'm having is getting one parameter to be dependent on the other in a stored proc that populates a drop-down list in Reporting Services.
Post #969909
Posted Monday, August 16, 2010 8:20 PM


SSC-Dedicated


Group: General Forum Members
Last Login: Yesterday @ 10:33 PM
Points: 35,259, Visits: 31,740
gary.bland (8/16/2010)
Thanks, Jeff, for your help! The code is working pretty well. The only issue I'm having is getting one parameter to be dependent on the other in a stored proc that populates a drop-down list in Reporting Services.


Thanks for the feedback, Gary. I just don't know what you mean by "getting one parameter to be dependent on the other" in this case without an example. And, remember, I admitted to not even knowing how to spell "SSRS" so I'll need a pretty good example of what you want the stored proc to do.


Post #970112
Posted Tuesday, August 17, 2010 11:09 AM
Forum Newbie


Group: General Forum Members
Last Login: Wednesday, May 11, 2011 12:24 PM
Points: 5, Visits: 16
What I am trying to accomplish is this: when I choose a parameter, I need to pass the result into another parameter in order to limit the number of available selections. An example would be State and City: selecting a State would then limit the selection of Cities.
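For anyone landing here later, the usual SSRS pattern for that is a cascading parameter: the City parameter's available-values dataset calls a sproc that takes the selected State values, along these lines (a minimal sketch; the table, column, and procedure names are made up, and it assumes a splitter such as DelimitedSplit8K is installed):

-- Hypothetical cascading-parameter sproc: returns only the cities for the
-- selected states. In SSRS, pass =Join(Parameters!State.Value, ",") as @StateList
-- and use this sproc as the dataset behind the City parameter's available values.
CREATE PROCEDURE dbo.usp_GetCitiesForStates
        @StateList VARCHAR(8000)     -- e.g. 'OH,TX' from the State parameter
AS
BEGIN
    SET NOCOUNT ON;

    SELECT DISTINCT c.CityName
      FROM dbo.City c
     WHERE c.StateCode IN (SELECT Item FROM dbo.DelimitedSplit8K(@StateList, ','))
     ORDER BY c.CityName;
END
GO

In the report, the State parameter just needs to be listed above City so SSRS refreshes the city list whenever the state selection changes.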
Post #970614