Multi-Select Parameters for Reporting Services

  • Comments posted here are about the content posted at http://www.sqlservercentral.com/columnists/ADavel/3138.asp

  • Have you tried storing the values in a table with that function of yours, and then writing the select statement with @parameter IN (SELECT parameter FROM @table) instead of a join, to see if that boosts performance?

    And could you publish the code of said function?

    Thanks in advance

  • Adriaan,

    I'm rather surprised at the blow-out to ten minutes for that basic shred-the-tokenised-string-to-a-table approach. I've done this several times as a way of dealing with SSRS multi-select, and found it fine for performance (although really it's a bit crude and hacky).

    Was there anything obvious in the query plan (of the poorly performing query) to indicate why this was such a bad thing to do?

  • Thanks for the post Adriaan. It would be great if you could paste some sample code here!

    Nick

  • No need for dynamic SQL; just use something like this:

    WHERE CHARINDEX(',' + field1 + ',', ',' + @param1 + ',') > 0

    ... where @param1 is a string of comma-delimited values.
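
    For context, a minimal sketch of how that pattern might sit in a full query; the Orders table and Status column here are hypothetical, and @param1 is assumed to arrive from the report as something like 'Open,Closed':

    -- Wrapping both sides in commas prevents partial matches ('Pen' will not match 'Pending'),
    -- but the predicate is not sargable, so expect a scan on large tables.
    DECLARE @param1 VARCHAR(1000)
    SET @param1 = 'Open,Closed'

    SELECT o.OrderID, o.Status
    FROM dbo.Orders AS o
    WHERE CHARINDEX(',' + o.Status + ',', ',' + @param1 + ',') > 0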

  • If you use parameterized dynamic SQL in the stored procedure, performance shouldn't differ from running the query within the report.

    What I'd try is to dump the SSRS multi-select parameters, handle them with my own code, and pass an XML document full of parameter values to the procedure (SQL 2005).
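
    In case it helps, a minimal sketch of that XML idea (not the poster's actual code; the @CustomerXml name, the element layout, and the customers table are assumptions), using the SQL Server 2005 nodes()/value() methods:

    -- The calling code is assumed to build XML such as '<ids><id>1</id><id>7</id></ids>'
    DECLARE @CustomerXml XML
    SET @CustomerXml = N'<ids><id>1</id><id>7</id><id>42</id></ids>'

    SELECT c.*
    FROM dbo.customers AS c
    WHERE c.cust_id IN
    (
        SELECT x.id.value('.', 'INT')              -- shred each <id> element to an INT
        FROM @CustomerXml.nodes('/ids/id') AS x(id)
    )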

  • I would love to see a code example from Adriaan's article as well.

  • I approach it a little differently: I use a function to split the parameter value into its individual pieces.

    I also provide an "All" option, with -1 being the value passed to the stored procedure.

    My stored procedure has this line:

    AND (s.ORDERCODE IN (SELECT Item FROM dbo.Split(@OrderCode, ',')) OR '-1' IN (@OrderCode))

    and the Split function looks like this:

    SET ANSI_NULLS ON
    GO
    SET QUOTED_IDENTIFIER ON
    GO
    CREATE FUNCTION [dbo].[Split]
    (
        @ItemList  NVARCHAR(4000),
        @delimiter CHAR(1)
    )
    RETURNS @IDTable TABLE (Item VARCHAR(50))
    AS
    BEGIN
        DECLARE @tempItemList NVARCHAR(4000)
        SET @tempItemList = @ItemList

        DECLARE @i INT
        DECLARE @Item NVARCHAR(4000)

        -- strip spaces so 'a, b, c' behaves the same as 'a,b,c'
        SET @tempItemList = REPLACE(@tempItemList, ' ', '')
        SET @i = CHARINDEX(@delimiter, @tempItemList)

        WHILE (LEN(@tempItemList) > 0)
        BEGIN
            -- when no delimiter is left, the remainder is the last item
            IF @i = 0
                SET @Item = @tempItemList
            ELSE
                SET @Item = LEFT(@tempItemList, @i - 1)

            INSERT INTO @IDTable(Item) VALUES (@Item)

            IF @i = 0
                SET @tempItemList = ''
            ELSE
                SET @tempItemList = RIGHT(@tempItemList, LEN(@tempItemList) - @i)

            SET @i = CHARINDEX(@delimiter, @tempItemList)
        END

        RETURN
    END
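
    For anyone who wants to try it, a quick sanity check of the function (the order codes below are made up):

    -- Hypothetical values; returns one row per code
    SELECT Item FROM dbo.Split('ORD1,ORD2,ORD3', ',')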

  • We faced this problem a while back at our company as well, and one of our DBAs came up with the following solution:

    WHERE (',' + @CommaDelimitedList + ',' LIKE '%,' + TableColumnToCompareTo + ',%' OR @CommaDelimitedList = '')

    Just thought I'd share with everyone how we handle this situation.
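
    A brief sketch of how that predicate might look in a report query; the Orders table here is hypothetical, and passing an empty string returns all rows:

    -- @CommaDelimitedList is assumed to come from the report, e.g. 'Open,Closed' or '' for everything
    DECLARE @CommaDelimitedList VARCHAR(1000)
    SET @CommaDelimitedList = 'Open,Closed'

    SELECT o.OrderID, o.Status
    FROM dbo.Orders AS o
    WHERE (',' + @CommaDelimitedList + ',' LIKE '%,' + o.Status + ',%'
           OR @CommaDelimitedList = '')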

  • Adriaan,

    When you created the temp table, did you include an index on the join column? The performance difference between an unindexed table and an indexed one can be very big in this case.

    Example: the customer table has the following structure:

    create table customer (
      id     int primary key
    , state  int
    , LName  varchar(100)
    , rname  varchar(100))

    and suppose that table has an index on the state column:

    create index IDX_Customer_State on Customer (state)

    So the temp table is declared with the following code:

    declare @TempJoinTable table (IdState int primary key) -- note the primary key: a table variable can be indexed this way like any other table

    The next step is to populate the temp table with this code:

    -- @FilterValue contains the comma-separated filter values
    DECLARE @Posicion INT
    DECLARE @Value    VARCHAR(100)

    WHILE (LEN(@FilterValue) > 0)
    BEGIN
         SET @Posicion = PATINDEX('%,%', @FilterValue)
         IF (@Posicion = 0)
         BEGIN
              SET @Value = @FilterValue
              SET @FilterValue = ''
         END
         ELSE
         BEGIN
              SET @Value = SUBSTRING(@FilterValue, 1, @Posicion - 1)
              SET @FilterValue = SUBSTRING(@FilterValue, @Posicion + 1, LEN(@FilterValue))
         END

         INSERT INTO @TempJoinTable VALUES (@Value)
    END

    Finally, just join the tables and compare the execution time:

    SELECT *
    FROM customer CUS
         INNER JOIN @TempJoinTable TT
             ON CUS.state = TT.IdState

    Sebastián Rodríguez R.
    Licenciado en Informática de Gestión
    sebas_rod@hotmail.com
    (569) - 98626471

  • Hello,

    I have been using all these solutions and finally settled on a CLR function that I found on the Internet.

    It works great for me.

    Regards,

    Philippe

    using System;
    using System.Data;
    using System.Data.SqlClient;
    using System.Data.SqlTypes;
    using Microsoft.SqlServer.Server;

    public partial class UserDefinedFunctions
    {
        [Microsoft.SqlServer.Server.SqlFunction(IsDeterministic = true, IsPrecise = true, TableDefinition = "SQLStr nvarchar(4000)", FillRowMethodName = "FillSplitCommaDelimitedStringToStr")]
        public static System.Collections.IEnumerable clrfn_SplitCommaDelimitedString(SqlString str)
        {
            string x = str.Value;
            if (!string.IsNullOrEmpty(x))
            {
                return x.Split(',');
            }
            else
            {
                return null;
            }
        }

        private static void FillSplitCommaDelimitedStringToStr(object obj, out SqlString str)
        {
            if (obj != null)
                str = (String)(obj);
            else
                str = String.Empty;
        }
    }

    I call it like this:

    WHERE (@PTI2 = '##' OR b.PTI2_Cd IN (SELECT LTRIM(SQLStr) FROM ONGlobals.dbo.clrfn_SplitCommaDelimitedString(@Pti2)))

    The ## is used for cases where we want all values.

    BI Guy
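
    In case it helps anyone trying this, a rough sketch of how such a CLR assembly might be registered; the assembly name and DLL path below are placeholders, not from Philippe's post:

    -- CLR integration must be enabled once per instance
    EXEC sp_configure 'clr enabled', 1
    RECONFIGURE
    GO
    -- Register the compiled assembly (name and path are hypothetical)
    CREATE ASSEMBLY SplitStringAssembly
    FROM 'C:\CLR\SplitString.dll'
    WITH PERMISSION_SET = SAFE
    GO
    -- Expose the CLR method as a table-valued function
    CREATE FUNCTION dbo.clrfn_SplitCommaDelimitedString (@str NVARCHAR(4000))
    RETURNS TABLE (SQLStr NVARCHAR(4000))
    AS EXTERNAL NAME SplitStringAssembly.UserDefinedFunctions.clrfn_SplitCommaDelimitedString
    GO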

  • If you want to use an IN, try this:

    WHERE RealTableColumn IN (SELECT TempTableColumn FROM TempTable)

    EXISTS can be fast too, but I'm not sure about this scenario.

    WHERE EXISTS (SELECT TempTableColumn FROM TempTable WHERE TempTableColumn = RealTableColumn)

    However a JOIN on an indexed TempTableField should be as fast as anything in my experience.

    I would normally use the IN option as it is clearer.
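
    Since the JOIN variant isn't written out above, here is a quick sketch of it; RealTable is a placeholder name, following the column names already used in this post:

    SELECT r.*
    FROM RealTable AS r
         INNER JOIN TempTable AS t
             ON t.TempTableColumn = r.RealTableColumn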

  • If you want to avoid dynamic SQL, you could use a UDF that converts the comma-delimited string to a table-type variable, and then use IN as usual.

    SELECT * FROM customers

    WHERE ISNULL(@cust_id_multiple, '') = '' OR customers.cust_id IN (SELECT value FROM dbo.LIST_TO_TABLE(@cust_id_multiple))

     

    Below is a sample UDF that does the trick.

    -- Converts a list of values to a table. Used to avoid dynamic SQL statements when an IN clause is used as a filter expression.
    -- @sList: list of numeric values in comma-separated form, to be converted to a table
    -- Return value: a table-type variable containing one column, filled with the numeric values from @sList
    CREATE FUNCTION LIST_TO_TABLE (@sList AS VARCHAR(255))
    RETURNS @retTable TABLE (value VARCHAR(50))
    AS
    BEGIN
        IF @sList IS NULL RETURN

        -- prepare the input list: remove the IN keyword and parentheses
        SET @sList = REPLACE(@sList, 'IN(', '')
        SET @sList = REPLACE(@sList, 'IN (', '')
        SET @sList = REPLACE(@sList, '(', '')
        SET @sList = REPLACE(@sList, ')', '')
        SET @sList = LTRIM(RTRIM(@sList))

        -- scroll through the values in @sList and populate the table
        DECLARE @index INT, @Delimiter CHAR(1)
        DECLARE @Result VARCHAR(255)
        SET @Delimiter = ','

        WHILE @sList <> ''
        BEGIN
            SET @index = CHARINDEX(@Delimiter, @sList)
            IF @index <> 0
            BEGIN
                SET @Result = LEFT(@sList, @index - 1)
                SET @sList = SUBSTRING(@sList, @index + 1, LEN(@sList))
            END
            ELSE
            BEGIN
                SET @Result = @sList
                SET @sList = ''
            END
            INSERT @retTable SELECT @Result
        END

        RETURN
    END
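
    And a quick standalone call to check the output (the values are made up):

    -- Returns three rows: 101, 102 and 103
    SELECT value FROM dbo.LIST_TO_TABLE('101,102,103')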

     

  • Hi Jose,

    Yes, we tried splitting the string into records and inserting them into a temp table, and it gave some performance boost, but the cost of splitting the string and inserting into a temp table made overall performance worse than using dynamic SQL...

  • Hi SDM,

    It could well be that my results were affected by our table size and parameter length, as it was a fairly large table with many parameter options (strings) selected. I'm sure a smaller table and fewer parameters would give different results. We decided to stick with the dynamic SQL as the advantages outweigh the disadvantages.

    In the query plan, the computations stood out most (as expected) and the join was very inefficient, which is what I expected as well.
