Converting parameters to fix performance problems

  • mikeg13

    SSCommitted

    Points: 1851

We recently upgraded to SQL Server 2017 from SQL Server 2012. This includes a change from SSRS 2012 to Power BI Report Server 2017 (Sept 2019 release), and we are now having a weird parameter problem. I am not sure whether it is SSRS-related, Power BI RS-related, or database-related, and I am not sure how to troubleshoot which it is.

    Some reports are now taking 30 seconds or more to run and they used to take about 1 second.  They also take 1 second when I run the queries with the parameters in my SQL developer tool (TOAD).  According to the statistics, the time is all spent in the data retrieval portion, not in report processing or rendering.

I can fix the performance problem by converting the datatype of the parameter in the SQL used in the dataset. For instance, I have a parameter that holds integer values but is set to the default Text datatype. In the SQL I say WHERE @Parm = 1. The query works but takes forever (30 seconds). So I change the datatype of the report parameter to Integer. It still takes forever. It takes 1 second if I change the SQL to WHERE CONVERT(int, @Parm) = 1. The same behavior occurs when I am using text values in a text parameter: I have to CONVERT(varchar, @Parm) in order to use the parameter.
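    The workaround above, sketched against a hypothetical table (table and column names here are illustrative, not from the actual report):

    ```sql
    -- Slow: the report parameter arrives in a type that doesn't match the
    -- column, so SQL Server converts the COLUMN side and can't seek an index.
    SELECT OrderID, OrderDate            -- hypothetical columns
    FROM dbo.Orders                      -- hypothetical table
    WHERE StatusCode = @Parm;            -- @Parm supplied by the report

    -- Fast: converting the PARAMETER to match the column leaves the column
    -- untouched, so an index on StatusCode remains usable.
    SELECT OrderID, OrderDate
    FROM dbo.Orders
    WHERE StatusCode = CONVERT(varchar(10), @Parm);
    ```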

    Any ideas why this is happening or how to narrow the possibilities?

    Mike

  • jonathan.crawford

    SSCertifiable

    Points: 6577

I have had to pass the parameter into an internal variable that matches it exactly, and just assign the value over. That made a HUGE difference when I did it, although I have no explanation for why it should impact anything.

    Doesn't seem to happen all the time, thankfully.

    -------------------------------------------------------------------------------------------------------------------------------------
    Please follow Best Practices For Posting On Forums to receive quicker and higher quality responses

  • Jeffrey Williams

    SSC Guru

    Points: 88587

If @Parm is defined as varchar(nn), then try changing the comparison in the code to @Parm = '1' instead. The ideal solution is to make sure the parameter definition matches the column definition.

If the column being matched is defined as varchar(20) and your parameter is defined as varchar(100), then SQL Server has to implicitly convert the parameter to match the column. What makes it worse is data type precedence: if the column is defined as varchar and your parameter is defined as integer, then SQL Server will implicitly convert the column to integer, and now your index on that column cannot be used.

    The best fix for this is:

1. Ensure the data type of the parameter matches the column's data type
2. Ensure the report parameter is defined correctly for that data type
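    A minimal sketch of the precedence problem (table, column, and index are made up for illustration):

    ```sql
    -- Assume dbo.Accounts has a varchar(20) AccountCode column with an index.
    DECLARE @Parm int = 1;

    -- int outranks varchar in data type precedence, so the COLUMN is
    -- converted: effectively WHERE CONVERT(int, AccountCode) = @Parm,
    -- which forces a scan instead of a seek.
    SELECT AccountCode FROM dbo.Accounts WHERE AccountCode = @Parm;

    -- Declaring the parameter to match the column keeps the predicate
    -- SARGable, so an index seek is possible.
    DECLARE @ParmV varchar(20) = '1';
    SELECT AccountCode FROM dbo.Accounts WHERE AccountCode = @ParmV;
    ```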

    If you don't want to fix the reports - then fix the stored procedures.  If you must allow string values to be sent to the procedure - you can always create an internal variable that is converted to the appropriate data type.

    For example:

CREATE PROCEDURE dbo.MyProcedure
    @parm1 varchar(100)
AS

    DECLARE @iParm1 int = CAST(@parm1 AS int);

    SELECT ...
    FROM ...
    WHERE @iParm1 = 1;

    Jeffrey Williams
    Problems are opportunities brilliantly disguised as insurmountable obstacles.

    How to post questions to get better answers faster
    Managing Transaction Logs

  • mikeg13

    SSCommitted

    Points: 1851

I have heard of passing to an internal variable, but this is not using a stored procedure; it is just straight SQL from the RS dataset. And I have changed the RS parameter datatype to Integer and still had to CONVERT(int, @Parm) for good performance.

    And why would this work fine until we upgraded to PBIRS 2017?

  • Jeffrey Williams

    SSC Guru

    Points: 88587

    How exactly is the parameter being used in the code?  That is the missing piece here...

Why would it change? Because you upgraded from 2012, and there was a significant change in 2016 that affects cardinality estimates, which is probably why your code is affected. Prior to that change, an implicit conversion would not necessarily have caused a cardinality issue; after that change it can, and if the cardinality estimate is off, a less-than-optimal execution plan may be generated.
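    If the new cardinality estimator is the trigger, one way to test that theory is to run the dataset query under the legacy estimator via a query hint (available from SQL Server 2016 SP1 onward). The table and column names below are illustrative:

    ```sql
    -- Forces the pre-2014 cardinality estimator for this statement only,
    -- without changing the database compatibility level. If the query is
    -- fast again under this hint, the CE change is the likely culprit.
    SELECT OrderID, OrderDate
    FROM dbo.Orders
    WHERE StatusCode = @Parm
    OPTION (USE HINT ('FORCE_LEGACY_CARDINALITY_ESTIMATION'));
    ```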

There are also changes related to parameter sniffing that could be impacting your queries. We would need to see the actual code and the actual execution plan to tell whether any of these is happening.

    Jeffrey Williams
    Problems are opportunities brilliantly disguised as insurmountable obstacles.

    How to post questions to get better answers faster
    Managing Transaction Logs

  • mikeg13

    SSCommitted

    Points: 1851

I am not sure I can get the actual SQL, since I do not have access to SQL Profiler, but I am trying to work with IT to get that. I recently also found that it runs slowly in SSMS if I declare my parameter as nvarchar and then compare it to a varchar field. Someone said that SSRS defaults "text" parameters to nvarchar, so it is now doing an implicit conversion between nvarchar and varchar.
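    That SSMS repro can be written out like this (assuming a varchar column with an index; all names are illustrative):

    ```sql
    -- Slow: nvarchar outranks varchar in type precedence, so the column is
    -- implicitly converted (CONVERT_IMPLICIT on the column side -> scan).
    DECLARE @ParmN nvarchar(100) = N'ABC';
    SELECT CustomerCode FROM dbo.Customers WHERE CustomerCode = @ParmN;

    -- Fast: the variable matches the column's type, so the predicate stays
    -- SARGable and an index seek is possible.
    DECLARE @ParmV varchar(100) = 'ABC';
    SELECT CustomerCode FROM dbo.Customers WHERE CustomerCode = @ParmV;
    ```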

In my query, the predicate I can use to quickly change performance is a simple WHERE Field = @Field. It works fine as WHERE Field = CONVERT(varchar, @Field). I am not sure if that answers your question about how the parameter is being used in the code.

  • Grant Fritchey

    SSC Guru

    Points: 396603

    jonathan.crawford wrote:

    I have had to pass the parameter in to an internal variable that matches it exactly, and just assigns the value over. That made a HUGE difference when I did it, although I have no explanation for why that should impact anything.

    Doesn't seem to happen all the time, thankfully.

    Sounds like parameter sniffing might be causing problems. What you did by setting it to a local variable is remove the parameter sniffing and instead have it use an average value from the statistics of the column(s) involved in the WHERE clause.

    Take a look at the execution plans and compare them between using no local variable and using a local variable.

    By the way, if the statement recompiles for any reason, local variables can be sniffed (it's actually predicate push-down, but the effect is the same) and you may see intermittent problems from this.
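    A sketch of the comparison being suggested (procedure and table names are made up):

    ```sql
    -- Parameter version: the value is sniffed at compile time and the plan
    -- is built for that specific value's estimated row count.
    CREATE PROCEDURE dbo.GetOrders_Sniffed @Status varchar(10)
    AS
        SELECT OrderID FROM dbo.Orders WHERE StatusCode = @Status;
    GO

    -- Local-variable version: the optimizer cannot see the value at compile
    -- time, so it estimates from the column's average density instead.
    CREATE PROCEDURE dbo.GetOrders_Local @Status varchar(10)
    AS
        DECLARE @s varchar(10) = @Status;
        SELECT OrderID FROM dbo.Orders WHERE StatusCode = @s;
    GO

    -- Run both and compare the actual execution plans, paying attention to
    -- estimated vs. actual row counts on the WHERE clause.
    ```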

    ----------------------------------------------------
    The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood...
    Theodore Roosevelt

    The Scary DBA
    Author of: SQL Server 2017 Query Performance Tuning, 5th Edition and SQL Server Execution Plans, 3rd Edition
    Product Evangelist for Red Gate Software
