The Elusive Conditional WHERE Clause

  • MarbryHardin (4/19/2013)


    There is also the option of using a table type parameter as a container for variable and/or variable-length parameter lists.

    Build a type containing name-value pairs, populate it in code, and you can simply join directly to it, something like this.

    ...

    Unless we're also talking about a large paginated data set being returned, LINQ has made it much more convenient in many cases to handle sorting once the data is returned from SQL rather than jumping through hoops in the query to accommodate that.

    Marbry, that is definitely an interesting twist.

  • MarbryHardin (4/19/2013)


    There is also the option of using a table type parameter as a container for variable and/or variable-length parameter lists.

    Absolutely! And testing will show that shipping a TVP to the server will usually outperform shipping delimited strings or XML, and allow for a more intuitive coding experience as well, which is a nice bonus. SQL was built to work with tables.
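
    A minimal sketch of the pattern; the type, procedure, table, and column names here are all hypothetical:

    -- A reusable table type holding name/value pairs.
    CREATE TYPE dbo.NameValueList AS TABLE
    (
        ParamName  VARCHAR(50)  NOT NULL,
        ParamValue VARCHAR(100) NOT NULL
    );
    GO

    -- The procedure joins directly to the TVP instead of parsing a delimited string.
    CREATE PROCEDURE dbo.GetCustomersByRegion
        @Params dbo.NameValueList READONLY
    AS
    SELECT c.CustomerID, c.CustomerName
    FROM   dbo.Customer AS c
    JOIN   @Params AS p
           ON  p.ParamName  = 'Region'
           AND p.ParamValue = c.Region;
    GO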

    Unless we're also talking about a large paginated data set being returned, LINQ has made it much more convenient in many cases to handle sorting once the data is returned from SQL rather than jumping through hoops in the query to accommodate that.

    Sorting should be moved outward toward the presentation layer whenever possible. By and large, caching results and sorting in an application layer will almost certainly outperform a round trip to the data layer with a dynamic ORDER BY.

    There are no special teachers of virtue, because virtue is taught by the whole community.
    --Plato

  • Nice article! I like how you incorporated the tally table logic - "RBAR IS BAD". I like the holiday logic using MERGE, too. Someone already suggested incorporating WITH RECOMPILE - it is worth exploring, though I would not use it unless you have to.

  • Maybe this is a matter of personal style, but I believe the article could be improved by removing the specifics of your particular application of the technique. I was able to fast-forward through the middle chunk of your article, where you explained the use case, and pick up reading at the final couple of code examples; that was all I needed to understand your technique. If you feel that the details of your example are something others would like to read, I'd suggest splitting them into a separate article. Otherwise, I would just use a simpler, contrived example that demonstrates the conditional WHERE clause technique.

    Regardless, I appreciate you taking the time to share something with the rest of us. Though some have noted that this is not a new technique, there's always someone out there who has never thought of this before and may not have run across it in a book or article yet.

    Thanks.

  • John, could you post the actual execution plans resulting from the three major condition types listed in the article: @pCondition = 1-7 and 9, @pCondition = 8, and @pCondition = 10?

    Even using WITH RECOMPILE or OPTIMIZE FOR UNKNOWN, it still generates only a single execution plan to handle all of the variable mechanisms. In the example as written, where only one case can be true per execution, it might perform OK. However, if you have any situation where more than one case could be true (e.g. @pCondition1 = 'x' and @pCondition2 = 'y'), essentially none of them will be optimized correctly, which rules out good index selection and ultimately leaves the CPU to filter the different cases (as noted in the link to Gail's article posted by other people on this board).

    I think this is an excellent pattern when only one possible set of arguments can evaluate to true (though I would recommend the RECOMPILE hint). If you have multiple independent conditions in the WHERE clause, this pattern will not scale. The example should scale OK (WITH RECOMPILE) for all arguments because ultimately you have only one condition: the value of @pCondition.
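
    For reference, this is the general shape of the single-condition pattern under discussion, with the RECOMPILE hint added; the table and parameter names are made up:

    CREATE PROCEDURE dbo.SearchOrders
        @pCondition  INT,
        @pCustomerID INT  = NULL,
        @pOrderDate  DATE = NULL
    AS
    SELECT o.OrderID, o.CustomerID, o.OrderDate
    FROM   dbo.Orders AS o
    WHERE  (@pCondition = 1 AND o.CustomerID = @pCustomerID)
       OR  (@pCondition = 2 AND o.OrderDate  = @pOrderDate)
       OR  (@pCondition = 3)            -- no filter: return everything
    OPTION (RECOMPILE);                 -- fresh plan per call, dead branches pruned
    GO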

  • Knut Boehnert (4/18/2013)


    As long as the same column is always used, it is fine and won't need a recompile.

    As always, it depends. If the distribution of values in the tables is highly variable, you can get stung quite badly by parameter sniffing with code like this. At the very least it's probably worth an OPTIMIZE FOR hint, but if several columns were involved I'd probably lean towards RECOMPILE instead.
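
    Both hints are statement-level; a minimal sketch, assuming a hypothetical dbo.Orders table:

    -- Plan built from average density instead of the sniffed parameter value:
    SELECT OrderID, OrderDate
    FROM   dbo.Orders
    WHERE  CustomerID = @pCustomerID
    OPTION (OPTIMIZE FOR (@pCustomerID UNKNOWN));

    -- With several optional columns, pay the compile cost for a tailored plan instead:
    SELECT OrderID, OrderDate
    FROM   dbo.Orders
    WHERE  (@pCustomerID IS NULL OR CustomerID = @pCustomerID)
       AND (@pOrderDate  IS NULL OR OrderDate  = @pOrderDate)
    OPTION (RECOMPILE);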

  • Thanks to everyone who has commented. I am sorry that it has taken me so long to respond, but I have been a bit busy this morning.

    When I wrote the article two years ago, I wasn't aware of the RECOMPILE option, and I would often re-declare parameters as local variables, passing the parameter values into the locals. I forgot to do that in the article, though. :w00t:
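
    For anyone who hasn't seen that older workaround, it looks something like this (table and column names are made up):

    CREATE PROCEDURE dbo.GetOrdersForCustomer
        @pCustomerID INT
    AS
    -- Copy the parameter into a local variable; the optimizer cannot sniff the
    -- local's value, so the plan is built from average statistics instead.
    DECLARE @CustomerID INT = @pCustomerID;

    SELECT OrderID, OrderDate
    FROM   dbo.Orders
    WHERE  CustomerID = @CustomerID;
    GO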

    I do agree with using RECOMPILE, however.

    John Hick

  • craig 81366 (4/18/2013)


    John Hick-456673 (4/18/2013)


    I have tested it against large tables (1 billion plus rows) and performance is much better than with dynamic SQL.

    I'm going to be blunt here.

    I find that extremely hard to believe.

    +1

    Dynamic SQL is difficult to beat in this case.

    I would go so far as to say that suggesting this technique is bad advice.
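
    For comparison, a sketch of the dynamic SQL approach (names are hypothetical); only the predicates that are actually needed get built, and each distinct combination of filters gets its own optimized, cached plan:

    DECLARE @sql NVARCHAR(MAX) = N'
        SELECT OrderID, CustomerID, OrderDate
        FROM   dbo.Orders
        WHERE  1 = 1';

    IF @pCustomerID IS NOT NULL
        SET @sql += N' AND CustomerID = @pCustomerID';
    IF @pOrderDate IS NOT NULL
        SET @sql += N' AND OrderDate = @pOrderDate';

    -- Parameterized execution: safe from injection and plan-cache friendly.
    EXEC sys.sp_executesql
         @sql,
         N'@pCustomerID INT, @pOrderDate DATE',
         @pCustomerID, @pOrderDate;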

    -- Gianluca Sartori

  • spaghettidba (4/17/2015)


    craig 81366 (4/18/2013)


    John Hick-456673 (4/18/2013)


    I have tested it against large tables (1 billion plus rows) and performance is much better than with dynamic SQL.

    I'm going to be blunt here.

    I find that extremely hard to believe.

    +1

    Dynamic SQL is difficult to beat in this case.

    I would go so far as to say that suggesting this technique is bad advice.

    You are two years behind the eight-ball, there. But, thank you for playing. 😀

  • For dynamic ordering, use the same approach but make the field to be ordered by a CASE expression.
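
    For example (hypothetical table and columns), with one CASE expression per data type so nothing gets implicitly converted:

    SELECT LastName, FirstName, HireDate
    FROM   dbo.Employee
    ORDER BY
        CASE WHEN @SortColumn = 'LastName' THEN LastName END,   -- strings sort here
        CASE WHEN @SortColumn = 'HireDate' THEN HireDate END;   -- dates sort here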


    Kindest Regards,

    Martin

  • John Hick-456673 (4/17/2015)


    spaghettidba (4/17/2015)


    craig 81366 (4/18/2013)


    John Hick-456673 (4/18/2013)


    I have tested it against large tables (1 billion plus rows) and performance is much better than with dynamic SQL.

    I'm going to be blunt here.

    I find that extremely hard to believe.

    +1

    Dynamic SQL is difficult to beat in this case.

    I would go so far as to say that suggesting this technique is bad advice.

    You are two years behind the eight-ball, there. But, thank you for playing. 😀

    You might have missed that your article is featured on the main page of SQL Server Central.

    Still bad advice, even two years later.

    -- Gianluca Sartori

  • This link is a powerful explanation of why a single query with multiple kinds of functionality gets a single plan, and a demonstration that such a plan can have dramatically degraded performance.

    http://www.sommarskog.se/dyn-search.html

  • spaghettidba (4/18/2015)


    John Hick-456673 (4/17/2015)


    spaghettidba (4/17/2015)


    craig 81366 (4/18/2013)


    John Hick-456673 (4/18/2013)


    I have tested it against large tables (1 billion plus rows) and performance is much better than with dynamic SQL.

    I'm going to be blunt here.

    I find that extremely hard to believe.

    +1

    Dynamic SQL is difficult to beat in this case.

    I would go so far as to say that suggesting this technique is bad advice.

    You are two years behind the eight-ball, there. But, thank you for playing. 😀

    You might have missed that your article is featured on the main page of SQL Server Central.

    Still bad advice, even two years later.

    That might be - but after the vicious attacks from the first go-round, I am not feeling particularly merciful. Therefore, I presume that you have never written for SSC. I was informed a few weeks ago that the article was to be republished. I braced myself for another round of flames. Surprisingly, there have been only a couple of you who have had the bad grace to treat this as a first publishing - and as though I am maliciously leading people astray.

    I had hoped that this article, as well as myself, could just disappear and be forgotten.

    But, maybe I should embrace triangular joins and cursors - no, wait! NESTED CURSORS with extraneous function calls within! THAT'S MY NEW PARADIGM! RBAR, HERE I COME!

    HEY, EVERYBODY! THIS IS HOW YOU CAN BRING DOWN A SYSTEM! JUST LISTEN TO ME!

    Really, is this what you want? Someone writes an article in the spirit of trying to help and the community wants to burn him at the stake? Really? OK, I get it. Bye!

    jhick signing out of SQL Server Central, permanently. I'm done. I used to sign in at least a couple of times a week, but this was the last.

    Thanks for all of your USEFUL AND CONSTRUCTIVE *%$*^*** COMMENTS

  • John, the decision is up to you, but please consider that you got good feedback on this article in general. I'm sorry that you feel you've been mistreated. My comments, and the other ones that pointed out that dynamic SQL is faster, did not mean that you didn't do a good job. They meant just what was written: dynamic SQL is faster.

    A personal attack is a different thing, and you probably know it. You're asking for constructive feedback: pointing out better alternatives IS constructive feedback.

    Whenever you publish something, you have to be prepared to respond to comments and criticism. It's part of the game. I wrote three articles for SSC, and I know it. If you can't stand anything but praise for your work, I think the problem is not with the critics.

    -- Gianluca Sartori

  • There's a really easy way to settle all of this, folks. All of you know how to generate a very large table of constrained random data very quickly using nothing but code. It certainly doesn't need to be a billion-row table, but someone should build a test table and prove it with code instead of guessing. Only the code knows for sure.
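
    For example, a million rows of constrained random data can be generated in seconds by cross joining a system view against itself (column names and value ranges here are arbitrary):

    -- Build 1,000,000 rows of constrained random test data on the fly.
    SELECT TOP (1000000)
           SomeID     = IDENTITY(INT, 1, 1),
           CustomerID = ABS(CHECKSUM(NEWID())) % 50000 + 1,          -- 1 to 50,000
           OrderDate  = DATEADD(dd, ABS(CHECKSUM(NEWID())) % 3653,   -- ~10 years
                                '2010-01-01'),
           Amount     = CAST(RAND(CHECKSUM(NEWID())) * 100.0 AS DECIMAL(9,2))
    INTO   dbo.TestOrders
    FROM       sys.all_columns AS ac1
    CROSS JOIN sys.all_columns AS ac2;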

    As a bit of a sidebar, and to be totally honest, all articles of this nature should include a performance test, even if the author makes no claims of performance, simply because of the volume of data that the method(s) in such articles could be played against.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
        Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)
