CASE statement versus dynamic query

  • I have a table with 100 columns, and I want to be able to sort on any of them. My front end decides which column is used for sorting. There are more than 500,000 records in the table.

    My stored procedure has an input parameter that decides which column to use for sorting. Now I have two options:

    1. Build the SQL query dynamically based on the input parameter.

    2. Build the SQL query using a CASE statement.

    Is there a third way? Also, which one would really work better? With a CASE statement the stored procedure gets a cached plan, but it would be harder to maintain.
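    The two options I am weighing look roughly like this (sketch only; dbo.MyTable and the column names are just placeholders for my real table):

        -- Option 2: CASE in the ORDER BY (single procedure plan, ~100 WHEN branches to maintain)
        CREATE PROCEDURE dbo.GetRows_Case
            @SortColumn sysname
        AS
        SELECT *
        FROM dbo.MyTable
        ORDER BY CASE @SortColumn
                     WHEN 'Column1' THEN Column1
                     WHEN 'Column2' THEN Column2
                     -- ...98 more WHEN branches...
                     -- (mixed data types in the branches get implicitly converted,
                     --  which can change the sort behaviour)
                 END;
        GO

        -- Option 1: build the statement dynamically (easy to maintain, one statement per sort column)
        CREATE PROCEDURE dbo.GetRows_Dynamic
            @SortColumn sysname
        AS
        BEGIN
            DECLARE @sql nvarchar(max);
            SET @sql = N'SELECT * FROM dbo.MyTable ORDER BY ' + QUOTENAME(@SortColumn) + N';';
            EXEC (@sql);
        END;
        GO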

    Thanks,

    Ramesh.

  • Personally, I'd opt for the Dynamic SQL. It's going to be easy to maintain and it's going to be just as fast or faster than anything else because there's probably no chance of reusing an execution plan here, anyway.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • I find this an interesting question, since I have read both sides of the sorting debate: one side says sorting should be done in the app, the other says it should be done in SQL. In your case, with a dynamic sort, why not do it in the app? Then you get a reusable plan, and the sorting is done in memory on the client. I am definitely interested in hearing other opinions on this.

  • I have to agree with Jack on this one. My premise was based only on whether to use CASE or dynamic SQL. The real key is that sorting is very expensive... send the unsorted data to the client and let the client do some of the work by sorting it.

    --Jeff Moden

  • It's always good when Jeff agrees with you on something. If you cannot have the sorting done in the app, then dynamic SQL is the way to go. I like to use sp_executesql when I do have to use dynamic SQL.
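    Something along these lines, for example (table, column and parameter names are made up for the sketch): the filter value goes in as a real parameter, and only the ORDER BY text changes per sort column.

        CREATE PROCEDURE dbo.GetRowsSorted
            @SortColumn sysname,
            @StatusID   int
        AS
        BEGIN
            DECLARE @sql nvarchar(max);

            SET @sql = N'SELECT * FROM dbo.MyTable '
                     + N'WHERE StatusID = @StatusID '
                     + N'ORDER BY ' + QUOTENAME(@SortColumn) + N';';

            -- One cached plan per distinct sort column; @StatusID stays a parameter
            EXEC sys.sp_executesql
                 @sql,
                 N'@StatusID int',
                 @StatusID = @StatusID;
        END;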

  • (Not disagreeing with any of the above posts - just modulating the answer)

    Just keep in mind that sorting at the client should be restricted to SMALL result sets, especially if the sorting is used in combination with a TOP x predicate. Anything over a certain size will likely mean a LARGE amount of "unneeded data" being sent to the client, a large sorting effort on the client's side, and, in the end, most of the transmitted data being thrown away by the client.
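    For instance, if the grid only ever needs the first page of rows for the chosen sort order, something like this (names are placeholders) keeps the sort on the server and the other ~499,950 rows off the wire; sorting the same request at the client would mean shipping all 500,000 rows first:

        SELECT TOP (50) *
        FROM dbo.MyTable
        ORDER BY Column7;   -- whichever column the user picked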

    ----------------------------------------------------------------------------------
    Your lack of planning does not constitute an emergency on my part...unless you're my manager...or a director and above...or a really loud-spoken end-user..All right - what was my emergency again?

  • Matt has a good point as well. I was assuming (yes, I know what "assume" stands for) that all the columns were being returned and that a WHERE clause was being applied so that only the rows needed were returned. If you are also dynamically determining the select list based on the sort column selected, then you should use dynamic SQL, and I would even say that the query should be built in the app and passed to SQL Server - and I rarely say that! Still use the command object and take appropriate measures to protect against SQL injection.

    This also leads to two questions: what are the business reasons for needing to sort on any one of 100 columns, and do you need to return all 100 columns?
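    If column names do end up spliced into the statement on the server side, one common guard (sketch only, names invented) is to check them against the catalog before concatenating, and still wrap them in QUOTENAME():

        DECLARE @SortColumn sysname;
        DECLARE @sql        nvarchar(max);

        SET @SortColumn = N'Column7';   -- value coming in from the app

        IF NOT EXISTS (SELECT 1
                       FROM sys.columns
                       WHERE object_id = OBJECT_ID(N'dbo.MyTable')
                         AND name = @SortColumn)
        BEGIN
            RAISERROR(N'Unknown sort column.', 16, 1);
            RETURN;
        END;

        SET @sql = N'SELECT * FROM dbo.MyTable ORDER BY ' + QUOTENAME(@SortColumn) + N';';
        EXEC sys.sp_executesql @sql;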

  • I thought some more information about the application would help you all. It is a status-flow-based .NET web application. We use a GridView to display the data in the web page. It is a business requirement to see all the fields (close to 100) from the file we load into the system in the grid, with the ability to sort on any of them (end users use it for reviewing the data). The web application was designed in an object-oriented way, and it was decided in the initial phases of the design to use a custom collection and a DataReader instead of a DataSet. We had performance issues loading all the records into the grid, tested with 30,000 records on a P4 with 1 GB of RAM. The application might perform well when tested on the server, but we did not want to take the chance, since it will have 500 concurrent users at a time. So it was decided to go with SQL Server-based sorting instead of web-server-based sorting.

  • Oh, I agree... the suggestion of sorting at the client means that you're sorting only the desired result set and no extra/unwanted data. Don't wanna "pop the pipe". 😀

    --Jeff Moden

  • Yeah, sorting that many records on the client, especially since you are using a data reader and putting the results into a custom collection, would stink. Now you need to be really smart about what you use for your clustered index. :w00t:

  • As Jack mentioned earlier, do use sp_executesql in your dynamic SQL for efficient reuse of execution plans.

    Even with the 100 possible columns involved here, I'm sure a handful of favorites will account for the bulk of the sort choices, and you will want those execution plans to be reused as much as possible.

    __________________________________________________________________________________
    SQL Server 2016 Columnstore Index Enhancements - System Views for Disk-Based Tables
    Persisting SQL Server Index-Usage Statistics with MERGE
    Turbocharge Your Database Maintenance With Service Broker: Part 2

  • Marios Philippopoulos (2/26/2008)


    As Jack mentioned earlier, do use sp_executesql in your dynamic SQL for efficient reuse of execution plans.

    Even with the 100 possible columns involved here, I'm sure a handful of favorites will account for the bulk of the sort choices, and you will want those execution plans to be reused as much as possible.

    I'd have to say, it depends... I've seen cases where reusing the execution plan gives horrible performance because the parameter change calls for a different access method than the cached plan uses. I recently ran into that very problem, where a recompile gets the right plan: a merge join over a million rows in milliseconds, as opposed to a half-hour run using a loop join.

    sp_executesql isn't the panacea some think it is... it's sometimes better to let a recompile occur.
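    For example (just a sketch with made-up names), one way to let that happen is to tack OPTION (RECOMPILE) onto the dynamic statement so each execution gets a plan built for the actual parameter values:

        DECLARE @SortColumn sysname;
        DECLARE @StatusID   int;
        DECLARE @sql        nvarchar(max);

        SET @SortColumn = N'Column7';
        SET @StatusID   = 42;

        SET @sql = N'SELECT * FROM dbo.MyTable '
                 + N'WHERE StatusID = @StatusID '
                 + N'ORDER BY ' + QUOTENAME(@SortColumn)
                 + N' OPTION (RECOMPILE);';

        EXEC sys.sp_executesql @sql, N'@StatusID int', @StatusID = @StatusID;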

    --Jeff Moden

  • I'd have to agree that you'd probably be better off just telling it to recompile each time. In a case like this, unless your usage is heavily biased towards one specific set of sort columns, it will actually save you time, since the optimizer won't have to spend any time trying to figure out whether it can use the old plan (which it might try to use even when it's not a good fit for what is going on now).


  • Good points guys, thanks.


  • This might be out of scope, but at my last job I was asked to create a similar table structure to hold contact information. The requirement was to keep different contact schemas in the same table.

    I went with a different approach (it was not easy to get the boss's agreement :-)).

    Instead of a single table holding all the contact information, I implemented a structure of two tables: contact and contact_details. The contact_details table has three columns: contact_id, value and value_type_id.

    That allowed me to avoid problems like this sorting one, and it used the storage more efficiently.
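    A rough sketch of that layout (only contact_id, value and value_type_id come from the description above; the name column and the data types are just my guesses):

        CREATE TABLE dbo.contact
        (
            contact_id int IDENTITY(1,1) PRIMARY KEY,
            name       nvarchar(200) NOT NULL              -- assumed; not part of the original description
        );

        CREATE TABLE dbo.contact_details
        (
            contact_id    int           NOT NULL REFERENCES dbo.contact (contact_id),
            value_type_id int           NOT NULL,           -- e.g. phone, e-mail, address
            value         nvarchar(500) NOT NULL
        );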
