CASE statement versus dynamic Query
Marios Philippopoulos
Posted Tuesday, February 26, 2008 12:31 AM
As Jack mentioned earlier, do use sp_executesql in your dynamic SQL for efficient reuse of execution plans.
Even with the 100 possible columns involved here, I'm sure a handful of favorites will take the bulk of choices for sorting, and you will want these execution plans to be re-used as much as possible.
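
A minimal sketch of what that can look like (the table and column names are hypothetical, and @SortColumn is assumed to have been validated against a known column list before being concatenated):

DECLARE @SortColumn sysname;
DECLARE @MinID      int;
DECLARE @sql        nvarchar(max);

SET @SortColumn = N'LastName';   -- assumed, pre-validated sort choice
SET @MinID      = 100;           -- example search argument

-- The sort column can't be passed as an sp_executesql parameter, so it is
-- concatenated (wrapped in QUOTENAME); the search argument is a real parameter,
-- so each distinct ORDER BY text gets one cached, reusable plan.
SET @sql = N'SELECT ContactID, FirstName, LastName
             FROM dbo.Contacts
             WHERE ContactID >= @MinID
             ORDER BY ' + QUOTENAME(@SortColumn) + N';';

EXEC sys.sp_executesql
     @sql,
     N'@MinID int',
     @MinID = @MinID;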


__________________________________________________________________________________

Turbocharge Your Database Maintenance With Service Broker: Part 2
Turbocharge Your Database Maintenance With Service Broker: Part 1
Real-Time Tracking of Tempdb Utilization Through Reporting Services
Monitoring Database Blocking Through SCOM 2007 Custom Rules and Alerts
Preparing for the Unthinkable - a Disaster/Recovery Implementation
Post #460083
Jeff Moden
Posted Tuesday, February 26, 2008 7:11 AM
Marios Philippopoulos (2/26/2008)
As Jack mentioned earlier, do use sp_executesql in your dynamic SQL for efficient reuse of execution plans.
Even with the 100 possible columns involved here, I'm sure a handful of favorites will take the bulk of choices for sorting, and you will want these execution plans to be re-used as much as possible.


I'd have to say, it depends... I've seen cases where reuse of the execution plan gives horrible performance because of the change in selection method caused by a parameter change. I recently ran into that very problem, where a recompile produced the correct plan, using a merge join on a million rows in milliseconds, as opposed to a half-hour-long run using a loop join.

sp_ExecuteSQL isn't the panacea that some think it is... it's sometimes better to have a recompile occur.
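
As an illustration of forcing that recompile, a sketch (hypothetical table and column names) that appends OPTION (RECOMPILE) to the dynamic statement so each execution gets a plan built for its own parameter values:

DECLARE @FromDate datetime;
DECLARE @sql      nvarchar(max);

SET @FromDate = '20080101';      -- example parameter value

SET @sql = N'SELECT c.ContactID, c.LastName
             FROM dbo.Contacts AS c
             JOIN dbo.Orders   AS o ON o.ContactID = c.ContactID
             WHERE o.OrderDate >= @FromDate
             OPTION (RECOMPILE);';   -- build a fresh plan for these parameter values

EXEC sys.sp_executesql @sql, N'@FromDate datetime', @FromDate = @FromDate;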


--Jeff Moden
"RBAR is pronounced "ree-bar" and is a "Modenism" for "Row-By-Agonizing-Row".

First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column."

(play on words) "Just because you CAN do something in T-SQL, doesn't mean you SHOULDN'T." --22 Aug 2013

Helpful Links:
How to post code problems
How to post performance problems
Post #460221
Posted Tuesday, February 26, 2008 7:30 AM
I'd have to agree that you'd probably be better off just telling it to recompile each time. In a case like this, unless your usage is very biased towards one specific set of columns being ordered, it will actually save you time, since the optimizer won't have to spend any time figuring out whether it can use the old plan (which it might try to use even when it's not a good fit for what is going on now).
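
One way to ask for that recompile on every call, sketched against a hypothetical procedure name (WITH RECOMPILE covers the procedure's own plan; for the dynamic statement inside, the statement-level OPTION (RECOMPILE) hint mentioned above is the equivalent):

-- dbo.GetContactsSorted is a made-up procedure that builds the dynamic ORDER BY;
-- WITH RECOMPILE compiles this call from scratch instead of reusing or caching a plan.
EXEC dbo.GetContactsSorted @SortColumn = N'LastName' WITH RECOMPILE;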

----------------------------------------------------------------------------------
Your lack of planning does not constitute an emergency on my part... unless you're my manager... or a director and above... or a really loud-spoken end-user... All right - what was my emergency again?
Post #460232
Marios Philippopoulos
Posted Tuesday, February 26, 2008 11:41 AM
Good points guys, thanks.

__________________________________________________________________________________

Turbocharge Your Database Maintenance With Service Broker: Part 2
Turbocharge Your Database Maintenance With Service Broker: Part 1
Real-Time Tracking of Tempdb Utilization Through Reporting Services
Monitoring Database Blocking Through SCOM 2007 Custom Rules and Alerts
Preparing for the Unthinkable - a Disaster/Recovery Implementation
Post #460445
sergeyledenev
Posted Saturday, February 4, 2012 5:55 PM
This might be out of scope, but at my last job I was asked to create a similar table structure to hold contact information. The requirement was to keep different contact schemas in the same table.

I went with a different approach (it was not easy to get my boss's agreement).

Instead of a single table with all of the contact information, I implemented a structure of 2 tables: contact and contact_details. The contact_details table has 3 columns: contact_id, value and value_type_id.

That allowed me to avoid problems such as sorting, and it used storage more efficiently.
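
A rough sketch of that layout (the data types, keys and index are assumptions, not the actual DDL):

CREATE TABLE dbo.contact
(
    contact_id int IDENTITY(1,1) NOT NULL PRIMARY KEY
    -- plus whatever fixed attributes every contact actually shares
);

CREATE TABLE dbo.contact_details
(
    contact_id    int           NOT NULL REFERENCES dbo.contact (contact_id),
    value_type_id int           NOT NULL,  -- e.g. 1 = FirstName, 2 = LastName, 3 = PhoneNumber
    value         nvarchar(400) NOT NULL,
    CONSTRAINT PK_contact_details PRIMARY KEY (contact_id, value_type_id)
);

-- the "indexed value field" mentioned further down the thread
CREATE INDEX IX_contact_details_value
    ON dbo.contact_details (value);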
Post #1246945
Jeff Moden
Posted Saturday, February 4, 2012 11:56 PM
sergeyledenev (2/4/2012)
This might be out of scope, but at my last job I was asked to create a similar table structure to hold contact information. The requirement was to keep different contact schemas in the same table.

I went with a different approach (it was not easy to get my boss's agreement).

Instead of a single table with all of the contact information, I implemented a structure of 2 tables: contact and contact_details. The contact_details table has 3 columns: contact_id, value and value_type_id.

That allowed me to avoid problems such as sorting, and it used storage more efficiently.


Depending on the number of nullable "columns", I agree that's one way to use storage more efficiently. I don't understand how using such an EAV table would allow you to avoid sorting problems, though. In fact it would seem to exacerbate the problem a bit. For example, if the EAV contained a FirstName, LastName, and PhoneNumber, how would you sort based on LastName and FirstName?
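
For reference, with the assumed table names and value_type_id meanings from the sketch above, sorting by LastName and FirstName means pivoting the rows back into columns first, something like:

SELECT  c.contact_id,
        MAX(CASE WHEN d.value_type_id = 1 THEN d.value END) AS FirstName,  -- 1 = FirstName (assumed)
        MAX(CASE WHEN d.value_type_id = 2 THEN d.value END) AS LastName    -- 2 = LastName  (assumed)
FROM    dbo.contact AS c
JOIN    dbo.contact_details AS d
        ON d.contact_id = c.contact_id
GROUP BY c.contact_id
ORDER BY LastName, FirstName;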


--Jeff Moden
"RBAR is pronounced "ree-bar" and is a "Modenism" for "Row-By-Agonizing-Row".

First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column."

(play on words) "Just because you CAN do something in T-SQL, doesn't mean you SHOULDN'T." --22 Aug 2013

Helpful Links:
How to post code problems
How to post performance problems
Post #1246961
sergeyledenev
Posted Wednesday, February 8, 2012 4:41 PM
I was using XML format for the output; the app sorted the result set itself. The major benefit is the indexed value column, which can hold any piece of the contact's information.

As there should be a reasonable number of records, that task is manageable for the app.
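
A sketch of that XML hand-off, again using the assumed table and column names from earlier in the thread:

SELECT  c.contact_id AS [@id],
        (SELECT d.value_type_id AS [@type],
                d.value         AS [text()]
         FROM   dbo.contact_details AS d
         WHERE  d.contact_id = c.contact_id
         FOR XML PATH('detail'), TYPE)
FROM    dbo.contact AS c
FOR XML PATH('contact'), ROOT('contacts');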
Post #1249402