## Clarity vs Speed

Poll: Which of these pieces of code would you use?
- a) A series of "If" statements: 33.33% (2 votes)
- b) The function call: 16.67% (1 vote)
- c) The select statement: 50% (3 votes)

Member votes: 6, Guest votes: 0.
Paul G-468777
SSC-Enthusiastic

The following all produce the same results. Which would you use in your production code, and why?
a) a series of IF statements

```sql
if @gross_pay < 35.17
    select @boo = 0
else if @gross_pay >= 35.17 and @gross_pay < 151
    select @boo = 40
else if @gross_pay >= 151 and @gross_pay < 201
    select @boo = 50
else if @gross_pay >= 201 and @gross_pay < 251
    select @boo = 60
else if @gross_pay >= 251 and @gross_pay < 301
    select @boo = 70
else if @gross_pay >= 301 and @gross_pay < 351
    select @boo = 80
else if @gross_pay >= 351
    select @boo = 90
```

b) a function that wraps the same series of IF statements from a)

```sql
SET @boo = dbo.MaxTempDeduction(@gross_pay)
```

c) a SELECT statement

```sql
SET @Boo = (SELECT 40 WHERE EXISTS (SELECT 1 WHERE @inc > 32.5))
         + ISNULL((SELECT ((CONVERT(int, (@inc / 50)) - 3) * 10) + 10
                   WHERE EXISTS (SELECT 1 WHERE @inc > 150.01)), 0)
         - ISNULL((SELECT ((CONVERT(int, ((400.01 - @inc) / 50)) - 1) * -10)
                   WHERE EXISTS (SELECT 1 WHERE @inc > 400)), 0)
```

The speed of execution for each of these, in microseconds, is: a) 4.86, b) 51.64, c) 8.26.
Mark Cowne
SSCertifiable

To be honest I wouldn't use any of them; instead I'd use a range table such as the one below, which avoids hard-wiring values:

```sql
create table Ranges(
    GrossPayMin decimal(10,3) not null,
    GrossPayMax decimal(10,3) not null,
    Boo int not null,
    primary key(GrossPayMin, GrossPayMax))

insert into Ranges(GrossPayMin,GrossPayMax,Boo) values(0, 35.17, 0)
insert into Ranges(GrossPayMin,GrossPayMax,Boo) values(35.17, 151, 40)
insert into Ranges(GrossPayMin,GrossPayMax,Boo) values(151, 201, 50)
...
insert into Ranges(GrossPayMin,GrossPayMax,Boo) values(351, 99999, 90)

select @Boo = Boo
from Ranges
where @gross_pay >= GrossPayMin
  and @gross_pay < GrossPayMax
```

____________________________________________________

Deja View - The strange feeling that somewhere, sometime you've optimised this query before

http://www.sqlservercentral.com/articles/Best+Practices/61537

Paul G-468777
SSC-Enthusiastic

Agreed; this is another perfectly viable method. Putting it through the same iteration test, it comes out as the third slowest (23 microseconds per iteration). If keeping the data matrix dynamic is a requirement of the problem, then it is the ONLY reasonable solution.
matt stockham
SSCrazy

Is it faster if a table variable is used instead of a temp table?
How about if the table were pre-created? The dynamic solutions are still hard-coding values; to be truly dynamic, the ranges would live in an existing lookup table and there wouldn't be any insert overhead.
Antares686
One Orange Chip

Actually, I might do this personally:

```sql
SET @boo = (CASE
                WHEN @gross_pay < 35.17 THEN 0
                WHEN @gross_pay < 151 THEN 40
                WHEN @gross_pay < 201 THEN 50
                WHEN @gross_pay < 251 THEN 60
                WHEN @gross_pay < 301 THEN 70
                WHEN @gross_pay < 351 THEN 80
                ELSE 90
            END)
```

Antares686
One Orange Chip

Sorry, one more thing: it has both clarity and should give you the speed you want.

Paul G-468777
SSC-Enthusiastic

I have tried the more dynamic lookup suggested earlier with both a fixed table and a table variable. Interestingly enough, there is no significant difference in the timings.

For raw speed, the verbose IF statements win over everything. The more convoluted SELECT statement is next, followed by the range-table SELECT, then the function call.

A side note: I was once challenged to come up with my own version of a "max" function that SQL lacks (the maximum of two scalar variables), and the SELECT in c) employs that methodology. There are few places one needs it, but if you do, it is as fast as it is ugly.
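A minimal sketch of that two-variable max trick, using the same SELECT ... WHERE EXISTS pattern as option c) (the variable names here are illustrative, not from the original code):

```sql
-- Hypothetical example: maximum of two scalar variables without a
-- built-in scalar max function. If @b > @a, the inner SELECT returns
-- the difference; otherwise it returns no row and ISNULL supplies 0.
DECLARE @a money, @b money, @max money
SELECT @a = 100, @b = 250

SET @max = @a + ISNULL((SELECT @b - @a WHERE EXISTS (SELECT 1 WHERE @b > @a)), 0)
-- @max now holds 250
```

(On SQL Server 2022 and later, the built-in GREATEST(@a, @b) does this directly.)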
Paul G-468777
SSC-Enthusiastic

That is an excellent solution!
Jeff Moden
SSC Guru

Heh... no matter how you swing it, this whole thing is RBAR... if it's for a GUI single row proc... no problem... if it even comes near a batch, big problem.
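To make that point concrete, a set-based version would apply the lookup to a whole table of rows in one statement rather than running scalar logic per row. A sketch, assuming a hypothetical Payroll table alongside the Ranges table suggested earlier in the thread:

```sql
-- Hedged sketch: Payroll(EmployeeID, GrossPay) is a hypothetical table
-- used for illustration. One join computes Boo for every row at once,
-- with no per-row IFs, function calls, or loops.
SELECT p.EmployeeID,
       p.GrossPay,
       r.Boo
FROM Payroll p
JOIN Ranges r
  ON p.GrossPay >= r.GrossPayMin
 AND p.GrossPay <  r.GrossPayMax
```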

--Jeff Moden

RBAR is pronounced ree-bar and is a Modenism for Row-By-Agonizing-Row.
First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column.
If you think its expensive to hire a professional to do the job, wait until you hire an amateur. -- Red Adair

How to post code problems
How to post performance problems
Forum FAQs
Paul G-468777
SSC-Enthusiastic

What you are saying is true. In the real world, from which this code was snatched, a little salty refactoring let me avoid this process entirely. It goes back to an old axiom: "If you don't like the answer, rethink the question." That is off subject, though.

I was very interested in where experienced, practical professionals would come down on the question of having to choose between "verbose, maintainable and slow", "nicely black-box but very slow", and "efficient but not so obvious". I wanted to use some sample code that would demonstrate these notions. I fear I went afoul there. My apologies if I wasted anyone's time.

There are soapboxes everywhere, but given a sort of "Sophie's Choice" of code, I wondered which side serious professionals would come down on. As a person who manages, trains or mentors people on a regular basis, I keep trying to understand the development choices others make as part of training my thinking. For various reasons that responsibility brought me to the point where I needed to revisit the "clarity vs speed" issue.

Thank you to all of you who responded. Each and every response was very instructive.