Is there such a thing as too much attention to the performance of SQL? Are we in danger of wasting time on optimising code when it is entirely unnecessary?
I agree that fast SQL code is magnificent to see. For example, Phil Factor’s SQL Speed Phreak contests are great at showing effective techniques for cutting execution time from over twenty minutes to half a second. The challenge in each round is to create the fastest possible method of solving the proposed puzzle – and that means writing computationally efficient SQL. It’s a great test of skill, and even spectators can learn a huge amount just by browsing through the solutions, though some are quite experimental.
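To give a flavour of the kind of rewrite that produces such dramatic speed-ups, here is a minimal sketch of one classic transformation: replacing a row-by-row “triangular” running total, which re-scans the table for every row, with a single-pass window function. This is an illustration of the general technique only, not a solution from any particular contest; it uses Python with SQLite purely so the two queries can be run side by side (the table name `sales` and its data are invented for the example).

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (day INTEGER PRIMARY KEY, amount INTEGER)")
# Invented sample data: day 1..5, amounts 10..50.
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [(d, d * 10) for d in range(1, 6)])

# Naive running total: a correlated subquery re-reads the table for each
# row, so the work grows quadratically with the number of rows.
naive = conn.execute("""
    SELECT day,
           (SELECT SUM(amount) FROM sales s2 WHERE s2.day <= s1.day) AS running
    FROM sales s1
    ORDER BY day
""").fetchall()

# Tuned version: a window function (SQLite 3.25+) computes the same
# running total in a single ordered pass over the table.
fast = conn.execute("""
    SELECT day,
           SUM(amount) OVER (ORDER BY day) AS running
    FROM sales
    ORDER BY day
""").fetchall()

assert naive == fast  # identical answers; very different amounts of work
print(fast)           # → [(1, 10), (2, 30), (3, 60), (4, 100), (5, 150)]
```

On five rows the difference is invisible; on a few million rows, the gap between re-scanning per row and scanning once is exactly the twenty-minutes-versus-half-a-second gulf the contests exploit.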
Is this always relevant to the workplace? Is it better to create highly customised, incredibly fast operations, or less speedy yet more generic ones that are quicker to construct? Obviously, the skills and ideas learnt in the Speed Phreak competitions can often be put into play in your everyday tasks, but when is it really worth tuning everything up? If a piece of code executes only occasionally and does not affect the users’ perception of the application’s responsiveness, then why spend time fine-tuning it? It isn’t only a question of time and inclination, but also of the resilience and flexibility of the code. Although solutions which are “fast enough” can break just as easily as those which are tuned to perfection, they’re generally easier to put back together than the SQL Speed Phreak solution.
Of course, that means more time spent fixing and tweaking things somewhere down the line, on top of the time taken to create the solution in the first place, so I’m curious to know where the threshold is. What’s the dividing line between building a solution which is Blisteringly Fast and building one which is just Pretty Quick?