Some of the comments posted seem to be along the lines of "what if this, what if that, this isn't quite right".
I'd agree the article glosses over some details, but I think it highlights a novice error that I've seen at every company I've worked for.
If you're querying tables with a few thousand, maybe even tens of thousands, of records, then with the ever-falling cost of high-performance hardware you probably don't need to care how the query is constructed or executed. 50ms versus 500ms means nothing to an end user, and you'll get no reward for your trouble.
But if you're dealing with millions or billions of records, this is definitely a lesson worth learning. You can achieve massive performance gains by helping the query optimizer, handing it a well-structured query before it has to process anything. Then just sit back and enjoy the praise from your boss. :-P
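To make the "help the optimizer" point concrete, here's a minimal SQLite sketch (the `orders` table, `created_at` column, and index name are made up for illustration) of a classic novice error: wrapping an indexed column in a function, which typically forces a full scan, versus a bare range predicate, which lets the optimizer use the index.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, created_at TEXT)")
conn.execute("CREATE INDEX idx_created ON orders(created_at)")

# Non-sargable predicate: the function call on the indexed column
# hides it from the optimizer, so the plan falls back to a scan.
plan_bad = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders "
    "WHERE substr(created_at, 1, 4) = '2023'"
).fetchall()

# Sargable predicate: the same filter rewritten as a range on the
# bare column, so the optimizer can seek on idx_created instead.
plan_good = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders "
    "WHERE created_at >= '2023-01-01' AND created_at < '2024-01-01'"
).fetchall()

print(plan_bad[0][3])   # a SCAN of orders
print(plan_good[0][3])  # a SEARCH using idx_created
```

Both queries return identical rows; only the shape of the WHERE clause changes, which is exactly the kind of restructuring that costs the developer a minute and saves the database hours.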
We have tables like this, and my advice is always to break your searches into smaller data sets. I've seen people query a few years of data at once, and it literally takes days to run. If you were to break this into smaller queries, you'd have your data in hours.
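The batching idea above can be sketched in a few lines. This is a toy SQLite example (the `events` table and its columns are invented for illustration): instead of one query spanning several years, run one bounded query per year and combine the partial results.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, day TEXT, amount REAL)")
# Seed three years of monthly rows so both approaches have data to aggregate.
conn.executemany(
    "INSERT INTO events (day, amount) VALUES (?, ?)",
    [(f"{y}-{m:02d}-15", 1.0) for y in (2021, 2022, 2023) for m in range(1, 13)],
)

# One query over the full multi-year span -- the slow pattern on huge tables.
total_all = conn.execute("SELECT SUM(amount) FROM events").fetchone()[0]

# The same aggregate computed as a series of smaller, date-bounded queries.
total_batched = 0.0
for year in (2021, 2022, 2023):
    row = conn.execute(
        "SELECT COALESCE(SUM(amount), 0) FROM events WHERE day >= ? AND day < ?",
        (f"{year}-01-01", f"{year + 1}-01-01"),
    ).fetchone()
    total_batched += row[0]

# Both approaches produce the same answer; the batched one just keeps each
# individual query's working set small.
assert total_all == total_batched
```

On a real multi-million-row table each batch stays small enough to run off an index on the date column, and a failed batch can be retried without redoing the whole job.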