I find the last part of the question the most interesting: "Do you ever take this advice at face value, or do you always test it to ensure that it works well in your environment?"
Of course, in an ideal world, nobody would ever take anybody else's word for it and would *thoroughly* test all code/patterns/workarounds, etc., before using them... but just suppose the sugar has already hit the fan and thorough testing is impractical - how much testing is 'enough' before deploying?
And does the source of the advice make a difference to the amount of testing you deem necessary?
Suppose the bosses are screaming and jobs are on the line (or your reputation as a miracle-worker is being questioned) - do you test less thoroughly if you get a suggestion from a more reliable source than from an untrusted one? Would a quick once-over on Developer edition on your laptop provide enough comfort to convince you to deploy to a live production server *IF* the advice came from BoL or an MVP?
And what if a heretofore reliable source suddenly gives a duff bit of advice (even monkeys fall out of trees)? Do you stick by them because of their historical reliability, or feel cheated and shun them in future?
I'm genuinely curious what people think (and what they do in the real world ;-) ).
My own $0.02 worth is that I tend to turn to Google when I hit an immediate problem, use blogs and forums to keep up to speed with what's going on in the rest of the world, and use books when I need to deep-dive into a topic. I agree with the posters above who say that past performance is generally the most important factor when deciding which blogs and books to choose: considering both the author and the publisher.