The Jan 2008 issue of Software Test & Performance Magazine (available as a free PDF download; you can also download back issues through 2004) has an article called 'Gauging Performance in the Absence of Metrics' that I found interesting. From a SQL perspective it's a good read because it talks about how you decide what is fast enough, and it takes into account that there are a lot of pieces vying for time: UI, network, database. It also talks about setting goals for concurrency, and how it's possible to achieve the goal but still not hit the mark because of the way the goal is described.
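That last point is worth a concrete sketch. A goal stated as "average response under 200 ms" can be met even when one request in ten is painfully slow; a percentile-based goal catches what the average hides. A minimal illustration (the latency numbers here are made up for the example):

```python
# Hypothetical latencies: 90 fast requests and 10 slow ones.
latencies_ms = [40] * 90 + [900] * 10

mean_ms = sum(latencies_ms) / len(latencies_ms)  # 126.0 ms

def percentile(values, pct):
    """Nearest-rank percentile of a list of numbers."""
    ordered = sorted(values)
    rank = max(1, round(pct / 100 * len(ordered)))
    return ordered[rank - 1]

p95_ms = percentile(latencies_ms, 95)  # 900 ms

goal_ms = 200
print(f"mean {mean_ms:.0f} ms meets {goal_ms} ms goal: {mean_ms <= goal_ms}")   # True
print(f"p95  {p95_ms} ms meets {goal_ms} ms goal: {p95_ms <= goal_ms}")        # False
```

Same workload, same goal number, but "average under 200 ms" passes while "95% of requests under 200 ms" fails badly, which is exactly the kind of wording trap the article describes.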
To some degree it's all in how you write the requirements (tests)! It's not often we (DBAs) get a real definition of how fast anything needs to be; in practice we just try to make everything run fast! Not the worst strategy, but all tuning is about trade-offs, and having some real values to hit would help us assess those trade-offs in a more formal fashion. If nothing else this is an interesting view into the world of the tester, and I suspect we all agree that we could stand to improve our testing.