I recently watched a presentation on testing in which the speaker noted that on one project he'd spent twice as much time testing as coding. That sounds like an outrageous amount of time, but my concern was tempered when he said the release to production produced no bugs. I'm sure some bugs surfaced later, but in my experience most bugs, especially the incredibly annoying ones, are discovered quickly.
I was reminded of that presentation when I saw this quote: "...the result was a two-year development process in which only about four months would be spent writing new code. Twice as long would be spent fixing that code."
That's a quote about the development of Visual Studio a few years back. I wonder whether the "twice as long fixing" time would have been reduced by better testing efforts earlier in development. It's hard to know, since the evidence on the value of testing comes from disparate projects with different teams working at different levels of experience, but I've run into a few people who think more testing reduces overall development time.
The consultant who gave the presentation believes strongly in testing, not only at the application level but also at the database level. He has tried different levels of testing on different projects and found that writing tests throughout development results in far fewer issues at release. Perhaps more telling, on later projects where he performed less testing (because the clients declined to pay for it), more bugs appeared in production.
I don't know whether the total time spent building software is less when testing occurs early than when clients and customers are left to test and report bugs. Certainly some of that depends on how many bugs you fix and how many bugs people must cope with, but I do know that the fewer issues people find in your software, the more eager they are to ask you to write more code in the future.