I agree that software testing is often not what I would consider adequate; many times there isn't even a written test plan for in-house developed software (the plan is in someone's head...). And results from the testing that IS done often are not documented or tracked. So it's a multi-part problem in my view: management doesn't understand the value of spending time on it (until the crisis occurs), programmers are pressured to shorten development times, users just want the software NOW, and so on. Better communication from IT on the benefits of testing may help; you sometimes need to educate non-IT people.
However, I have a few "issues" with the article. One, it has no depth (i.e., it doesn't discuss the underlying problems or causative factors). Two, the fact that 81% of companies don't know their testing budget isn't meaningful on its own (especially without knowing the size of the companies surveyed) - in many companies software testing simply isn't tracked as a separate budget line item, in which case that question can't be properly answered. Three, the article reads like an "info-mercial": software testing is inadequate, so you need to hire my company to do it for you!
I prefer more information in articles, so I can make informed decisions. What are the barriers to good software testing? What are the best practices, and who is using them? What are the costs associated with inadequate software testing? (You need those numbers if you're going to convince upper management to budget dollars for it.) What's the ROI on good software testing? How do you define "good" or "adequate" software testing?
(Wow, I should write an article - I've got the outline already!) :hehe:
Here there be dragons...