• A few thoughts:

    Sometimes the user (or business team) is at fault. The business environment today is about cost control, and anything the bean-counters don't understand as a profit-generation centre tends to go by the wayside. When the requirements analysis process is failing (or has failed), often because the client doesn't want to pay for it to happen, it becomes impossible to produce a valid product, because there is (or was) no valid definition of needs. Many projects I have seen fall into this category of failure, and result in a product whose features themselves engender buggy outcomes. This, on a larger scale, is a communication failure, which has nothing to do with software development per se. Look anywhere in modern society and you can spot the side-effects of the decline in the quality of communication. Software definition requires precision of language long before code is cut, and in many environments out there the communication is imprecise, inappropriate, and downright counter to achieving success.

    More often the fault lies in the talent pool (as William Plummer observed). Try running a team of experts and see how long you can sustain it. Ten years ago my private firm had 11 professionals, all of them full-time, and all of them working projects as a unit. The expertise was rich and deep; there were few jobs we tackled that did not succeed at almost every level, and we usually exceeded client expectations. Five years ago, I had 3 experts on full-time and 5 regular sub-contractors. Today, I have 1 body on full-time, and find it harder every year to find qualified, provable part-time talent within the budget constraints of the client requests. When a client wants a handheld application to feed an inventory system for 20K, end-to-end, fully integrated with an existing system, that budget generally has to cover everything, not just one aspect of the coding. Try hiring even a 3-expert team for that rate and you will never profit by it; one slip and the profit, if there was any, vanishes (a rough back-of-envelope follows below). But profit aside, just finding the talent is almost impossible. No offence intended to the university-graduated computer scientists out there, but a degree apparently doesn't impart any more reliable a measure of value and productivity than you can get from a kid with a passion. Worse still, even when you can build a team with true expertise, keeping it is nearly impossible post-release, because the profit margins are now too small to float the team to the next project. Having to bid against off-shore teams, where wages and benefits are often less than a third as high, I can confirm with fair certainty that it is impossible to feed the team reliably.
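
    To make the 20K arithmetic concrete, here is a rough sketch; every figure in it is an assumption picked for illustration, not a number from any real bid:

        # Hypothetical figures only: a 3-expert team against an all-in 20K budget.
        budget = 20_000          # client's end-to-end budget
        team_size = 3            # experts the job actually needs
        rate_per_day = 400       # assumed modest daily rate per expert
        project_days = 60        # assumed roughly three months of effort

        labour = team_size * rate_per_day * project_days
        print(labour)            # 72000: labour alone is 3.6x the budget
        print(budget - labour)   # -52000, before integration, testing, or one slip

    Even halving every assumption leaves nothing for the integration work, which is the expensive part.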

    Environmental complexity is also a growing challenge, but libraries are just a part of that, and probably the least problematic part. The reality is that being a good code-monkey is no harder than being a good mechanic; it's just different. Working well with a library framework is not any harder than coding to the machine registers, or any easier. Both are about logic; the libraries are just faster to code against. Good and bad coders exist at every end of every spectrum. The complexity comes from the interactions of the software with pre-built components, proprietary or not. Procedural coding is almost impossible on a project of any reasonable scale these days (hell, even scripting often relies upon "objects"), and object-based coding is harder than procedural coding. (That's not a comment on which is better or worse.) It takes an entirely different frame of mind to code to objects well, and I think it is practically impossible to educate that thought process into place. People with good spatial-relations skills must be better at it; it must take some natural talent to close your eyes and see the patterns of objects. But that aside, the point is that it takes a degree of dedication and basic talent to work well in today's interactive systems, and like any talent pool, that natural pool is smaller because it is more specific. It also means a lot of the fundamental objects in a system (hardware, software, people, etc.) are not aligned to an approach that is conducive to developing strong couplings, because they were not built to be so aligned. This difference at the boundaries of the interacting objects is hard to test, because of the variations, and it can be a source of grief on any project that is even partially distributed. I would agree with David le Quesne that software is not inherently any more buggy, with the qualification that because of these complex underpinnings the nature of the bugs is different, probably harder to qualify, and definitely harder to fix because of the cascading dependencies. Anyone who has ever run into a .NET Framework quirk will attest that the lowest-level bug in that library can show itself in some of the strangest surface ways, and can be hard to trace to its cause even with an excellent debugger. A final thought on libraries is that any talented coder will remain talented in any library, because their approach will be to deconstruct it as they use it and understand its particular grammar. Having said that, my million-line safety application has suffered 6 bugs in the last year as a direct result of patches to the O/S, web, web browser, and other dependent components that are out of our direct control. In total, we had 7 bugs reported and resolved, so...maybe that says something.
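
    As a toy illustration of that cascading effect (all names here are invented for the sketch; this is not drawn from any real library), consider a low-level helper that a patch release quietly changes, three layers below the feature the user actually sees:

        # Layer 1: imagine this lives deep in a third-party library.
        def parse_amount(text):
            # Pre-patch it stripped thousands separators; suppose a patch
            # release quietly stops doing so, and certain inputs start failing.
            return float(text.replace(",", ""))

        # Layer 2: mid-level code written against the old behaviour.
        def line_total(qty_text, price_text):
            return parse_amount(qty_text) * parse_amount(price_text)

        # Layer 3: the surface feature, where the symptom finally appears,
        # far from the line that actually changed.
        def invoice_total(lines):
            return sum(line_total(q, p) for q, p in lines)

        print(invoice_total([("2", "1,250.00"), ("1", "99.99")]))  # 2599.99

    From the surface it looks like an invoicing bug that only bites on certain regional inputs, not a parsing patch, and you have to walk the dependency chain down to find the real cause.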

    My final stream of thought on this is that a large part of the problem with buggy software is speed. The mantra that "time is money" has eroded the awareness that, with a good process, the front-end load of time it takes to do a development job right rises in direct proportion to the quality you get out. It seems to me that clients always want to cut the cost of development to its bare minimum, with no expectation of the long-term cost impact of doing so. All efforts to introduce "project management" seem to fail on the budget these days, which is often too small even to manage the project, let alone develop and test it. The fault seems to lie with the "computer as toaster" mindset, which is not only false but nonsensical. MS Excel costs $120, so they expect their proprietary cost-tracking inventory tool couldn't possibly cost more than a few grand. I actually turned a job down three years ago when a prospective client called me and said, "We calculated that every desktop we have costs us about $X a year to update, so we think this project will cost us $300K." The project was a cross-national real-time order-processing link that would integrate with their third-party product providers and their financial systems. It would have been mission-critical, and connected to 41 outside suppliers (all with different APIs) and 2 financial systems, one internal and one at their financial institution. Requirements? Their response was, "We already know what it has to do." That was it, and there was no arguing the point. The expectation was that it would be done in 6 months. I told them I thought I would pass, and offered that I also thought they should revisit the way they were calculating their budget. Two years later, they had poured over a million dollars in, had a system that was hurting their bottom line because it was so buggy relative to its feature purpose, and they finally gave it up when they were bought out. (One feature missed was the provision to calculate customs, sales-tax variations, etc.) In essence, the belief that these are easy solutions combines with the need to have them yesterday, yet there is no commitment to providing the structure necessary to develop them well.

    Now, having given myself a headache, I think I will go manage my taxes...which are often much easier than the work that generates them.