One of my editorials a while back drew some strong comments from a developer who says he delivers mostly bug-free software to his clients and stands by his word. He also attacked Microsoft and others over the quality of their software. Valid points, but I think there are a variety of reasons for buggy software, and many companies live in very different worlds from one another.
First there are the software companies, the shrink-wrap folks. They're not necessarily sales oriented, because I think open source/FSF software falls into this category as well. This is the group that produces generic software that others will install and use on their own systems. Whether it's SQL Server or MySQL, the group is essentially producing a "product". I think this is the most difficult type of software to develop, since you're dealing with open-ended situations where the end machine could have any configuration and your software should deal with it.
The second situation is where you're developing software for a specific target: a company, a series of machines, a client. Whatever the situation, the environment isn't unknown, and you should do a good job testing the places where things are likely to fail.
These two situations are different, and both are hard to work in, but neither is an excuse for the current state of bugginess. I don't think we'll ever eliminate bugs, but we can do better. As for the reason? A lot are given in the article: not enough time, laziness, too much complexity, etc. They're opinions, and probably all true in some cases, but none is necessarily the silver bullet.
Personally, I think part of the issue is money. People race to get things done so they can sell them or move on to the next project, without really being willing or able to spend the time required to go the extra mile.
The other part? Libraries. We build more and more ways to abstract the developer from the complexities of a particular component/function/feature, and developers end up not really understanding, or being able to control, the stack of code beneath them. Not that I think every programmer should write every low-level routine, but they should deal with those routines sometimes, and they should really have the ability to work with them. In this respect I see Open Source as a huge advantage over closed source.
But it's an advantage, not a silver bullet. After all, I see many bugs in Open Source software as well.
I don't think software is particularly more buggy today than it was 10 or even 20 years ago. Remember dBASE IV, the software so full of 'anomalies' that it more or less finished Ashton-Tate? I think our expectations of software are higher these days; we all demand flawless 'plug and play' without the need to download drivers from the internet or resort to regedit. Maybe it's just that we're more aware of software quality control.
Libraries are a double-edged sword. They provide a layer of abstraction separating the programmer from the lower levels of the environment, and they bring uniformity and standardisation to coding. But at the same time, I agree that they encourage a sort of 'black box' approach to programming: 'Dunno what it does, but you put this in and something else comes out'.
When I was a fledgling analyst/programmer, I worked for one manager who insisted that all his programmers had low-end machines (286s with 2Mb RAM and a 20Mb HD), because he figured that if we could write the code so that it worked on those, it would work on anything. Today's machines are so powerful that programmers don't need to write particularly efficient code.
Have you tried scrolling through Word or Excel recently? It is so easy to overshoot the lines you are looking for. I am predicting that Microsoft will release a speed control for Windows that allows you to slow the whole operating system down. A 286 emulator, maybe?
Is the problem that not enough time is spent on testing? True! But I believe there is a limit to what the programmer can test. The only person who can test a product properly is the user, and the problem is getting the user, who also needs to do his normal job, to spend the time. Especially in the corporate world: if they find a bug 3 to 6 months after implementation, we as programmers sometimes wonder how they managed to use the product for so long, because, of course, the bug is now critical.
The development world seen from my limited view is this...
Budgets are very tight! So tight that no one can afford an effective team to handle all aspects of development. This may not be true at Microsoft or Oracle or SAS or (fill in the blank), but where I live we have an IT team of experts: usually one person with the best skills in a given area, and one or more people with varying skills to back up that one skill set.
However, when it comes to programming, I'm it on my project, except for one subcontractor. The sub had to give a fixed bid to get the work and has done a great job for the money paid. I need more of his time, but I'm at management's mercy to get the money to use his services.
That last paragraph sounds like I work for bad managers, but that's not true! They are excellent, but the contract we took was fixed bid so they are working with a limited set of dollars.
Back in "the good old days" when we walked to school 5 miles, uphill both ways, I worked in a group that included the CIO, 2 programmers, 1 DBA, 1 Network/hardware guy, 1 Tech Writer, and 1 Tester. We did 6 week releases and the CIO would agree with the corporate management team on a list of 10 items we would "try" to include in the next release. The IT group would meet and decide who would do what on the 10 items and agree to attempt all 10 or maybe only 6 or 7 or whatever we felt like as a group we could accomplish.
The DBA would actually do all the database thingies, not just backups and security. The Tech Writer would write up detailed specs on the items, then begin writing the test specs. The Tester would help the Tech Writer until we had something to begin testing. We programmers would decide which parts suited our skill sets the best and begin coding. The CIO would ride herd on the whole process and was a pretty good code slinger when he had all of his CIO duties under control.
In short, it was a team effort, with all of us working to make the process a success. In my current situation I am basically a corporate team of 1, with one excellent subcontractor helping when I can get dollars for specific duties. Bubbas, this ain't the most efficient way to do software...
However, in today's corporate world, very few have the budgets to build teams and keep them together for extended periods of time. It used to be that you lost good team players because good job opportunities came along. These days you lose the whole damn team every time a project closes, if you even had a team for the project.
Ouch, I think somebody hit a nerve...
A few thoughts:
Sometimes the user (or business team) is at fault. The environment for business today is about cost control, and anything the bean-counters don't recognise as a profit-generation centre tends to go by the wayside. When the requirements analysis process fails, often because the client doesn't want to pay for it to happen, it becomes impossible to produce a valid product, because there was never a valid definition of needs. Many projects I have seen fall into this category for failure, and result in product where the features themselves engender buggy outcomes. This, on a larger scale, is about communication failure, which has nothing to do with software development per se. Look anywhere in modern society and you can spot the side effects of the decline in quality of communication. Software definition requires precision of language long before code is cut, and in many environments out there the communication is imprecise, inappropriate and downright counter to achieving success.
More often the fault lies in the talent pool (as William Plummer observed). Try running a team of experts and see how long you can sustain it. Ten years ago my private firm had 11 professionals, all of them full-time, and all of them working projects as a unit. The expertise was rich and deep; there were few jobs we tackled that did not succeed at almost every level, and we usually exceeded client expectations. Five years ago, I had 3 experts on full-time, and 5 regular subcontractors. Today, I have 1 body on full-time, and find it harder every year to find qualified, provable part-time talent within the budget constraints of the client requests. When a client wants a handheld application to feed an inventory system for 20K, end-to-end, fully integrated with an existing system, they generally have that budget end-to-end, not just for one aspect of coding. Try hiring even a 3-expert team for that rate, and you will never profit by it. One slip and the profit, if there was any, vanishes. But profit aside, just finding the talent is almost impossible. No offence intended to the university-graduated computer scientists out there, but a degree apparently doesn't impart any more reliable a measure of value and productivity than you can get from a kid with a passion. Worse still, even when you can build a team with true expertise, keeping it is nearly impossible post-release, because the profit margins are now too small to float the team to the next project. Having to bid against off-shore teams, where wages and benefits are often less than a third as high, I can confirm with fair certainty that it is impossible to feed the team reliably.
Environmental complexity is also a growing challenge, but libraries are just a part of that, and probably the least problematic part. The reality is that being a good code-monkey is no harder than being a good mechanic; it's just different. Working well with a library framework is no harder than coding to the machine registers, nor any easier. Both are about logic; the libraries are just faster to code against. Good and bad coders exist at every end of every spectrum. The complexity comes from the interactions of the software with pre-built components, whether they are proprietary or not. Procedural coding is almost impossible on a project of any reasonable scale these days (hell, even scripting often relies upon "objects" now), and object-based coding is harder than procedural coding. (That's not a comment on which is better or worse.) It takes an entirely different frame of mind to code to objects well, and I think it is practically impossible to educate that thought process into place. People with good spatial-relations skills must be better at it; it must take some natural talent to close your eyes and see objective patterns. But that aside, the point is that it takes a degree of dedication and basic talent to work well in today's interactive systems, and like any talent pool, that natural pool is smaller because it is more specific. It also means a lot of the fundamental objects in a system (hardware, software, people, etc.) are not aligned to an approach that is conducive to developing strong couplings, because they were not built to be so aligned. This difference at the boundaries of the interacting objects is hard to test, because of the variations, and it can be a source of grief on any project that is even partially distributed.
I would agree with David le Quesne that software is not inherently any more buggy, with the qualification that because of these complex underpinnings the nature of the bugs is different, probably harder to qualify, and definitely harder to fix because of the cascading dependencies. Anyone who has ever run into a .NET Framework quirk will attest that the lowest-level bug in that library can show itself in some of the strangest surface ways, and can be hard to pinpoint for cause even with an excellent debugger. A final thought on libraries is that any talented coder will remain talented in any library, because their approach will be to deconstruct it as they use it and understand its particular grammar. Having said that, my million-line safety application has suffered 6 bugs in the last year as a direct result of patches to the O/S, web, web browser, and other dependent features that are out of our direct control. In total, we had 7 bugs reported and resolved, so...maybe that says something.
My final stream of thought on this is that a large part of the problem with buggy software is speed. The mantra that "time is money" has warped the awareness that the front-end load of time it takes to do a development job right rises in direct proportion to quality (with a good process). It seems to me that clients always want to lower the cost of development to its bare minimum, but have no awareness of the long-term potential cost of doing so. All efforts to introduce "project management" seem to fail on the budget these days, which often is too small even to manage the project, let alone develop and test it. The fault seems to lie with the "computer as toaster" mindset, which is not only false but nonsensical. MS Excel costs $120, so they expect their proprietary cost-tracking inventory tool couldn't possibly cost more than a few grand. I actually turned a job down three years ago when a prospective client called me and said, "We calculated that every desktop we have costs us about $X a year to update, so we think this project will cost us $300K." The project was a cross-national real-time order processing link that would integrate with their third-party product providers and their financial systems. It would have been mission critical, and connected to 41 outside suppliers (all with different APIs) and 2 financial systems, one internal and one at their financial institution. Requirements? Their response was, "We already know what it has to do." That was it, and there was no arguing the point. The expectation was that it would be done in 6 months. I told them I thought I would pass, and offered that I also thought they should revisit the way they were calculating their budget. Two years later they had poured over a million dollars in, had a system that was hurting their bottom line because it was so buggy relative to its intended purpose, and they finally gave it up when they were bought out.
(One feature missed was the provision to calculate customs, sales tax variations, etc.) In essence, the belief that these are easy solutions combines with the need to have them yesterday, but there is a lack of commitment to providing the structure necessary to develop them well.
Now, having given myself a headache, I think I will go manage my taxes...which are often much easier than the work that generates them.
God save us if attorneys are what we rely on to keep software bugs from getting into released code.