July 15, 2009 at 7:51 am
While I agree that you will never have a piece of software that is 100% bug free, I feel that 80% is not acceptable, especially on something as big as an operating system. You can either spend x number of hours fixing the application before release, or spend twice as many hours after release cleaning up the mess.
July 15, 2009 at 8:07 am
I completely agree with Steve. Having worked on numerous shrink-wrapped systems over the years (as well as a number of enterprise and SME systems), the temptation is to design and implement to the nth degree. But at some point something has to ship. I have a friend who is working on a major system at the moment and after six months of development work, with continually changing requirements, he's just starting to approach initial completion of the database and class library. The reason - the company directors are trying to think of every eventuality.
I'm reminded of something Fred Brooks said in The Mythical Man-Month (which echoes what Steve was saying):
"Plan to throw one away (you will anyway)"
Truer words have never been spoken...
July 15, 2009 at 8:16 am
It does depend.
I mean, for a small personal tool used to examine files for debugging, 80% is probably doable, so long as the 20% it doesn't cover is easy to spot.
If you're going to release a product, however, that 20% had better not result in system failure or erroneous data. Sure, you're not going to get anything 100% perfect, but if it fails to do a fifth of what you'd like it to, it had better not screw the rest up. Otherwise it won't be a case of fixing that fifth, but of fixing all the problems it has caused, and you'll never get the chance, because no one would want software like that!
Certainly you can't include all the features you'd like - but I'd prefer a greyed-out or missing option to something that tries, and fails.
July 15, 2009 at 8:38 am
The primary reason for over-engineering is “over-specing”, where business users demand features that are costly to deliver and provide little business benefit.
They have little understanding of what a feature actually costs to produce and how long it will delay delivery, so they have no incentive to make reasonable, deliverable demands.
July 15, 2009 at 8:48 am
Wow, I have some agreement! I was expecting a nice skewering this morning.
The 80% I was thinking about isn't working versus not working, but rather functionality, as a few people pointed out. Or even performance. If something only does 80% of what you asked for, or handles 80% of the desired load, I tend to think that works for many products.
Note that I did exclude large scale products (Windows, OSes, ERP). They take so long to install and configure that it's hard to justify doing major changes to those.
However, we don't have great specs in many cases. And we never will, because business users can't think of everything they need. Better software was produced in the past, partly because the domain was smaller, environments were more closed, and people didn't worry as much about security. Today the world has changed, and while I think our programming training has gotten worse, there are also more things that have to be built into a product.
July 15, 2009 at 8:52 am
Steve, great article written to provoke debate, and it has.
I think you're right, but would add a couple of other reasons why.
First, the business users who define the specification don't usually know exactly what they want. Many of us will have run endless workshops where we ask users to define their requirements in a level of detail they cannot realistically provide. So they normally get some of what they tell us wrong, or, more usually, they omit important details.
Maybe we sometimes don't get their point and write the specification incorrectly. They haven't time to fully review the huge specification we write, and they don't understand some of it anyway, although they think they do. So despite everyone's best efforts we don't have a design that's 100% accurate to start with.
Second, and probably more important, in the many months it takes to build this solution their requirements WILL change, either because the business changes, legislation changes, personnel changes, the list is endless.
So I agree with you, accept that this is the case, get really close to your users, be a real team, start small and unsophisticated but scalable, deliver quickly and regularly and grow the solution together.
All the time making sure there are no large holes that will need blocking with unrealistic manual interventions.
I'm bought into your heresy!
Tim
July 15, 2009 at 8:55 am
mike.mcquillan (7/15/2009)
I'm reminded of something Fred Brooks said in The Mythical Man-Month (which echoes what Steve was saying):
"Plan to throw one away (you will anyway)"
Truer words have never been spoken...
Ah, but in his 1995 edition he said "I was wrong; an iterative model makes more sense".
I thought the whole point of agile methodologies was that they allowed the customer to prioritise what they wanted. You've got 100 units of work to complete but only 80 are possible - which do you think are essential?
Chances are that the customer will end up deciding the majority of the remaining 20% aren't necessary after all.
I work with a system that is so over-engineered that when it breaks it is a nightmare to fix. Small simple things don't go wrong very often because they are small and simple.
July 15, 2009 at 9:06 am
I am in complete agreement with Steve on this point. The most successful commercial software product development effort I managed (a $20,000,000+ revenue generator) employed this very principle. Construct something useful, with due diligence regarding business case testing, and get it into the market before your competition. But the absolute key to the success of such an approach is having exemplary customer technical support, so that you are able to respond very quickly and effectively to any issues that arise.
This works, it absolutely does. But you must have both organizational and customer base support in order to run with it.
July 15, 2009 at 9:10 am
With that clarification, I'll accept Steve's premise.
Better to be correct than complete.
July 15, 2009 at 9:25 am
If you buy a car, you have a very limited set of options compared to everything that is already part of that make or model; maybe another radio, or some additional climate control. If you want a custom-built car, you have to pay a whole lot more ... Today, customers often expect custom-built software for nearly standard prices.
If you've ever tried to 'engineer' an application, you probably agree with me that 'software engineering' has little to do with engineering: almost every developer has his own 'style', and almost nothing can directly be reused by other engineers. Copy-and-paste programming is still very popular, and because it speeds up development so much, hardly anyone cares at that point about the resulting maintenance nightmare.
If you try to build quality software, there's a great chance someone can build the same functionality much cheaper and faster, so today as a developer you have three choices:
1. Build software fast and cheap, but don't care about the quality.
2. Build quality software in your own time, but don't expect to make money with it.
3. Stop building software.
I do understand why the 80/20 rule must be applied in a commercial environment. But far too often failures do not disrupt the application; instead they corrupt the database. You can fix the problem in the application today, but that doesn't fix the corruption that has made years of collected data completely worthless. And as a developer, you face the same problems. Have you ever written your own version of a library function or procedure because of the flaws it contained? How often have you spent too much time creating a work-around for a bug in some 'standard component'? Don't expect the software you use to be any better than the software you produce, because the same (commercial) rules apply.
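To make the corruption point concrete, here is a minimal T-SQL sketch - the Orders table and its columns are made up purely for illustration - of the kind of database-side guard I mean: constraints plus a transaction with error handling, so an application failure rolls back cleanly instead of leaving half-written data behind.

-- Hypothetical table, for illustration only: constraints stop obviously
-- bad values from ever reaching the data.
CREATE TABLE dbo.Orders
(
    OrderID    INT IDENTITY(1, 1) PRIMARY KEY,
    CustomerID INT NOT NULL,
    Quantity   INT NOT NULL CONSTRAINT CK_Orders_Quantity CHECK (Quantity > 0),
    UnitPrice  DECIMAL(10, 2) NOT NULL CONSTRAINT CK_Orders_UnitPrice CHECK (UnitPrice >= 0)
);
GO

-- Multi-step changes go in a transaction, so a mid-flight failure rolls
-- back instead of leaving the database in a half-updated state.
BEGIN TRY
    BEGIN TRANSACTION;

    INSERT INTO dbo.Orders (CustomerID, Quantity, UnitPrice)
    VALUES (42, 10, 19.95);

    -- ... further statements that must succeed or fail as a unit ...

    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION;

    -- Re-raise so the calling application knows something went wrong.
    DECLARE @msg NVARCHAR(2048);
    SET @msg = ERROR_MESSAGE();
    RAISERROR(@msg, 16, 1);
END CATCH;

None of this is clever, but it is exactly the 20% that gets skipped when the only goal is to ship fast.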
Being a developer for more than 20 years, I could tell so much more about this subject. But I don't want to post a reply that's more like an article ...
July 15, 2009 at 9:36 am
I have to agree with Steve on this one. I think the rules change a bit when working on a shrink-wrapped or commercial product, but for in-house development you need to implement the major functionality and get a product out and being used. If you insist on making sure you can handle every possible use of the software before you release it, you'll never release it, because you'll always think of something else that someone might want. The key word there is might. Meet your users' stated needs, release the software, and then add the functionality that they find is missing.
The most successful project I have worked on worked this way. It was a full web-based order entry system in classic ASP and SQL Server that interacted with a production planning system on an AS400 using DB2, and it was developed in about 4 months by 4 developers. It worked when released, but we probably added 20% more functionality to it in the 2 months after its release. When we were purchased by another company and our customer service reps had to switch to the new company's order entry system, they begged to get back to the one we had developed.
Jack Corbett
Consultant - Straight Path Solutions
Check out these links on how to get faster and more accurate answers:
Forum Etiquette: How to post data/code on a forum to get the best help
Need an Answer? Actually, No ... You Need a Question
July 15, 2009 at 9:37 am
vliet (7/15/2009)
If you buy a car, you have a very limited set of options compared to everything that is already part of that make or model; maybe another radio, or some additional climate control. If you want a custom build car, you have to pay a whole lot more
Absolutely!
100 years ago you got an unsophisticated expensive vehicle which broke down all the time but did get you around. You accepted the compromise because it was the best there was.
By the 1930s Henry Ford had made them mass-produced and cheaper, but still quite primitive.
Today, cars are much more sophisticated, reliable, less likely to rust, and cheaper than they have ever been.
It's an excellent analogy for hardware but I think it holds for software too!
It's taken 100 years of iterative development to get cars to this point...
Tim
🙂
July 15, 2009 at 9:55 am
Steve, in the application world, where money distributions must be right the first time or there are legal costs and ramifications, you must try to account for everything. I think that too many applications are sloppily written because they assume that nothing will fall through the code to the end of the conditions. This, I believe, is why there are so many bugs to fix in operating system code. If you have good design specs, you write complete code, and you perform good testing, you should have a product that is closer to 99/100% than the 80% you are advocating.
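To illustrate what I mean by things falling through the conditions, here is a minimal T-SQL sketch (the distribution codes are invented for the example): every expected case is handled explicitly, and anything unexpected raises an error instead of silently dropping out the bottom.

DECLARE @DistributionType CHAR(1);
SET @DistributionType = 'X';  -- value supplied by the calling application

IF @DistributionType = 'P'
    PRINT 'Process the principal distribution...';
ELSE IF @DistributionType = 'I'
    PRINT 'Process the interest distribution...';
ELSE
    -- Never let an unrecognised case fall through untouched; fail loudly
    -- so the bad input is caught before any money moves.
    RAISERROR('Unhandled distribution type: %s', 16, 1, @DistributionType);

The final ELSE is the part that sloppy code leaves out, and it costs almost nothing to write.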
July 15, 2009 at 9:55 am
I haven't gone through all of the posts that were written in response to this article so what I have to say might have been stated already, in which case, I apologize for my redundancy.
I feel that software should be developed with a consistent vision. MS seems to change their vision of what they want their OS to represent or do as frequently as the seasons. If they maintain an overall vision, they can release incremental fixes that move the OS toward satisfying that vision. From my minimal experience in development, and now as a DBA, I've seen changes implemented that make absolutely no sense and actually move the application away from the initial vision; over a period of time it just becomes an insane mess that is almost impossible to get back on track.
I would be OK with Windows being released with known issues, as long as they are properly documented, there is a near-future plan to resolve them, and there is a long-term scope of where they are going with the OS. Windows has apparently changed its purpose with every release. Sometimes they focus on functionality, other times on aesthetics. But the way these changes are implemented does not complement what came before, so it always swings one way or the other.
July 15, 2009 at 10:02 am
With the clarification that you (Steve) meant functions, not errors, I agree.
But I think 80/20 isn't even applicable in many cases with in-house business software.
I have found great value to the business in building simple applications that do as little as save a few hours' work every month for a few people. If the application takes an hour to build, and saves two people four or five hours per month, it's worth it. Once it's in use, if new features or streamlining are needed and will improve the return on investment, then they get added or modified.
First application I built started out really, really simple. Feature-set was very primitive. Interface was a pain in the butt. Wasn't too stable either. Could barely handle five concurrent users. But it was a HUGE improvement over the prior system, which was post-it notes with customer names and phone numbers on them, and a few scribbles about what they wanted to order, or just verbal requests to the production department with nothing even written down. It evolved over the following seven years into something slick, performant, stable, with an interface the users loved, that basically managed the whole business.
That application didn't even come close to 10% of its needed functionality when it was first released, much less 80%. Was still better than nothing at all.
It's all relative, and it all depends.
If you count DOS as a precursor to Windows XP/Vista/7, then that started with something like 10% of the current functionality. It's now up to something like 200% of what people actually need, and that number is growing.
That's the way I look at it.
- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread
"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon