It Happens

  • I agree with Simon's sentiments; however, I have rarely been given the support from management (especially time), or colleagues with the calibre, training and experience in the relevant skills (myself included at times), needed to deliver 100% quality software. Even when everything is going 100%, there has always been something forced in under pressure from those who have the power to say JFDI (Just Do It - exact phrase used twice, approximations almost every other time). It may not be right from a software development perspective, but sometimes it is the right commercial decision.

    Simon appears to be delivering products, which are very different in nature from project work - technically the same, but commercially different. I say kudos to Simon, his colleagues and his management team.

    Gaz

    -- Stop your grinnin' and drop your linen...they're everywhere!!!

  • Kumbaya, all of our software is perfect! 😀

    Simon,

    Sounds like you are in a company that actually encourages software quality and probably has the processes and personnel to carry out that mission. Good for you.

    Unfortunately many people in this industry don't work for such establishments. A good developer would not release software with known defects, but in practice, "good enough" is the enemy of "perfect" and the developer doesn't always get to decide priorities. How many shops would pass the "Joel Test"? How many non-programmers in management are making technical decisions that affect code quality? (Even our so-called industry leaders, Microsoft, Oracle and Google, still produce software that has embarrassing bugs and oversights, not to mention security holes.)

  • paul.knibbs (9/25/2013)


    I always like Maurice Wilkes' quote about this:

    "I can remember the exact instant when I realized that a large part of my life from then on was going to be spent in finding mistakes in my own programs."

    That was back in the late 40s, I believe, so this has been going on for a while! It doesn't help that SQL, as a language, isn't really well designed for debugging--it's one of the few languages I've used where I find it faster to just rewrite a line of code that isn't working rather than try to figure out why it isn't.

    The "exact instant" referred to was some time in 1948 or 1949. He never did say exactly when, as far as I could discover, but he did say it was "as soon as [Wilkes and his research team] started programming" EDSAC; that would have been before the machine was declared completed and operational in May 1949, but more than a year after design work on the project started late in 1946. The text you quoted is Peter van der Linden's 1994 misquotation of Wilkes's 1985 memoirs.

    The Cambridge Math Lab fortnightly colloquia which he established were still going strong when I worked for EE (1967-69), and I attended quite often. It was a long journey from my base at the Nelson Research Labs, so I couldn't get to all of them, but I did meet Wilkes a couple of times and found he was even more impressive in person than in reputation.

    Tom

  • chrisn-585491 (9/26/2013)


    A good developer would not release software with known defects, but in practice, "good enough" is the enemy of "perfect" and the developer doesn't always get to decide priorities.

    Chris - as we said in the '60s, "Right On, Brother!"

    We do not get to make the call on scope, correctness, or completeness. And often production developers in the commercial environment are embarrassed when management starts trying to find who is at fault for the failures of the product, and marketing says "We can only sell what they give us. Don't look at us, it was them!"

    Not all gray hairs are Dinosaurs!

  • simon.crick (9/25/2013)


    "embrace the idea that code can be wrong" -- Nathan Marz

    Surely this is the wrong approach, as it just gives people an excuse to be lazy.

    Code should NEVER be wrong.

    It is not difficult to write code that works correctly.

    You just need to follow two simple rules:

    1) Be crystal clear about the valid set of inputs, and throw an appropriate exception if the input is not in the valid set.

    2) If you are not sure whether the logic is correct, break your code into smaller functions/procedures until it is 100% clear that it will work correctly in all cases.

    Bugs only arise when people do not follow these simple rules.

    Simon

    That statement is hopelessly optimistic, in my view. Another essential rule is to always write defensive code - yes, your point 1 is a part of that, but only a tiny part; we all know that complex operating systems, compilers, and DBMSs contain bugs, and it is necessary to recognise that, to recognise also that we don't know what all those bugs are, and to write code that tries to work despite such bugs: do proper error detection, error containment, and error reporting - the whole error management thing - not just validation of inputs with exception throwing when they are invalid (a minimal sketch contrasting the two follows at the end of this post). But the cardinal rule of all, far more important than anything you have said, is first to know what the requirement is - and often that is in fact the most difficult thing to do.

    Even if one tries to follow all the rules (and there are still more, to do with avoidance of complexity, economy of invention, aesthetics, and resource consumption, which ought to be stated as part of the requirement but probably won't be, because the requirement is almost always incomplete, which is another source of problems), the complexity of the task may be such that human limitations are likely to lead to errors. Another function of defensive programming is to detect and contain these as far as possible, but "as far as possible" is often not going to be as far as perfection, and that means we must accept that our software will contain bugs instead of claiming that we are so perfect that we never make a mistake.

    Tom
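
    A minimal C++ sketch of the contrast discussed above - Simon's rule of rejecting invalid input versus the broader error detection, containment and reporting Tom describes. The function names and the commission calculation are purely hypothetical, chosen only to illustrate the shape of the two approaches:

    ```cpp
    #include <cstdio>
    #include <stdexcept>

    // Simon's rule 1: be explicit about the valid input set and throw otherwise.
    double monthly_commission(double sale_value, double rate)
    {
        if (sale_value < 0.0 || rate < 0.0 || rate > 1.0)
            throw std::invalid_argument("sale_value or rate out of range");
        return sale_value * rate;
    }

    // Defensive error management: the caller assumes things can still go wrong,
    // contains the failure, and reports it instead of letting the whole run die.
    bool report_commission(double sale_value, double rate)
    {
        try {
            std::printf("commission: %.2f\n", monthly_commission(sale_value, rate));
            return true;
        }
        catch (const std::invalid_argument& e) {
            std::fprintf(stderr, "rejected input: %s\n", e.what());     // expected failure
        }
        catch (const std::exception& e) {
            std::fprintf(stderr, "unexpected failure: %s\n", e.what()); // contained and reported
        }
        return false;
    }

    int main()
    {
        report_commission(1000.0, 0.05); // valid input
        report_commission(1000.0, 5.0);  // invalid rate: rejected, the program carries on
    }
    ```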

  • Dave62 (9/25/2013)


    I wrote my first program on punch cards in the late 1970's.

    Really? That's unusual. I never wrote on punch cards or on paper tape; I used to punch holes in them instead or, sometimes, have someone else do that for me after I had written some program on paper. 😀

    (Sorry, couldn't resist that)

    I don't think the view that our programs will have bugs or may not work entirely as intended is too gloomy. I don't think that's an excuse to be lazy either. I just think it shows the maturity that's required to realize we're not perfect.

    It's delusional to arrogantly deploy a solution and then walk away thinking it's perfect and will never need to be touched again. :crazy:

    As soon as we have perfect humans we can then start expecting perfect programs. It may be a long wait... :hehe:

    All that makes perfect sense. Well said.

    Tom

  • L' Eomot Inversé (9/26/2013)


    Really? That's unusual. I never wrote on punch cards or on paper tape; I used to punch holes in them instead or, sometimes, have someone else do that for me after I had written some program on paper. 😀

    Tom,

    You have made me think of those old green-and-white coding forms we used for Assembler and RPG back in the day. I remember desk-checking a pile of those sheets, trying to determine the correctness of the process and flow before I had a deck of cards punched. My, how times have changed!

    :Wow:

    Not all gray hairs are Dinosaurs!

  • Gary Varga (9/25/2013)


    simon.crick (9/25/2013)


    Steve Jones - SSC Editor (9/25/2013)


    simon.crick (9/25/2013)


    I was lucky. I had teachers who drummed this into me from an early age, and I have written some fairly complex systems that have not needed a single bug fix or enhancement in 20+ years.

    I can't argue with that. I hope you're well compensated as you must be one of the best developers around. I've never met anyone who could make this claim.

    I sense a degree of disbelief, and I can't blame you given the culture of not caring about the odd bug here and there that seems to have invaded the computing industry, but it is 100% true. The first system I worked on was a client database and commission tracking system for a financial advisor. I finished it in the early 1990s, and it is still in use today, and it hasn't been touched for over 20 years.

    Simon

    I struggle with "fairly complex systems" and "not needed a single [...] enhancement". No changing requirements? Are you sure? Where in this world is a place where neither financial regulation nor personal data regulation has changed in 20+ years???

    Surely nowhere in North America, Scandinavia, the European Union, Australia, New Zealand, or South Africa. Unless Mr Crick did that work in a very primitive country, he is evidently equipped with a crystal ball which allowed him, in the early '90s, to anticipate and program to comply with all the regulatory changes that were going to occur in the next 20 years.

    Tom

  • L' Eomot Inversé (9/26/2013)


    Dave62 (9/25/2013)


    I wrote my first program on punch cards in the late 1970's.

    Really? That's unusual. I never wrote on punch cards or on paper tape; I used to punch holes in them instead or, sometimes, have someone else do that for me after I had written some program on paper. 😀

    (Sorry, couldn't resist that) ...

    That's a good one, Tom! 😀

    I guess technically we still don't "write" programs even today. We just type them into an IDE or editor of some kind. I haven't found a compiler yet that can make heads or tails of my chicken scratch.

  • L' Eomot Inversé (9/26/2013)


    Gary Varga (9/25/2013)


    simon.crick (9/25/2013)


    Steve Jones - SSC Editor (9/25/2013)


    simon.crick (9/25/2013)


    I was lucky. I had teachers who drummed this into me from an early age, and I have written some fairly complex systems that have not needed a single bug fix or enhancement in 20+ years.

    I can't argue with that. I hope you're well compensated as you must be one of the best developers around. I've never met anyone who could make this claim.

    I sense a degree of disbelief, and I can't blame you given the culture of not caring about the odd bug here and there that seems to have invaded the computing industry, but it is 100% true. The first system I worked on was a client database and commission tracking system for a financial advisor. I finished it in the early 1990s, and it is still in use today, and it hasn't been touched for over 20 years.

    Simon

    I struggle with "fairly complex systems" and "not needed a single [...] enhancement". No changing requirements? Are you sure? Where in this world is a place where neither financial regulation nor personal data regulation has changed in 20+ years???

    Surely nowhere in North America, Scandinavia, the European Union, Australia, New Zealand, or South Africa. Unless Mr Crick did that work in a very primitive country, he is evidently equipped with a crystal ball which allowed him, in the early '90s, to anticipate and program to comply with all the regulatory changes that were going to occur in the next 20 years.

    It was in the UK. The key to making it future-proof was NOT to try to predict the future, i.e. NOT to make any assumptions, but instead to identify the fundamentals that would never change.

    It's a good strategy that I try to apply to all my software.

    I thoroughly recommend it. 🙂

    Simon

  • simon.crick (9/25/2013)


    I believe technology also has a role to play in the prevalence of bugs in modern software.

    I certainly agree. Many bugs are caused by appallingly ill-conceived technology. The C++ language is perhaps the best example of this. C# has of course inherited all C++'s idiocies, and although it is possible to work in a mode that leaves the worst of the C++ rubbish out, people have got used to using that rubbish and are too lazy to learn to avoid it. The rot started with the spread of C, but C's fault was that people treated it as a high-level language rather than realising that it was an extremely low-level language.

    I have no hard evidence to back this up, but I would be prepared to bet quite a lot of money that modern languages like C# with lots of "advanced" programming concepts suffer from way more bugs than earlier languages like C++ that are more minimalistic.

    Advanced programming concepts are fine in principle, but they can lead to code with lots of unnecessary layers and levels of abstraction that can obscure the underlying logic, resulting in misunderstandings that would not have occurred if simpler programming concepts had been used.

    The idea that abstraction is a bad idea is just plain crazy. Abstraction is one of the main tools for reducing a problem into multiple smaller parts. Simple advanced programming concepts like reduction, function mapping, and unification are going to cause far less damage than hideous primitive complexity like C++'s friend concept (a small mapping/reduction sketch follows at the end of this post).

    I firmly believe that computer science will eventually realize this and there will eventually be a return to simpler and more minimalistic languages and technologies.

    I think computer science is way ahead of you.

    But this won't happen until computer scientists get over their inferiority complex (controversial, I know, but a well-recognised phenomenon) and start thinking about how they can help us build more reliable systems rather than how they can impress their physics/engineering/mathematics colleagues with difficult-to-understand research papers.

    My view is that software with fewer bugs will be developed only when unscientific developers stop turning their noses up at science and actually take the trouble to learn some. You'll find that software written in languages like Haskell, ML, Erlang and SCCS, and verified using tools like the Scott calculus or Z, has far lower bug rates than software written in C++ or Visual Basic. You'll also discover that developers who have been taught by computer scientists to consider error management as an essential part of every program manage to write programs that cope well with environmental problems (OS and compiler bugs, hardware failures, incorrect input, etcetera), while those that haven't will at best have a try at dealing with invalid input.

    Tom
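
    As a hedged illustration of the "simple advanced concepts" Tom mentions, here is a minimal C++ sketch of function mapping and reduction using the standard library; the price data and VAT rate are made up purely for the example:

    ```cpp
    #include <algorithm>
    #include <iostream>
    #include <numeric>
    #include <vector>

    int main()
    {
        std::vector<double> prices = {9.99, 24.50, 3.75};

        // Function mapping: apply one transformation uniformly to every element.
        std::vector<double> with_vat(prices.size());
        std::transform(prices.begin(), prices.end(), with_vat.begin(),
                       [](double p) { return p * 1.20; }); // illustrative 20% VAT

        // Reduction: fold the whole sequence down to a single value.
        double total = std::accumulate(with_vat.begin(), with_vat.end(), 0.0);

        std::cout << "total inc. VAT: " << total << '\n';
    }
    ```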

  • One quick note:

    I don't see the idea of accepting we'll have bugs in our software as implying that we allow those bugs to live or that we don't work to eliminate them. It's not an excuse to write shoddy code.

    It's a reality and a challenge to work to improve our skills and software.

  • L' Eomot Inversé (9/26/2013)


    The idea that abstraction is a bad idea is just plain crazy. Abstraction is one of the main tools for reducing a problem into multiple smaller parts. Simple advanced programming concepts like reduction, function mapping, and unification are going to cause far less damage than hideous primitive complexity like C++'s friend concept.

    Abstraction is good when used appropriately, but all too often I find myself debugging a ridiculously simple piece of logic that could have been written in 10 lines of code that someone has somehow turned into 1000 lines of services, factories, handlers, delegates, extensions, overrides and god knows what else. I trace it through the 17 layers of junk only to find that it has done "x = a + b" instead of "x = a * b" (a sketch of this kind of layering follows at the end of this post).

    Such elementary bugs would never occur if the logic was not obscured by vast amounts of unnecessary abstraction.

    This is not directly the fault of the programming language, as programmers do not have to use these techniques, but the fact that so much emphasis is placed on learning "advanced" programming concepts seems to make a lot of new programmers think they should be using the advanced concepts all the time.

    Personally, I would like to see more computer scientists working on ways to prevent unnecessary complexity, e.g. automatic compiler warnings when the amount of wrapper code exceeds some multiple of worker code.

    Simon
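
    A hedged C++ sketch of the kind of layering Simon describes; the interface, factory and "service" names are hypothetical, and the calculation is deliberately trivial so the buried bug is easy to spot once the layers are stripped away:

    ```cpp
    #include <functional>
    #include <iostream>
    #include <memory>

    // Direct version: the intent (and any bug) is visible at a glance.
    double area(double width, double height) { return width * height; }

    // Over-abstracted version: an interface, a factory and a delegating "service"
    // wrapped around the same one-line calculation. The bug (+ instead of *) is
    // now buried several layers down.
    struct ICalculator {
        virtual double calc(double a, double b) const = 0;
        virtual ~ICalculator() = default;
    };

    struct AreaCalculator : ICalculator {
        double calc(double a, double b) const override { return a + b; } // the buried bug
    };

    struct CalculatorFactory {
        static std::unique_ptr<ICalculator> create() { return std::make_unique<AreaCalculator>(); }
    };

    struct AreaService {
        std::function<double(double, double)> op =
            [](double a, double b) { return CalculatorFactory::create()->calc(a, b); };
        double area(double w, double h) const { return op(w, h); }
    };

    int main()
    {
        std::cout << area(3, 4) << '\n';               // 12 - obviously right
        std::cout << AreaService{}.area(3, 4) << '\n'; // 7  - wrong, and much harder to see why
    }
    ```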

  • simon.crick (9/27/2013)


    L' Eomot Inversé (9/26/2013)


    The idea that abstraction is a bad idea is just plain crazy. Abstraction is one of the main tools for reducing a problem into multiple smaller parts. Simple advanced programming concepts like reduction, function mapping, and unification are going to cause far less damage than hideous primitive complexity like C++'s friend concept.

    Abstraction is good when used appropriately, but all too often I find myself debugging a ridiculously simple piece of logic that could have been written in 10 lines of code that someone has somehow turned into 1000 lines of services, factories, handlers, delegates, extensions, overrides and god knows what else. I trace it through the 17 layers of junk only to find that it has done "x = a + b" instead of "x = a * b".

    I couldn't agree more. In fact, one of the things I really dislike about C++ is the way that sort of idiocy is encouraged in the C++ world. I've no particular objection to template libraries, factories, post offices, services, and so on, but I have an intense dislike of their misuse, which appears to be much more common than sensible use of them; and certainly overuse implies misuse.

    Such elementary bugs would never occur if the logic was not obscured by vast amounts of unnecessary abstraction.

    I don't regard that sort of misuse of tools as abstraction; I call it stupidity. The purpose of abstraction is to simplify, to reduce complexity, to make understanding easier. For example, using STL iterators to loop over the numbers from 1 to 7 instead of a simple C-style loop is not abstraction, it's obfuscation, because it makes the code harder to understand, not easier (a sketch of this contrast follows at the end of this post). Using Prolog for a task which requires unification, instead of reinventing the wheel in C++, is abstraction - because implementing unification in C++ is probably going to be a nightmare, whereas in Prolog unification happens behind the scenes, so you've abstracted away all the detail of its implementation (that's why we have logic programming languages - because they can do serious unification and constraint resolution without asking us to implement it in a language not designed for it). Using Prolog for a task that doesn't require unification is not abstraction, it's just idiocy.

    This is not directly the fault of the programming language, as programmers do not have to use these techniques, but the fact that so much emphasis is placed on learning "advanced" programming concepts seems to make a lot of new programmers think they should be using the advanced concepts all the time.

    Yes, that's often true. But I can't understand anyone who thinks that. Surely anyone who has been taught to program anything in whatever language (except perhaps C++ or Java) knows that you pick the tools and off-the-shelf components that will give you the easiest design, programming, testing, integration, validation, rollout and support task? If they haven't been made to understand that, they surely shouldn't be allowed to do anything except under very close supervision until they have learnt to understand it, because letting a developer ignore the impact of what he does on any part of that chain is giving him carte blanche to write unusable rubbish. Anyway, in some cases it is the fault of the programming language (or the development system, or the available components/libraries for that language). For example, you probably can't write anything useful in Java without using a lot of stuff from run-time libraries, and in the case of Java it seems to be the norm for those libraries to wrap things up in such a way that it is sometimes well-nigh impossible to understand what they are supposed to be an abstraction of (and some, but not all, of the C++ template libraries are as bad or even worse, or were last time I looked, which admittedly is a while back).

    Personally, I would like to see more computer scientists working on ways to prevent unnecessary complexity, e.g. automatic compiler warnings when the amount of wrapper code exceeds some multiple of worker code.

    Unfortunately it's not the computer scientists who provide the development tools; it's companies like IBM and Microsoft and Oracle and Apple and so on. Abstraction is where I don't need to know the exact details of something my program does, because my development tool kit or a component I get off the shelf deals with that for me; a wrapper sometimes is an abstraction and sometimes isn't. A wrapper can indeed be an abstraction (SQL Agent is a wrapper that abstracts the handling of scheduling for me, so that I don't have to worry about it, and by its job-step mechanism abstracts some mixed-language programming issues for me, because I can have a job with steps using a dozen different subsystems which give me several different languages - though I've never actually written a job that used more than 3). A wrapper can also be nothing but obfuscation, something that's used because the developer thinks it's a cute thing to do.

    Tom
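
    A small C++ sketch of the iterator example above - the same seven numbers printed twice, once through an unnecessary container-and-algorithm pipeline and once with the plain loop Tom prefers:

    ```cpp
    #include <algorithm>
    #include <iostream>
    #include <numeric>
    #include <vector>

    int main()
    {
        // "Abstraction" as obfuscation: building a container and an algorithm
        // pipeline just to visit the numbers 1 to 7.
        std::vector<int> numbers(7);
        std::iota(numbers.begin(), numbers.end(), 1);
        std::for_each(numbers.begin(), numbers.end(),
                      [](int i) { std::cout << i << ' '; });
        std::cout << '\n';

        // The simple C-style loop says the same thing more clearly.
        for (int i = 1; i <= 7; ++i)
            std::cout << i << ' ';
        std::cout << '\n';
    }
    ```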

  • A great example of complication for the sake of it in .NET is the overuse of LINQ.

    Gaz

    -- Stop your grinnin' and drop your linen...they're everywhere!!!
