From Great Idea to End Result

  • Applications need to do what they need to do, no more, no less. I'll sum up what I've grasped over recent years:

    a) Understand the market. At my current shop, for example, we do not have high volumes, maybe a couple of million hits a month on our biggest sites, but we do have high complexity. There is no point in wild stress testing, but plenty of point in getting the main calculations right (agri sector).

    b) Don't design for more than you need. The model needs to be extensible, yes, but don't, for example, make everything fully flexible and data-driven unnecessarily (YAGNI).

    c) Plan to refine as you go.

  • I think many people have already covered what I believe: you need to deliver as soon as possible, which, for me, means the minimum viable product.

    Of course, the trick is in defining viable.

    What isn't viable is a poorly put-together system that hasn't been developed with good practices. A hack-and-ship mentality doesn't work.

    Gaz

    -- Stop your grinnin' and drop your linen...they're everywhere!!!

  • It may work in your environment, but it never has in mine. Once you put a product into production, it should work. When it doesn't, users lose faith in the product and in you, and they are less likely to work with you on making the product better.

  • People respond to how they are treated and rewarded. I'd guess that if an IT group responded quickly, but it did not go perfectly, then they would be rewarded with complaints. Thus, the go slow approach. If you want a group to respond quickly, reward them for moving quickly, even if it does not work out perfectly.

    The more you are prepared, the less you need it.

  • I'm in a similar situation at the moment. I put a report together according to the user's requirements and tested it thoroughly before I released it to him. During the UAT phase he got in touch to say that it was returning rows that shouldn't have been there. After we investigated, it turned out that there is a deep-seated system bug: one of our databases contains rows that match the conditions my report is looking for. They are few and far between, but the difficult part is that there is currently no way of distinguishing them from rows that are legitimately returned. The dev team are looking into it, but as it stands the report may occasionally return them.

    I've explained this to the user and asked him how he wants to proceed. As far as I'm concerned, the report is doing what he asked, but it may sometimes return phantom rows that look exactly like real rows. The number of rows involved is very small, and all the rows returned have to be checked manually anyway, but does this mean the report doesn't work?


    On two occasions I have been asked, "Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?" ... I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.
    —Charles Babbage, Passages from the Life of a Philosopher

    How to post a question to get the most help http://www.sqlservercentral.com/articles/Best+Practices/61537

  • The quote in the article was "velocity is more important than perfection". This reminded me of another good quote: "Direction is more important than Velocity".

  • One great quote in there is "velocity is more important than perfection", which is a tenet that I have found to be very true over the years. It's not that you throw junk out that isn't well built or tested, but that you don't try to meet every possible requirement or handle every little issue.

    Velocity is a direction; we all know what North, South, East, and West mean. But perfection is a vague concept that exists only within the mind of one single person. Ask 100 people what a perfect XYZ looks like, and you'll get 100 different answers. You can document one version (or one amalgam) of perfection on paper, and it then becomes a direction, but as we all know from experience, perfection is subject to change from one day to the next. That's why IT departments spend a lot of time spinning their wheels.

    Don't ask us to build a perfect e-Commerce website; just hand us a set of blueprints.

    "Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho

  • Interesting. And amusing. Lots of people screaming something like "Agile will never work, it's about getting it out of the door regardless of quality", others claiming it's this wonderful "new" system, and maybe a few (only a few) looking at what really matters. And I don't think Steve ever mentioned "Agile".

    Well, I think I was using "Agile" at the end of 1975, when of course it was called "common sense", not "Agile". I had been persuaded to take over and rescue a project that had been repeatedly failing to deliver for a couple of years (and had destroyed the careers of five or six project managers).

    I sorted the short-term commitments using just ordinary management, not even common sense: beating up accountants to get the team paid for overtime and shift allowances, and taking on the bug report backlog myself so the team didn't have to look at it (that was the "gain trust" step that's essential when taking over a demoralised team; fortunately I could discover all the bugs despite never having worked with the hardware or the programming language before).

    But the medium-term (about 5 months) commitments were a bit more difficult: they obviously couldn't be achieved. So I talked to the people who had imposed these commitments; hardly any were real customer requirements in the committed timescale, so I simply informed senior management that we would be achieving only some of them (specifying what was and wasn't going to be achieved) and left them to howl at each other. That's what I understand the "Agile" approach to be (of course, if management supports "Agile", there's no howling). That left me enough resource to add a vast reliability improvement (a 50-fold reduction in the frequency of OS crashes caused by my team's OS components) to the medium-term delivery.

    And that's the point of "Agile": you don't implement pointless junk, and you use the saved resource to improve reliability and general quality. But I don't like calling it "Agile", because I learnt this approach to development and release in the early 70s, decades before some people decided to call it "Agile", pretend it was new, and claim credit for something to which they'd contributed nothing but a new buzzword.

    Tom
