Gary Varga (5/22/2013)
I do fear that, whilst the 90s moved towards greater software engineering practices, since the popularity of the www that there has been a return to the hack and slash approach to coding. Agile has only made this worse. It is not agile in itself but the inappropriate use of isolated and subverted agile techniques and practices that have been twisted to justify a distinct lack of attention to quality of both practices and output.
I don't think Gary is sufficiently pessimistic.
The rot set in long before he suggests. I'm not sure whether it was as recent as 1973 (C) or as early as 1968 (BCPL) - anyway, somewhere around then the concepts of modularity, error management, high level language, defensive programming, strong typing and abstract types, and all the other good software engineering practices were abandoned in favour of pointer arithmetic and similar low level nonsense, programming for the normal case and to hell with errors, type systems that were almost meaningless, the idea that thorough testing without any formal testing methodology would work and make up for sloppy development, and development methodologies that were largely a series of big bangs, while the whole idea of building computer aided design and development systems for software was abandoned by all but a few.

At the end of the 60s companies like Burroughs and ICL were successfully developing complex mainframe operating systems in high level languages derived from the Algol 68 standardisation process (ICL's VME operating system is still sold by Fujitsu, who took over the company about 30 years later), but they were the only two; by the mid-70s everyone (except those two and Ericsson) was writing all OS software and most applications in C - a very low level language based indirectly on BCPL. Dartmouth's BASIC spawned a slew of offspring, poisoning the programming environment right up to today. The universities continued to do research (we got languages like HOPE, ML, Prolog, Parlog, CCS, CSP) but this was ignored by most of industry (surprisingly, IBM took a liking to one of Oxford's formal verification languages and made fairly extensive use of it for a while, but that was very much the exception).
It took until very recently for either of our modern industry giants (Apple, Microsoft) to offer any language with a serious declarative component (I believe MS's F# is the only one so far), although the various ML dialects, Haskell, Prolog and Parlog are now extensively used for software development by developers outside the IT industry.
The idea that the WWW has led to the rot is wrong - the rot antedates the web by getting on for a quarter of a century. The web may have accelerated it, as the herd of advocates of development non-methods can now communicate more freely than before, or it may have slowed it down, because everyone now has access to knowledge that previously was very hard to come by - actually, I think the effect has been more positive than negative. If you've been allowed to use sensible practices and tools in any sort of software development at any time in the last 45 years, you have been one of the lucky few.