Will Coding Be Less Important?

  • Back in the early 1960s, someone probably wrote an article stating that in the near future coding would be less important. By "coding" they meant machine-language instructions punched into cardboard cards.

    But whatever happened to Microsoft English Query?

    "Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho

  • As long as having even the simplest exchange with Alexa or 'chatbots' is an exercise in frustration and confusion, I am not too worried about AI taking the coding jobs.

  • Matt Miller (4) - Monday, June 4, 2018 8:25 AM

    Sergiy - Sunday, June 3, 2018 11:32 PM

    "eventually we'll tell systems what we want and they'll build algorithms"Does not quite work even with actual people. :-)In fact, they build algorithms, but usually it's not exactly what you expected when forming a request.

    Any language formal enough to describe the algorithms you want in a way that lets an AI build them accurately WOULD be code, just a newer, higher-level programming language. UML, OCL, and a number of the variants that the Open Group has been working on try to bridge the gap between natural language and/or "normal" diagramming and workable code. And there will always be coders needed to build the algorithm-builders (or the algorithm-builder-builders, etc.).

    Needless to say, there is still a LOT to do before we can do away with all code, and there is no real desire to do so. Make it easier and quicker to get workable code, absolutely; but that's very different from doing away with code.

    That doesn't mean an AI couldn't do it, but it would understand enough about the business to replace you long before it could understand what you're asking for.

  • I guess back in the 1950s folks in the restaurant industry initially felt threatened by the concept of frozen dinners, and then in the 1990s the media said that self-serve online trading would be the end of financial advisers. Of course, all those concerns were silly in retrospect, especially the one about folks not needing financial advisers.

    Voice-activated bots are actually not the most effective user interface for self-serve programming. Perhaps the best examples of this concept actually working are tools like Power BI, where the user connects pre-built data sets and widgets. But even then, there is a significant amount of engineering involved behind the scenes in prepping the data. The usefulness of the tool is only as good as the data and the database platform behind it.

    "Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho

  • Thinking about AI and user interfaces reminded me of something I saw at the county fair as a kid. Back in the 1970s, an engineer created a circuit board that could beat humans more often than not at Tic-Tac-Toe. The user interface was a chicken, but as you can see by reading the article, the chicken component is actually not necessary.

    How I Made A Chicken Play Tic-Tac-Toe - A True Story
    https://steemit.com/life/@creatr/how-i-made-a-chicken-play-tic-tac-toe-a-true-story

    "Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho

  • Hmph. Automating the coding is probably a lot easier than getting the requirements out of the user. I can't code ambiguous directions to the computer, and I suspect an AI can't either.

  • Eric M Russell - Tuesday, June 5, 2018 6:39 AM

    Voice-activated bots are actually not the most effective user interface for self-serve programming

    I bought a secondhand car and didn't realize it had voice activation until I got cut up one night and cast aspersions on the other driver's parentage. Got the shock of my life when a voice said, "Calling Linda's Dad"!

  • Back in the '80s I was very much into the "AI" languages of the day: LISP, PROLOG, and SCHEME. I even became very proficient in LISP thanks to XLISP and AutoCAD.
    In the years since then, there has been plenty of speculation, but none of it has come to pass.
    I think we still have time before computers take away programmers' jobs, but I do hope we can use them for testing, in a robust and thoroughly documented way.
    After all, most of the time when bugs are brought to the light of day, we have a hard time recreating them.

  • HighPlainsDBA - Tuesday, June 5, 2018 12:43 PM

    Hmph. Automating the coding is probably a lot easier than getting the requirements out of the user. I can't code ambiguous directions to the computer, and I suspect an AI can't either.

    I agree.

  • A lot depends on what is meant by "coding". In the early days it meant working directly in machine code: remembering the hex or octal operation codes, and being unable to use labels or names for addresses. The switch to using assemblers and microassemblers had abolished coding in that sense by the 1950s (microassemblers were introduced for control-store computers without conditional operations in the late 1940s, and in the 1950s for full microcode systems). The meaning then changed to mean using an assembler or microassembler that represented the machine code or the microcode directly but provided names for operations and registers and allowed the creation of labels for store addresses.

    That kind of coding is now pretty rare outside of using microassemblers in hardware manufacturing, and as an easily read intermediate notation for the machine code generated by a compiler. So coding now generally means writing software in a language which is either compiled into machine code or converted into some intermediate form for which there is an interpreter.

    In the 1950s, the compilable and interpretable languages were pretty primitive - the initial versions of IPL, Fortran, Lisp, Algol, and Cobol were proposed in 1956, 1957, 1958, 1958, and 1959 respectively - and coding was still mostly in machine code until some point in the '60s. Things have changed vastly since then, with declarative languages like Prolog, Hope, ML, and Haskell; several process-oriented languages based on CSP or CCS or some mix of the two; languages like Python and C# and JavaScript; abysmal messes like C++ and Java; semi-declarative languages like SQL (maybe some version of SQL is declarative, but if there is such a version I don't believe anyone has ever implemented it); and quite a lot of languages specialised for use in particular fields. And sometimes a language was developed to do one specific job (and some of those languages were never reused for anything else).

    So coding now means a whole range of different things, and we need far more people to do it than ever before. And it is rather obvious that coding will continue to be required, because we will have more and more uses for computing.

    As others have commented, the first essential part of coding is getting a clear and unambiguous description of what is required - or, perhaps, just a description that is clear enough, and contains little enough ambiguity, that an application conforming to that less-than-ideal description will not be completely useless. Sadly, the people whose requirements have to be determined usually are not coders, so they have no unambiguous language they can use to produce that description. Perhaps we will eventually find some very high-level and very easy-to-learn coding language which the owners of the requirements can handle, and they will then hand it over to a compiler or interpreter that generates the required machine code to do the job; or perhaps not. But even if we do invent such a language and the tools to allow it to be used, that will simply mean that coding continues to be done, and done in that language.

    So the Quartz essay has got it completely wrong.  The other paper Steve referenced is much more realistic.

    Tom

  • Steve Jones - SSC Editor - Friday, August 17, 2018 11:14 AM

    People are way too stuck in the mentality that coding means typing things into a keyboard, and also the weird mentality that "hey, if we have a GUI, anyone can do it". At the end of the day, whether it's a GUI or pure text, all you're doing is making a computer do something; some people will be good at it and some won't. It's like claiming that the transition from letters to phones means people no longer communicate.

  • No-code platforms are basically interpreters from human language to machine language.

    To see how effective those things are, copy this post into Google Translate and do a double translation: say, to Russian or Bosnian, and back to English.

    Would you be satisfied with such a computer interpretation of your commands?

    _____________
    Code for TallyGenerator

  • I don't think it is going to be any time soon, but in the future it is going to be huge, of that I am sure.

  • So coding is going to stop happening because people will use things such as spreadsheets.
    Presumably writing Excel macros doesn't count as coding.

    Tom
