Will Coding Be Less Important?

  • Comments posted to this topic are about the item Will Coding Be Less Important?

  • I guess my question would be... why do people think that AI is going to do well? Someone has to write the AI code and, according to what I've seen in a whole lot of code, we'll be in deep trouble with AI. Even the automotive industry, which used to be known for "no fail" implementations, has started to fail, especially when it comes to software.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)
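
    For anyone new to the term, here is a minimal T-SQL sketch of the contrast the signature describes; dbo.Sales and its columns are hypothetical, for illustration only:

        -- RBAR: touch one row at a time, agonizingly.
        DECLARE @SaleID INT, @Amount MONEY;
        DECLARE c CURSOR LOCAL FAST_FORWARD FOR
            SELECT SaleID, Amount FROM dbo.Sales;
        OPEN c;
        FETCH NEXT FROM c INTO @SaleID, @Amount;
        WHILE @@FETCH_STATUS = 0
        BEGIN
            UPDATE dbo.Sales SET Tax = @Amount * 0.08 WHERE SaleID = @SaleID;
            FETCH NEXT FROM c INTO @SaleID, @Amount;
        END;
        CLOSE c;
        DEALLOCATE c;

        -- Set based: one statement against the whole COLUMN.
        UPDATE dbo.Sales SET Tax = Amount * 0.08;

    On a table of any real size, the single set-based UPDATE will typically outrun the cursor loop by orders of magnitude.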

  • "eventually we'll tell systems what we want and they'll build algorithms"

    Does not quite work even with actual people. 🙂

    In fact, they build algorithms, but usually it's not exactly what you expected when forming a request.

    _____________
    Code for TallyGenerator

  • 15 years ago I specialised in content management systems. My colleagues and I found ourselves wanting to program at a lower level, and we effectively reverse engineered the way the CMS interacted with the database in order to get the desired result.
    When we gained more experience, we found that the CMS supported what we wanted to do natively. Those man-weeks we had spent coding, testing and integrating could have been achieved in hours if we had known more about the CMS features. There was also an aspect of trusting the CMS framework.

    As developers we were too quick to dive down to the low level. This is a pattern I have seen in many applications: diving down into the code/script facility of ETL tools, or bypassing the features of an ORM that actually turn it into a safe, productive tool (usually the letter M). In the SQL Server world, there is the habit of inappropriately writing SQL to force a particular execution plan, thus blocking any benefits that come with upgrades to the query engine.
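
    To make that last point concrete, here is a hypothetical example of the pattern - every table, index and hint below is invented for illustration. With the plan pinned like this, the optimizer can no longer pick a better join order, join type or index when statistics, hardware or the engine itself improve:

        SELECT o.OrderID, c.CustomerName
        FROM dbo.Orders AS o WITH (INDEX (IX_Orders_CustomerID))
        INNER LOOP JOIN dbo.Customers AS c
            ON c.CustomerID = o.CustomerID
        OPTION (FORCE ORDER, MAXDOP 1);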

    The level of programming required for standard business computing is far more accessible now than it ever was before. Things that are simple drag-and-drop with simple method calls today used to require serious heavyweight programming and an uphill struggle with MFC.

  • I agree with the tone of the earlier comments. My response would be "coding will morph into something else", or something that is called something else. Two of the really difficult things are: deciding what you want to represent - "the Model" - and deciding and then expressing what you actually want to do with it. Both of those remain hard.

    Does anyone else remember "The Last One"?  https://en.wikipedia.org/wiki/The_Last_One_(software) 🙂
    This was supposed to be "the last program you would ever need to write" - in 1981! Substitute your favourite 4GL or other program-generator technology for "The Last One".

    Past experience in IT and other industries indicates that it takes a long time for something like "coding" to go away completely. The places where such activities go away first are where the product is standard and process repetitive - we see that again and again in IT. The places where these activities remain are where fundamental decisions are needed and where we are pushing close to the limits of something.

    Tom Gillies | LinkedIn Profile | www.DuhallowGreyGeek.com

  • David.Poole - Monday, June 4, 2018 2:19 AM

    Those man-weeks we had spent coding, testing and integrating could have been achieved in hours if we had known more about the CMS features. There was also an aspect of trusting the CMS framework.

    This is why I say that little play on words of "Just because you can do something in SQL, doesn't mean you shouldn't". SQL Server is a remarkable tool that a lot of people try to avoid from the front end, or brute force things with PowerShell, etc., and then wonder why their code is slow and difficult to maintain.

    In the SQL Server world, there is the habit of inappropriately writing SQL to force a particular execution plan, thus blocking any benefits that come with upgrades to the query engine.


    Worse than that, I find that a lot of people write SQL that blocks any benefits that come with the product as it currently is.
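
    A classic hypothetical illustration (dbo.Orders and the dates are invented): wrapping the column in a function defeats index seeks on any version of the engine, while the sargable rewrite leaves the optimizer free to do its job.

        -- Non-SARGable: the function around the column blocks an index seek.
        SELECT OrderID
        FROM dbo.Orders
        WHERE YEAR(OrderDate) = 2018;

        -- SARGable rewrite: a half-open date range allows a seek.
        SELECT OrderID
        FROM dbo.Orders
        WHERE OrderDate >= '20180101'
          AND OrderDate <  '20190101';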

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Every compiler is a code generator.

    A generator of Machine code - aka Assembler.

    Telling a machine what to do and letting it find a way to execute the request - that's not AI.

    It's rather "SELECT * FROM Customer".

    Maybe some of you have heard of that declarative language.

    It's been around for quite a while.

    AI would be for a machine to figure out the query, to form the task by itself.

    To do what developers/BAs/PMs/CEOs, etc. do.

    But if humans made a machine smart enough to figure out targets and purposes, then there is no guarantee it will pursue humans' purposes rather than its own.

    At the end of the day, AI is about making a machine operate as a human does.

    If successful - it will make another set of human-like minds.

    Trying to make them follow our orders - well, there is a name for that: slavery.

    We've been there as a society, and moved out of there.

    Replacing a developer using an organic bio-computer (aka brain) with another one using a synthetic one (probably bio one as well) won't change a lot.

    Don't you think?

    _____________
    Code for TallyGenerator

  • Jeff Moden - Sunday, June 3, 2018 7:54 AM

    I guess my question would be... why do people think that AI is going to do well? Someone has to write the AI code and, according to what I've seen in a whole lot of code, we'll be in deep trouble with AI. Even the automotive industry, which used to be known for "no fail" implementations, has started to fail, especially when it comes to software.

    They think it's going to do well simply because it's learning, trial by fire. The exact issues we have with not knowing how to communicate something are exactly what AI is going to help solve. It's going to spend every second of its life trying to figure out, from user input, the best way to communicate. The more user input, the better.

    Does that mean it won't fail? Humans fail, and so will machines. It's going to fail at the implementation level and it's going to fail at the learning level. We are aiming for the learning level more because, like humans, the more we fail, the more we learn. Machines will be no different in that regard. We can only hope that with each failure it gets smarter, and that the AI engine uses that information to make the smarter and better decisions a human would make.

    However, this is a long time coming and we have yet to really scratch the surface of AI. This is largely where the marketing hype is confusing people. We still have a long time before we really start reproducing the human brain. Even if that were successfully accomplished tomorrow, we would still have a long way to go before even a small percentage of it was implemented widely. There are simply too many applications for it, and adoption of new tech is often extremely slow.

  • The idea of machine-generated coding very much reminds me of 4th generation languages being touted as the answer to everything back in the late 80s/early 90s. They work for simple tasks that don't require extensive customization for a business, but the idea of bypassing a trained coder didn't always work well.

    AI has a good deal of potential and is a fascinating endeavor, but even with recent advances there seems to be a long way to go that its influencers don't easily admit. As to AI replicating human intelligence, two good reads on the subject are The Emperor's New Mind and Shadows of the Mind, both by Roger Penrose. Penrose argues that we are very far from achieving an AI that comes close to the complexity of the human mind. With that said, AI could be a very good tool now and in the future, but there could be a good deal of catastrophe that comes along with it.

  • xsevensinzx - Monday, June 4, 2018 6:01 AM

    Jeff Moden - Sunday, June 3, 2018 7:54 AM

    I guess my question would be... why do people think that AI is going to do well? Someone has to write the AI code and, according to what I've seen in a whole lot of code, we'll be in deep trouble with AI. Even the automotive industry, which used to be known for "no fail" implementations, has started to fail, especially when it comes to software.

    They think it's going to do well simply because it's learning, trial by fire. The exact issues we have with not knowing how to communicate something are exactly what AI is going to help solve. It's going to spend every second of its life trying to figure out, from user input, the best way to communicate. The more user input, the better.

    Does that mean it won't fail? Humans fail, and so will machines. It's going to fail at the implementation level and it's going to fail at the learning level. We are aiming for the learning level more because, like humans, the more we fail, the more we learn. Machines will be no different in that regard. We can only hope that with each failure it gets smarter, and that the AI engine uses that information to make the smarter and better decisions a human would make.

    However, this is a long time coming and we have yet to really scratch the surface of AI. This is largely where the marketing hype is confusing people. We still have a long time before we really start reproducing the human brain. Even if that were successfully accomplished tomorrow, we would still have a long way to go before even a small percentage of it was implemented widely. There are simply too many applications for it, and adoption of new tech is often extremely slow.

    Heh... hopefully those smart machines won't figure out the real cause of problems... humans. 😀

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • I've been coding since 1976. I've been hearing about coders being replaced since...1976. 🙂

    AI won't replace humans in the loop, just as voice dictation didn't replace the keyboard. Want proof?

    Ok, watch the next meeting between management and developers when a project is first being hashed out. Listen to the impedance mismatch between what management thinks they're saying and what developers hear. Think about the questions developers ask that management waves off as "trivial details".

    Yeah. Now try that with an AI... 😀

    Second point, using English (or any human language) to code with. Um, yeah. About that...there's this little thing called "ambiguity". English is a wonderfully rich and information-dense communication tool, but one thing it isn't is precise. There's way too much context dependency built into it.

    Again, get managers and developers together to see what I mean. Throw a poor AI into that hellstew? Quick, call the SPCAI!*

    *Society for the Prevention of Cruelty to Artificial Intelligences

  • Call me cynical, but coding isn't going away.

    I remember back when I took one of my first CS classes in college, and the instructor told us that learning C++ was a waste of time because machines would do all of our coding for us. That was more than 20 years ago, and we aren't any closer to a complete turnover than we were then.

    I do agree that we will see more and more tools that allow non-coders to  produce things that are useful to them. That isn't coding.  For example, a document management system I work with has a workflow tool that allows you to create "rules" and "actions" that use predetermined functions to accomplish work.  You can only do what is already built, and essentially you are just creating a list of steps to follow.  Again, not coding.
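
    For what it's worth, a purely hypothetical T-SQL sketch of that pattern - the user composes ordered steps from predetermined actions, but writes no code:

        -- A "workflow" is just ordered rows pointing at built-in actions.
        CREATE TABLE dbo.WorkflowStep
        (
            WorkflowID INT          NOT NULL,
            StepNumber INT          NOT NULL,
            ActionName VARCHAR(50)  NOT NULL, -- must name a predetermined action
            Parameter  VARCHAR(200) NULL,
            CONSTRAINT PK_WorkflowStep PRIMARY KEY (WorkflowID, StepNumber)
        );

        INSERT INTO dbo.WorkflowStep
        VALUES (1, 1, 'RouteToApprover',      'Finance'),
               (1, 2, 'ApplyRetentionPolicy', '7 years'),
               (1, 3, 'NotifyOwner',          NULL);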

    Dave

  • "The players" in IT have long been wanting to eliminate coding and, it seems, has been a large factor behind Microsoft's frameworks for the past 20 or so years.

    But this is the proverbial "dog chasing its tail" routine. It was predicted that the digital age would eliminate paper, as just about "nothing would need to be printed anymore" - yet printer and paper usage have risen to levels never fathomed when the "computerized office" first arrived.

    And how about "The Cloud"? We are in the midst of dealing with folks who truly believe "The Cloud" is the end of in-house software development. And yet, "The Cloud" is just another cycle of "mainframe to client-server back to mainframe", with "The Cloud" serving as the surrogate "mainframe". And you wait--we will (partly) return to localized development once again as businesses--small and large--discover that housing one's data and intellectual property under the auspices of an environment you have no control over is not ideal in this age of cyber-terrorism.

    We need people to write the code that writes the code. AI is simply the next mountain range we come to after reaching the pinnacle of the one we are on, made of the same stuff we have been slogging through for decades.

  • Sergiy - Sunday, June 3, 2018 11:32 PM

    "eventually we'll tell systems what we want and they'll build algorithms"Does not quite work even with actual people. :-)In fact, they build algorithms, but usually it's not exactly what you expected when forming a request.

    Any language formal enough to describe the algorithms you want, in a way that lets the AI build them accurately, WOULD be code - just a newer, higher-level programming language. UML, OCL and a number of the variants that the Open Group has been working on are trying to bridge the gap between natural language and/or "normal" diagramming and workable code. And there will always be coders needed to build the algorithm-builders (or the algorithm-builder-builders, etc...)

    Needless to say, there still is a LOT to do before we can do away with all code, and there isn't really a true desire to do so. Make it easier/quicker to get workable code - absolutely; but that's very different from doing away with code.
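
    A small hypothetical illustration of that point in T-SQL: once a business rule is stated formally enough for the machine to enforce it, it is already code, whatever we choose to call it.

        -- dbo.Orders and the rule are invented for illustration.
        ALTER TABLE dbo.Orders
            ADD CONSTRAINT CK_Orders_NonNegativeTotal CHECK (TotalAmount >= 0);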

    ----------------------------------------------------------------------------------
    Your lack of planning does not constitute an emergency on my part... unless you're my manager... or a director and above... or a really loud-spoken end-user... All right - what was my emergency again?

  • Tom Gillies - Monday, June 4, 2018 3:38 AM

    I agree with the tone of the earlier comments. My response would be "coding will morph into something else", or something that is called something else. Two of the really difficult things are: deciding what you want to represent - "the Model" - and deciding and then expressing what you actually want to do with it. Both of those remain hard.

    Does anyone else remember "The Last One"?  https://en.wikipedia.org/wiki/The_Last_One_(software) 🙂
    This was supposed to be "the last program you would ever need to write" - in 1981! Substitute your favourite 4GL or other program-generator technology for "The Last One".

    Past experience in IT and other industries indicates that it takes a long time for something like "coding" to go away completely. The places where such activities go away first are where the product is standard and process repetitive - we see that again and again in IT. The places where these activities remain are where fundamental decisions are needed and where we are pushing close to the limits of something.

    I've not heard of "The Last One". I can remember when 4GLs were supposed to take away all development. I learned one in the 90s; I think it was called Focus (not sure). It was supposed to make software development more abstract. It was nice, but at the end of the day it was just a glorified report writer. It had some really nice features, but it couldn't communicate with IoT devices, monitor anything at all, send alerts/notifications, etc. And I've not seen a job ad for it in over a decade. I no longer list it on my resume - what's the point?

    Kindest Regards, Rod
    Connect with me on LinkedIn.
