Is Computer Science Dead?

  • Comments posted to this topic are about the item Is Computer Science Dead?

  • I never really thought about it that way, but it is so true.

    Being a DBA can sometimes be outright boring, and that's why I still do programming on top of being a DBA. It just amazes me how all my old programming techniques fit right into these new frameworks (well, just .NET, as that is the only one I currently use).

    Yes, my days with Pascal (and Assembler) taught me a lot, and yes, programmers should start learning by using one of these 3rd-generation languages, but that is a big ask these days.

  • All I really need to know I learned in my first-year programming course in college.

    I learned a little bit more in the following years, especially Simula 67 (3rd year).

    Technology has moved a long way since then, but the principles are the same.



    See, understand, learn, try, use efficient
    © Dr.Plch

  • These days I run into self-styled developers and programmers who don't know the meaning of OOP and who call C and C++ "ancient" languages, never to be used again. Study computer science; it will pay off.

  • Yes - It's dead.

    I'd never recommend that someone learn computer engineering so he can work as a dev or DBA etc.

    In the end it's all about what you want, where you will earn decent money, and whether you have a fulfilling job. You can earn more with less effort in other jobs, so why go for this?

    I personally like it, but as times change and IT moves towards lower-cost countries, sooner or later that will affect us a lot more than it does today.

    Back to the money and effort. Two friends bailed on computer engineering after one year because they found it too heavy, and these were normal people, no better or worse than anyone else. However, as economics engineers they could study at 200% speed and still find it easier than studying for computer engineering at 100% speed.

    I do not believe there is a future for this job; it's already dead. You can do well within it, and I think change will come slowly, so it might take 20 or 50 years before salaries really start to drop as India, China, and other Eastern countries compete efficiently. But why would you spend this much effort on this job, unless you love it and do it for joy, when you can earn more and do less in other jobs?

  • I have seen recent graduates (within the last 10 years) with a distinct lack of understanding of programming fundamentals. These people were writing software!

    There is a reason that I was taught different languages through my 8 years of full-time Computer Science education. Different languages solve different types of problems and also show different ways of doing things.

    Some of the simpler free-form languages, such as Pascal, make an excellent basis for learning to program, as they allow for discussion and demonstration of the importance of naming, code structure, comments, formatting, etc. As a previous contributor said, C/C++ (amongst others) allows for performance tuning, memory management, and algorithm evaluation. Nowadays I would expect to see OS scripting take a more formal approach (*nix shells and PowerShell deserve nothing less than a place in a systems administration or dedicated scripting module).

    Teaching someone to do a simple type of application development closely coupled to a particular framework is very short-sighted.

    Programming is not going anywhere. For decades now I have heard that no one would be coding in the West. Demand has not really decreased; it will only do so if there are no decent coders left. On-the-job learning is fine, but there needs to be an academic foundation too. I am not prescribing full-time education, but I believe that it, too, has its place.

    Gaz

    -- Stop your grinnin' and drop your linen...they're everywhere!!!

  • One can learn to program and be a hell of a good programmer without ever studying computer science or anything related to it... BUT be sure that this person will study on his own everything he needs and even more. Finishing a full 4-5 years of computer science will not turn you into Alan Turing, Linus Torvalds, or anyone else, but it will help to lay the foundations. Again, those foundations can be laid on your own, maybe in less time, maybe already targeting a known goal: SQL developer, sysadmin, DBA, .NET developer, etc. But just knowing the whole .NET Framework and all the commands does not make a good programmer. Study and practice are the key to success; without spending hours studying and then applying that knowledge, there is no way that person can turn into a good developer/DBA/etc.

  • yazalpizar_ (3/28/2012)


    One can learn to program and be a hell of a good programmer without ever studying computer science or anything related to it... BUT be sure that this person will study on his own everything he needs and even more. Finishing a full 4-5 years of computer science will not turn you into Alan Turing, Linus Torvalds, or anyone else, but it will help to lay the foundations. Again, those foundations can be laid on your own, maybe in less time, maybe already targeting a known goal: SQL developer, sysadmin, DBA, .NET developer, etc. But just knowing the whole .NET Framework and all the commands does not make a good programmer. Study and practice are the key to success; without spending hours studying and then applying that knowledge, there is no way that person can turn into a good developer/DBA/etc.

    I agree that it can be done outside of an academic institution or other structured learning methods, but the issue is twofold:

    1) People don't know what they need to learn without decent guidance.

    2) People just don't do this (the majority that I come across anyway).

    There are always exceptions, and I think you will find a higher percentage of them amongst contributors to forums such as this.

    Gaz

    -- Stop your grinnin' and drop your linen...they're everywhere!!!

  • I wish I had a nickel for every time someone told me programming was dead. Back in the '70s, CASE tools were going to make programmers obsolete. At one time I was told spreadsheets would lead to the demise of programming because the average user could arrive at their own answers. The list goes on.

    The best argument I heard was when an "expert" panel at an early personal computer show pronounced that COBOL programmers would soon be extinct because all high school students would soon be graduating knowing how to program. An elderly gentleman (to me, I was in my early 20's) rose and said that wasn't so. He said, "Look around you, everyone in this room knows how to read and write. But how many of you are going to write novels?"

    I've never been a fan of a computer science education. Certainly there is an advantage to being taught, but I want programmers who want to learn instead. The best hires I've made have been people who did not have a degree but had been self taught. That indicated to me the desire to learn, even if it is on their own.

    One time I had this conversation with an accountant who wondered why programmers were still necessary; hadn't we written everything yet? I reminded him that programming is less than a century old, while accounting is thousands of years old. But what I thought was my best response was, "Who programmed cellphones before they existed?"

    My most fun and challenging projects have been when I worked in languages that did not already have all the good stuff preprogrammed (spoolers, communications, math, etc.), unlike today, when you spend most of your time learning the APIs of the packages that have already been written. But there are plenty of libraries yet to be written.

    We still have challenges ahead: multi-core is here to stay, and using it efficiently is still not baked into our languages.

    The world has not caught on to the Information Age; Industrial Age thinking still prevails. A friend related a story from his work where a VP had bragged about how they planned to add off-shoring in both India and China. The VP thought that at the end of the day the programmers present would transmit their code to the next team, who would continue coding and then ship it off to the next. My friend said, "Just imagine what your code will look like when you show up the next day."

    And be honest, we still have a long way to go on developing the human-computer interface.

  • Yes, it's dead. There are other ways to get into programming or DBA work. My wife and I are both programmers. She took the punishing route.

    And C was the most terrible class I took in college (just my opinion). Working with memory and understanding pointers felt like a waste of time (again, just my opinion); it was punishment for the sake of punishment. Out of 30 students, I was the only one to get one of the assignments done, and only because I had a friend who programmed in C and practically did it for me. Of course, all that applies unless you actually want to work with memory or pointers.

    To be honest, if one wanted to change the world, a class in proper database design should be required in high school and college. Until then, no real progress will be made in this world. You think I am kidding, but I'm not.

    I went the Psychology and Criminal Justice major route with a business minor, then an MIS master's, then a slew of books in my free time in order to pass the MS exams. I would never have made it through CS.

  • I earned a Computer Science degree in the early '80s. Back then it was less about programming and more about the theory of computability, languages, operating systems, compilers, and hardware. Even the nascent relational theory, networking, and AI were taught to seniors.

    While programming was a required part of it, the goal was to train those who would design new OSs, languages, and database systems. There was a lot going on in those areas back then, but now fewer people work on the lower-level systems; there is more demand for mere programmers and less need for an understanding of the underlying systems.

    That said, if you want to distinguish yourself, you need to know what is going on in the layers below your code, and that's where the real computer science is happening.

  • IT has been dying since IT began. Every new technology is the end of the world. I saw it with client/server replacing big iron in glass rooms, saw it with the interweb and thin clients, saw it with outsourcing, and now I'm hearing the same crap regarding "The Cloud".

    There will always be a requirement for true Comp Sci people. Do you need that to succeed in IT? No, not at all. There are other skills that are just as valuable, creativity for one. Plain old common sense is another.

    The shocking thing to me is the dumbing down of IT. Everything is fine when things are running well, but the second something goes wrong these people are at a complete loss. The bar is so low it's almost comical.

    When you ask a "developer" to troubleshoot a bug in a system *they wrote* and the answer you get is "I don't know, the computer just did it," something is terribly wrong. I should never have to explain to a programmer that computers don't just do stuff; they are programmed to do stuff. You are the programmer, therefore the computer is doing *exactly* what you told it to do, right or wrong, on purpose or inadvertently. Anybody who thinks computers are magic should not be anywhere near the IT industry.

    Can you tell you've hit a nerve? 🙂

  • There are just too many things wrong with this article to address. I can't deny that the perception that programming is dead exists, and I can look around at fellow programmers and see why. I see my job as giving business people tools to collect, and then get to, their data. If a pre-packaged tool can do that more cheaply than I can by writing code, that's great. Many coders, DBA or otherwise, see those pre-packaged tools as a threat. That's where they shoot themselves in the foot, because if they worked with those tools they would be seen as valuable members of a team, instead of people trying to protect the job they learned to do 20 years ago.

    Academia probably needs to change too. If I were starting a project that I knew would take 4 years to complete, I would have to plan for the technology changes that would occur in those 4 years. I don't think schools look ahead like that.

  • WolforthJ (3/28/2012)


    There are just too many things wrong with this article to address. I can't deny that the perception that programming is dead exists, and I can look around at fellow programmers and see why. I see my job as giving business people tools to collect, and then get to, their data. If a pre-packaged tool can do that more cheaply than I can by writing code, that's great. Many coders, DBA or otherwise, see those pre-packaged tools as a threat. That's where they shoot themselves in the foot, because if they worked with those tools they would be seen as valuable members of a team, instead of people trying to protect the job they learned to do 20 years ago.

    Academia probably needs to change too. If I were starting a project that I knew would take 4 years to complete, I would have to plan for the technology changes that would occur in those 4 years. I don't think schools look ahead like that.

    The article does not discuss whether programming is dead; of course it is not dead, and I doubt it will die, as there is always something to be programmed. I use 3rd-party tools to speed up my job: Telerik in Visual Studio, RedGate on SQL, SSMS Tools to write T-SQL, and many more. But these tools are a means to my goal, not a goal in themselves. Many self-styled programmers copy/paste a couple (or thousands) of lines taken from a forum, add some fancy comments and... ta-da! The job is done... but once they try to do something a little bit different, it's "sorry, I can't find what you need on the internet" or "this cannot be done." That is why programming is not dead and will not die.

    That is why some college is not so bad: it will lay the foundations for the coming years. You do not need to learn the latest programming language, the latest framework, or the latest tool to do whatever you want to do. You need to learn about algorithms, OOP, methods, and data structures, including the mathematical skills that will save your *** many times 😀 That is why an academic course cannot change every year, or every 5 years, to adapt to the latest fashion in programming languages. Take C/C++, learn from it, program in it, go as deep as you can, spend endless hours tweaking and tuning everything, and after that, I assure you, other languages will come as easy as ABC.

  • Actually, I was told a while back to look at it this way. In the early years, when computers were new, things were generalized, kind of like early doctors. Now that the field has grown a lot, you have fields of practice that are increasingly important. At some point computer science will become a basic class on the way to a targeted path such as database design or network architecture. Things will become specialized, which improves quality. You wouldn't want a general-practice doctor performing heart surgery if he was not specialized in it. The same is becoming true of computer systems as they become more and more critical. Could you imagine getting a BSOD on the Space Station's oxygen supply or, worse, your artificial heart? Read After the Gold Rush: Creating a True Profession of Software Engineering by Steve McConnell for more on this. I do think that, as systems become more and more critical, some areas will ultimately be so specialized that you will have true engineers in those areas who must approve other people's work for legal reasons.

Viewing 15 posts - 1 through 15 (of 39 total)
