Training Computer Scientists

  • Comments posted to this topic are about the item Training Computer Scientists

I strongly disagree with Steve and the authors of the linked post, and it is an argument I am passionate about. Specifically, I do not believe that we should be disregarding tertiary education. Nor do I think that tertiary education should try to tailor itself too closely to an industry.

    In my opinion, industry is for the highly specialised development of specific skill sets, where we learn to craft our knowledge within a narrow space. The mentorship many of us receive in industry is a critical part of this development. Academia has to be different from industry - we all know it costs enough! Returning to academia, I want an experience and a perspective that I cannot attain in industry. As a professional recently returned to tertiary education, this discussion is very topical to my life!

    There is a strong thread of discussion within the business world that universities are not adequately preparing graduates to step into the workforce. However, we should qualify exactly what we mean by this. Is it the case that universities are failing to develop the technical skills such that graduates might step directly into productive roles? Or is it that graduates do not possess the personal skills (a willingness to learn, critical thinking, the ability to solve problems, to challenge themselves, and to learn independently) to be successful in the workforce? I believe that the development of specific technical skills is not a primary goal of tertiary education. But if graduates are lacking in personal skills then we have a great failure of society, and not just academia, in developing our youth.

    I do agree, however, that changes in the education system have helped erode the essential soft skills mentioned above. I believe that the pressure and competition for exceptional grades has led to a narrowing of the educational experience. Taking risks or exploring outside the bounds of assessment does not necessarily translate to that coveted A+, and therefore our students become more mechanical and restricted in their approach to learning. Similarly, we see fewer open-ended questions in exams and assignments, replaced instead with a higher degree of specification and less ambiguity. This, to me, is the greatest failure of the current education system.

    Reflect on your own first years in industry. Did you possess the necessary technical skills from day one? Are you still in the same field / specialty area? Or, has your career taken unexpected twists and turns that have required you to adapt and learn quickly? Why then should we expect current graduates to come out as technically-skilled in specific areas?

    I would encourage a return to the philosophical ideals of higher education: the development of critical awareness, open debate, and creativity. These are the skills that we don't have the time or luxury to develop once we enter the business world.

    Nick

  • As an IT pro and graduate, I've been involved in recruiting programmers, and the very first requirement on the job spec is a degree, HND, or equivalent-level qualification in a relevant discipline. Candidates without one were rejected straight away; HR would not allow them even to be considered.

    When I first joined a software house in the seventies, computer science degrees were fairly scarce, so they took graduates from many disciplines, including music and the sciences, using aptitude and problem-solving tests to help sift them. Computer science graduates were fast-tracked through, though.

  • I don't agree with this article. It is true that computer science degrees, or other degrees, don't prepare the student to the point of being able to "hit the ground running". But that is not what any good HR department should be expecting either.

    I started as a BI dev, like most people, by accident, without any relevant degree or even an idea of what I was really getting into. I learned from books, blogs, etc. But it was only recently, while doing an MSc in BI, that many things made sense, and the course opened many horizons as well.

    From the recruiter's perspective, it is true that most of what we learn at university has no obvious benefit. But given the need to recruit people who can adapt quickly to different environments, HR will either have to organise and pay to screen out "dim" applicants using psycho-technical (or IQ) tests, or let the university do it for them.

    In other words, nowadays going to uni is less about learning and more about credentials. This is why it is getting extremely expensive to go to uni. The screening is not only on cognitive ability but also on social status.

    So the idea that going to uni is useless for IT professionals is simply wrong... unless you are in the extreme range of cognitive ability (and come from a relatively healthy family), like a Mark Zuckerberg or a Bill Gates.

  • It appears the main problem here was the college's ability to communicate what the courses were about, what was involved, and what the likely commercial application would be. The strongly theoretical type of degree should always be available - and clearly marked as such.

    It does seem there are courses available for developers rather than computer scientists. Becca should probably have been doing one of them.

    Maybe that ties back to Steve's theme of communication being the most important skill.

  • I think I'm in basic agreement - certainly in the UK, the establishment is preoccupied with 'relevant' degrees. If you're a half-decent programmer, you can pick up anything taught in such 'relevant' degrees in very little time.

    However, there are many things which can be taught or encouraged in university education which will make you a better programmer. Steve mentioned communication skills; someone in a previous reply mentioned more philosophical reflections. I'd like to add understanding time and calendars - a topic that we all think we know and nearly always get wrong.

    Conversely, there are some skills which are better learnt 'at the coalface'. Taste in APIs is possibly one such topic; thriving in organisations is another.
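    To illustrate the time-and-calendars point above, here is a minimal sketch in Python (assuming version 3.9+, where the standard-library zoneinfo module is available; the date and timezone are purely illustrative): "adding a day" and "adding 24 hours" are not the same operation once daylight saving time gets involved.

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo  # standard library since Python 3.9

nyc = ZoneInfo("America/New_York")

# Noon on the day before the 2015 US spring-forward (clocks jump 2am -> 3am).
before = datetime(2015, 3, 7, 12, 0, tzinfo=nyc)

# "One day later" by the wall clock: still noon, as a human would expect.
wall = before + timedelta(days=1)

# 24 elapsed hours later: 1pm local, because an hour vanished overnight.
elapsed = (before.astimezone(ZoneInfo("UTC"))
           + timedelta(hours=24)).astimezone(nyc)

print(wall.hour, elapsed.hour)  # 12 13
```

    Off-by-one-hour bugs of exactly this shape are why the topic repays formal study rather than on-the-job guesswork.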

  • While I don't think a degree should be a prerequisite for an IT job, I do strongly agree that so many 'soft' skills are developed while working on a degree.

    Keep in mind, though, that UK degrees and US degrees are very different -- UK degrees don't require the broad base of subjects that US degrees do (or at least, they didn't the last time I looked). I have a degree in Economics from the University of Texas at Arlington. To get that degree, I had to take university-level history, science, math, humanities, and literature. A UK bachelor's degree doesn't require that the student take courses in anything but their speciality. I believe that the requirement to have studied a range of topics at a higher level makes one able to communicate better. Think about how one has to approach a maths class, for example, as opposed to a literature class. You're forced to extend your range of 'skills' to cope with vastly different subjects. That said, even if the degree is very narrowly focused, as in the UK, you still have to develop skills that you probably didn't have in high school, and those skills will probably make you a more effective employee. (Please note that I'm not slagging off UK degrees -- just pointing out that they're quite different from US degrees.)

  • I agree with Steve that college provides more than training in terms of communication skills and general knowledge. I also believe that community colleges provide the skills training better suited to our technical work.

    I started teaching technology subjects at the junior/community college level in 1969. My last teaching assignment was in 2010. What has been interesting to me is the number of IT and Computer Science college graduates attending community college to get training in programming and database development. In 2010, I had one student who attended my class concurrently with her master's degree program. It seems she got more hands-on training at the community college than in grad school.

    As technologists it is in our best interest to communicate with high school students to let them know about our fields of interest. There are many ancillary jobs associated with development and administration like testing, technical writing, and business engineering that could be filled by people without a college degree. Most high school students have no clue about them.

  • For me, the goal of coming out of college with a degree has always been to become a critical thinker. To do that, you need a well-rounded base. That said, the foundation should be laid within the first couple of years of a four-year program, with more emphasis on core studies in the final two years. Additionally, internships should be MANDATORY! They bridge theory with practice and, hopefully, create a two-way channel of information -- students bringing companies new ideas and a fresh perspective, and bringing back to the schools what actually needs to be taught.

    Just my humble opinion.

    - Chris

  • I think a lot of the problem is the idea that computer science and software development are the same thing. It's sort of like saying that physics is the same thing as mechanical engineering. The first of each pair definitely contributes to the second (and vice versa), but they are two very different disciplines.

    Looking at the NCES College Navigator, I found over 500 schools that offer undergraduate degrees in computer science, but only 40 that offer undergraduate degrees in software engineering. Developing software systems professionally has a lot more to do with engineering than with building efficient algorithms.

  • nick.dale.burns (6/30/2015)


    I so strongly disagree with Steve and the authors of the post linked to and it is an argument I am passionate about. [...]

    Nick

    Well said and I fully agree.

  • I have always prided myself on accomplishing what I have without any higher education. I've gone from being a carpenter to owning my own construction business, then became a draftsman for a residential builder (learning AutoCAD, Revit, and MS Office on the job), then worked in an engineering department as a CAD administrator (add SolidWorks to the mix), and finally found my way into the world of VB.NET, SQL, databases, and now JavaScript.

    That being said, I admit that I've always also felt that I could've gone much farther in all my pursuits had I gone to college. It's the theory part of all this stuff that I know I lack and is the most difficult part to pick up just from books and blogs.

    If time (and finances) allowed, I would swallow my pride of 'self achievement' and go to college; I think the experience would help fill in many gaps in my self-learning, but I would expect the courses to be well thought out.

  • As with a lot of posters in the forum, I disagree with Steve's assessment of the ability to be hired without a degree (two of the three companies I have worked for required one). Without the piece of paper, you may not even get a chance to speak with someone from the company, as most use a vetting service to reduce the number of applicants. It is still possible to get a programming job without a CS degree, though; I have recently hired people with a plethora of degree types, as long as they had the relevant experience.

    I also disagree with Steve on the role of university education. I expect a fresh-out employee to have the capability to learn, not necessarily do. I don't expect them to have the greatest knowledge about a specific product, language, or process. That will come with on-the-job training; that's why we pay them less than an experienced developer. To augment their training, we may put them in an advanced short-course, taught by someone working in the field, but more often, they will learn from reading, mentoring and social media.

    Regards,

    Joe

  • I have worked with a lot of self taught and degreed graduates.

    Those with degrees are usually more disciplined in the approach they take. They don't "Cowboy up".

    I disagree on the plumber, mechanic etc. analogy.

    Try calling yourself one of those on a business loan application. Try calling yourself one on a business license application.

    They also require training and certifications.

    The biggest joke in the IT field is the "Software Architect". That second word has meaning in the real world.

    IT just lets people use "Architect" or "Engineer" like they are "Wizard".

    We need to have more respect for the meaning of titles and the training that others go through to get the titles.

    Then perhaps we can get more respect for our work.

  • I was surprised by how strongly I disagreed with this article. I have a BS in Computer Science, and while it's true that the technical skills I learned while getting my degree are no longer relevant, what I learned that is still relevant is the art of critical thinking. I've always felt that any competent developer can learn any language, as the concepts are similar (i.e. loops, input, output, etc.); it's only the syntax that's different. However, not everyone can take a complex problem (or a non-complex problem, for that matter) and break it down into its component parts that logically fit and work together in a way that makes sense for a machine. I've interviewed plenty of people who can write code but very few who can think critically (and the ones who can think critically are the ones that I hire).

Viewing 15 posts - 1 through 15 (of 96 total)
