It's not dead, but it's not absolutely vital, if, and it's a big if, the person actually loves doing this stuff. Some of the best developers I've dealt with are self-taught, and while those types face hurdles in the beginning, most soon realize that the best practices they didn't teach themselves have real benefits, at which point they adopt them. Things like strong typing, explicit conversions, self-documenting code, and version control aren't necessarily important to a self-taught developer at first, but after dealing with the repercussions of blowing those things off a few times, the good ones come around very fast.
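A tiny sketch of why one of those practices pays off, using Python as an arbitrary example language (the `total_cost` function and its data are hypothetical): a strongly typed language refuses to silently mix strings and numbers, and converting explicitly both documents intent and fails early on bad input.

```python
def total_cost(prices):
    """Sum prices that may arrive as strings or numbers.

    The explicit float() conversion documents the assumption about
    the data and raises immediately on garbage input, instead of
    misbehaving silently somewhere downstream.
    """
    return sum(float(p) for p in prices)

# "1.50" + 2.25 directly would raise a TypeError (strong typing);
# the explicit conversion makes the mixed input work on purpose.
print(total_cost(["1.50", "2.25", 3]))
```

Self-taught or not, the first time a bug like this costs a few hours, the lesson tends to stick.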
I also agree with the above posters who noted that learning a machine language (or assembly), on any processor, can help you greatly in many small ways in the future, whether you're self-taught or formally trained. I haven't coded anything that low level in decades, but I find that understanding what's going on behind the scenes frequently helps me grasp new concepts or issues more quickly.
Edit: I see bnordberg made some of the same comments as I did. Simulposts are fun.
I don't think that computer science degrees are dead, but I do think they are "sick" (to extend the metaphor). I think most people entering college today are sensitive to their employment opportunities after graduation. A few years ago there was a huge rush to outsource IT to off-shore sites and countries. If I were an American college freshman today, I would seriously reconsider getting a degree in CS, because I would be very concerned that there wouldn't be a job for me when I graduated.
If you are going to require a programming language, then it should be Fortran.
Punching cards will either make you a better programmer, by forcing you to get more work out of each instruction, or make you a better typist.
CS is not dead.
We will still need coffee pots that need programming.
I can't help posting another one
How many of the top 100 richest business people in the world have an MBA, or even a BBA? Knowledge is about learning, of course, along with self-motivation and commitment. If we followed the example of the richest business people, then we wouldn't need any kind of university education.
There is nothing wrong with the idea of a "self-taught" expert, but the danger is that many, many people nowadays have the perception that they can teach themselves to be good developers. Did you know that since IBM published the failure rate of software projects in the industry in the '80s, the statistics show there has not been much improvement here in the 21st century?
Looking back over these 25 years, the fundamental concepts and technology actually haven't changed much. What have been the inventions of the past 10 years? The past 5?
We need to brush up CS training, and we need people to do research and promote innovation. Otherwise, the computing industry will soon be controlled by packaged tools and software, and people will be trained to program only plugins and packaged components.
P.S. I've heard people say that relational DBs and OOD are just common sense, but how many can really design a good relational DB or OO model?
Computer science is not dead, but it is not as popular in college as it was 10 years ago, thanks to all the big companies that outsourced to other countries. The employment outlook affects students' choice of major, which right now is biology and biochemistry. Also, most colleges are totally out of touch with the real world: when a company hires new graduates, it basically has to train them all over again.
As noted in the other thread, a lot of women are also exiting the IT business, and parents do not encourage their daughters to major in CS.
Right now I am working on a project with twenty contractors and five in-house employees. I am the only woman.
Read "After the Gold Rush: Creating a True Profession of Software Engineering" by Steve McConnell.
I think it explains the shortfalls of a CS degree versus what is really needed.