I was talking with some data professionals recently about training and the value of college, work, or some other method of entering the technology business. I have a few thoughts about the different ways of teaching people about this business, and I'm curious what you think.
What do you think of traditional computer science? Colleges have students study theory and programming languages, build small software projects, and even write pieces of operating systems as part of their curriculum. Students also still have requirements for core classes like science, language arts, and so on, but does this train them well? I almost think that students ought to be charged with working on real projects, perhaps open source projects, tackling some small part of the system. Perhaps an older student could do project management work, or write architectural specifications for less experienced students who are actually programming. Is that good preparation for the real world?
What about a vocational school that teaches students with real projects? If students were required to work the same way companies do, given requirements, held to milestones and deadlines, and provided some training, but not all, would you consider that better preparation than anything else?
Ultimately, the best preparation I've seen for IT is to immerse people in actual work situations, having them solve real-world problems. The best way to do that is to set up environments where students are forced to build domains, set up clusters, and write code that would actually be used to manage a server or produce an application. In essence, run students through an apprenticeship.
Or maybe the best solution is to actually offer more students apprenticeships where they can learn in the real world.
I know there's no one way that works for everyone, but I think we can certainly find ways to better equip the majority of potential employees than we do now.