• Computer science degree from a liberal arts school here. Our major started with programming; the "weed-out" course (Assembly Language) was a 200-level one, as were the required hard-theory courses. I do think that learning the theory behind what you're doing is a good thing, as it explains why systems work better when you do things a certain way. And getting exposure to different languages helps you figure out how you learn best, which is always helpful given changes in technology and the job market.

    One thing that the core classes, required or not, helped greatly with is communication. It doesn't matter what part of IT you're in; you need to be able to communicate with users, management, and executives just as much as with your teammates. Being able to explain why a "real quick change" is going to take 40 hours of work, without making the requester's eyes glaze over or breaking their brain, is an asset.

    That being said, I do wish I'd had more opportunities to learn in real-world situations before getting let loose into the "real" world. Going back now and talking with the professors I learned from and worked with makes me realize how very different academic computing is from the corporate world. And while team projects help to some extent with figuring out project management, some kind of internship in a corporate setting would have been extremely helpful, both for seeing how things work outside academia and for figuring out where I wanted to focus in the classroom and how it related to my future career.

    Jennifer Levy (@iffermonster)