How to Study in Academia

Plenty of professionals in the IT industry hold four-year university degrees, but not enough for such degrees to be considered typical. In the past, the attitude of most techies toward college degrees was one of cool indifference. The conventional wisdom went something like, “Sure, a degree would be kind of OK to have, I guess. Possibly.” This wasn’t due to a lack of respect for academia and its approach to IT. Well, maybe a little …

The main reason behind this view, though, was relevance — or lack thereof — to the field. For a long time, many colleges didn’t offer technology courses of study, and the ones that did frequently taught curricula that were nice to know but not necessarily applicable to IT pros’ desired vocations. There was also a sense that the pertinent content could be learned more quickly and easily on the job.

But in today’s academic environment, things are different. Colleges and universities have cutting-edge IT environments in which students can get loads of hands-on experience that they can later apply in the real world. Moreover, as IT pros strive to learn new nontechnical skills and make themselves more marketable in the workplace, they’re pursuing degrees outside of computer science, such as business or English.

Plus, as a reader recently wrote in to tell us, the appeal of a college degree for IT pros (and people in just about any field) is that it meets one of the major requirements of the hiring gatekeepers in HR.

Thus, it shouldn’t come as too much of a surprise that more and more IT pros are venturing into this ivy-covered world. What uninitiated techies will find, however, is vastly different from what they’ve experienced with certification and on-the-job training (OJT).

Take the instructors, for instance. In college courses, students learn from professors who have spent all or most of their professional lives in academia; in certification and OJT, they’ll typically be taught by people who have years of hands-on experience with the skill or knowledge in question.

This is not to disparage the professors or to suggest that the latter methods are somehow better — it’s simply to point out that the instructors encountered are different and will approach pedagogy in their own ways. In college, instructors usually will be more deductive and focused on theory, while those in OJT and credentialing programs probably will be much more straightforward and focused on the task(s) at hand. If you opt for academia, be prepared to ask lots of questions.

In terms of studying, IT pros can expect differences, as well. For starters, there will be little to no studying involved with OJT by virtue of how it’s delivered and what the aims are. With certification, there will be plenty to study, but much — if not all — of it will be mapped to the certification exam with objectives clearly spelled out.

Obviously, it’s different in academia. In both OJT and certification, the goals of studying and training are more or less spelled out for participants, who are guided toward those ends. In academia, students often are invited to draw their own conclusions and come up with their own approaches to problems. Thus, studying is more self-directed, with learners typically seeking out their own sources to support their ideas.

Therefore, to learn and study in academia, you need to come equipped with critical thinking skills and always keep in mind the question: “Why?” It also pays to attend to the details while being able to comprehend the “big picture,” i.e., how all those details relate to one another and support the whole.

Interestingly, some of the higher-level certification programs are beginning to resemble academia in these respects. This is a good sign, as it reflects a maturing industry and rising professional expectations for techies. Expect the market to move further in this direction in the coming years.