Gary, I totally understand the mentality that results in the scenario you describe.

And, aside from noting how inefficient it is in the "replace part; try again" scenario (although, to be real, in some cases, "replace part; try again" is absolutely the correct approach), I think it's important for us to understand that this approach will not work in the future when everybody is working in a multi-discipline environment.

The "replace part; try again" scenario has always been an option of last resort. If we think back to the type of hardware diagnostics we did in the 1980s, a lot of that was the only approach that worked. Pull all of the expansion cards. Reboot. Plug 'em back in one at a time, and see what happens. Same approach applies to software/services. Disable all the services. Turn 'em back on one at a time, figure out which service is causing the problem. These are all DIAGNOSTIC tools!
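The "disable everything, re-enable one at a time" procedure is really just linear fault isolation. A minimal sketch of the idea (hypothetical helper names; `system_fails` stands in for whatever reboot-and-observe check you actually perform):

```python
def isolate_faulty_component(components, system_fails):
    """1980s-style diagnostic: start with everything disabled, then
    re-enable components one at a time until the failure reappears.
    Returns the first component whose presence triggers the fault,
    or None if the fault never reproduces."""
    enabled = []
    for component in components:
        enabled.append(component)
        if system_fails(enabled):
            return component
    return None

# Example: reboot with cards plugged back in one at a time.
cards = ["video", "modem", "sound", "network"]
culprit = isolate_faulty_component(cards, lambda active: "sound" in active)
# culprit == "sound"
```

The point of the exercise is the mindset, not the loop: each step changes exactly one variable, so the observation actually tells you something.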

But even those tools seem to have been lost, from a logical perspective, in what I'm seeing today.

You are absolutely correct in that "critical thinking skills" are NOT developed on-the-job. Those are skills that are developed from classroom education and exercises designed to develop those skills.

Where I'll disagree, though, is that it does NOT take a "genius" to possess and use critical thinking skills. I truly believe that 90% of the people competent enough to work in IT in the first place have all of the mental capabilities necessary to develop critical thinking skills; they just need a motivated mentor to help them do that.

Ever go into a computer repair shop and watch the guys work? What they do is just replace parts, until the machine starts to work again. If they can't, they just tell the owner that the device can't be fixed, and offer to buy it to use as parts in their next repairs. That's how it works, because it's all they know.

You don't develop critical thinking on a shop floor. I know you can do it through studying math, physics, chemistry, and electrical engineering, and I'm told you can do it through studying law. But that takes time, and that takes money, and generally, industry won't pay for it.

There's always room for a few near geniuses at the very top, but what they want the most of is guys clever enough to know which part to replace, and that's what they'll pay for. And, you get what you pay for.

Great article. The challenges that an IT professional (or any professional) faces can become overwhelming over time if left unchecked; on the other hand, the work is enjoyable (and rewarding in the long run) if updating skills, knowledge, and information is viewed as a necessary core responsibility of the job itself. The latter is the logical option, but it is not easy. For example, an individual might begin a job or a venture and deduce that they need to complete a Master's degree to deliver better service, and after completing the degree deduce that still more learning is needed, and so on.

And thank you for the quotation. I have heard Manager A's question many times, and the reasoning always diverts toward the measures that could be taken to limit the capital or time loss the business will face if the trained staff do leave. But the more I think about it, the more important Manager B's point becomes, not only during regular business operations but also during a stage when the business is expanding.

Thanks for sharing those resources, Lawrence. Based on your wargaming suggestion, sounds like hackathons could help teach some critical thinking, too. I am going to ask my Twitter contacts about similar resources on IT administration. Maybe we can come up with a good set. I'll report back.

Absolutely agree. Recently I've taken to describing this as the "throw mud against the wall" approach. The inherent problem, though, is that I've seen a lot of people throw so much mud at the wall that they can no longer find the exit door. If it goes too far, that mud starts sliding off the walls and pooling on the floor, making an even bigger mess.

The concept of "how to write a program" is one of the skills that the legacy four-year programs did bring to the table (even if they were still doing it with FORTRAN and COBOL well into the 1990s; a few introduced C in the 1980s). In fact, I daresay my undergraduate courses focused a lot more on logic and process development than on syntax specifics. (Of course, pushing a card deck through a card reader for time-sharing on a System/360 definitely encourages you to minimize syntax issues on the front side. It is exceptionally frustrating to get a source printout the next morning during the last week of class, only to find out you missed a trailing semicolon on a Pascal statement.)

To that point, and Laurianne's about MOOCs, Stanford University published course material on their three most popular Software Engineering courses, which I was able to obtain through Microsoft's now-gone Zune podcast library, but they're also available on YouTube:

I believe that what is missing is the engineering mindset. What we have today is more of what Pete Goodliffe describes as a Tinker-Crash, Tinker-Crash mindset: keep trying stuff until the code compiles, then call it done. I see far too little effort to understand the cause of the code not compiling.

This leads us to the current state where so many programmers work based on what amounts to little more than superstition, having little true understanding.

One day, a couple of programmers were having a mostly civil disagreement about code editors. One was a huge fan of Sublime Text (which is pretty dang cool, actually), and how he could do X and Y so much faster, and do Z with a single keystroke. The two went back and forth for a bit talking about how this or that feature made them even faster. Thing is, both of them wrote horrendous code. I finally couldn't take it anymore and butted in with something to the effect that faster was the last thing either of them needed; what they needed was to slow down and THINK for a second before they started to write any code. I was not as polite as I could have been; it was not my finest hour. But I had spent the last several days fixing their code, fixing simple mistakes that should never have been made.

I was very fortunate to come in to programming as a career change from an engineering discipline. My two cents: In a 4 year program, spend the first two learning an engineering discipline completely unrelated to IT. Learn to think a certain way, then the right answers come naturally.

I think that MOOCs can make it easier for IT pros, but that still requires the IT pro to have the initiative and self-discipline to complete the course. The other thing I've long recommended for IT pros is an online library subscription. There are many other resources also available.

Stretch projects are definitely a great way to develop critical thinking skills, but simply wargaming a problem around a conference table, one that directly relates to the workplace and the job, can help too. "If this <server> you're responsible for administering demonstrates <these symptoms>... what's your approach?" Of necessity, however, this process requires three things: [1] a facilitator, [2] a mentor, and [3] a commitment from the employer to invest the worktime on the task.

This could also be done on an individual basis. Offer a staff member a "homework assignment" that involves developing and exercising critical thinking skills. For some staff members, it may require some preliminary classroom education on the types of physical and mental tools that can be used in the process. I'm continually amazed at the number of IT pros who can find a public forum to ask a Level 100 question, but apparently were not aware of, or (worse, I sometimes fear) simply not willing to start with, the basic resources, like product documentation or a search engine.

But even more so than just continuing education to keep up with new skills, a notable number of first-year IT pros barely have the skills necessary to perform their assigned job duties. If it's the intent of the employer to hire a green candidate, that's great! Everybody needs someplace to start, and I applaud those employers willing to take the risk. But taking the risk also means committing to the investment in developing that staff member.

I'm reminded of a recent quote, which unfortunately I've lost the source for so cannot attribute as I'd like: Manager A: "What happens if we train 'em and they leave?" Manager B: "Worse, what happens if we don't and they stay?"

Lawrence, don't MOOCs make it easier than ever for IT pros to learn a new skill or brush up on an existing one? Seems to me online learning suits this need well. Less expensive than the training classes of old, too. On another topic, what strategies do you recommend for developing critical thinking skills in staff members? Stretch projects and what else? Thanks.
