Is More IT Experience Always Better?

When hiring new staff, how much experience should they have? Opinions vary depending on whom you ask, with some saying that more isn’t necessarily better—especially if “ten years’ experience” is actually “one year of experience repeated ten times,” or if “experience” is simply a code word for “can’t learn anything new.”

“Do you work in software? Do you have more than a decade of experience? I’m sorry to hear that,” writes Jon Evans in TechCrunch. “That means there’s a strong possibility that much of what you know is already obsolete. Worse yet, there’s a good chance that you’re set in anachronistic ways, hidebound with habits which are now considered harmful. If you think your experience is automatically valuable, I warn you: think again.”

“Experience provides a backbone for problem solving, risk analysis, and stakeholder interaction—among other things,” protested one commenter. “In fact, knowing when to use a technology is a skill often only found in those with experience. Too many projects have fallen into maintenance sinkholes due to the implementation of a ‘ground-breaking’ technology that never found traction in a competitive market. The claim that experience is not valuable is a dangerous one.”

“The least important part of my developer job is the syntax of the programming language I use and the names of the functions in its library,” writes another. “What developers spend most of their time doing is sitting at their desk alone, or meeting with other developers, figuring how to get from a brief English description of a feature to working code. The skills used for this task are organizational in nature: Building ontologies, abstracting, working with people, estimating schedules. In my experience, people continue to get better at these things with continued practice, right up to retirement.”

And while Evans writes that he’s not promoting ageism, some commenters felt that he was promoting exactly that. “It read to me like he is in favor of age discrimination, but is trying to tread on the edge of the line of saying it,” notes one.

Similarly, some companies get hung up on requiring a certain number of years of experience in a particular technology—in some notable examples, eliminating everyone from consideration except literally the people who designed the language or system—when what they really need is someone who can demonstrate the ability to learn, writes Jeff Atwood on his blog, Coding Horror.

“This toxic, counterproductive years of experience myth has permeated the software industry for as long as I can remember,” Atwood writes (emphasis his). “Imagine how many brilliant software engineers companies are missing out on because they are completely obsessed with finding people who match– exactly and to the letter– some highly specific laundry list of skills. Somehow, they’ve forgotten that what software developers do best is learn.”

Atwood goes so far as to suggest that beyond the first 6 to 12 months in a technology, additional experience doesn’t matter. “It’s been shown time and time again that there is no correlation between years of experience and skill in programming,” he writes. “After about six to twelve months working in any particular technology stack, you either get it or you don’t.”

At a time when organizations are struggling to find skilled workers, it’s important to remember that talent comes in many packages, at any age and with any amount of experience. While it may seem simpler to apply hard-and-fast rules when hiring, those rules can end up cutting you off from valuable employees.