
It's amazing how ephemeral a technology that you spent 7 years learning and using can become when it loses out to Oracle (or Linux). Admittedly, what I learned about building and deploying applications didn't vanish, but no one cares about Pick, Ultrix, or any number of losing technologies.
–
Craig TraderMar 14 '11 at 20:48

The principle is true. The actual value is - to my knowledge - much larger.

I recall a Pragmatic Programmer presentation where they said about seven years, but I cannot locate it now, so the value might be slightly different.

Just think how technologies have changed: Fifteen years ago the web was brand new and we all tried to write web pages - perhaps even with tables and an animated GIF. Seven years ago, AJAX took off. Today some folks write Doom-like games for cell phones.

Your best bet is to learn general stuff that will still apply when the next technology comes along, instead of saying "Begone! I only know Visual Basic!" (or the equivalent in 15 years).

The best (worst?) test I can think of is to just think back a year or two. How much of the programming knowledge you use each day was learned in the previous 18 - 24 months? Moreover, how much was invented in the previous 18 - 24 months? The principle seems highly suspect to me, as the majority of programming and technical knowledge I use on a daily basis was acquired over 5-10 years.

Now if you are developing for something like a mobile phone platform, perhaps it's a different ballgame.

Nope, mobile's no different to anything else. All of the vital skills you use on a day-to-day basis last for years. I'm guessing that it's easy to forget the day-to-day skills because they are semi-automatic.
–
JimMar 6 '11 at 22:56

The fundamentals don't change, but the APIs and techniques do change. If you program a mobile app while thinking about a desktop, you'll likely not end up with something usable.
–
jmort253Mar 7 '11 at 4:22

Single facts have no great relevance. You take them, understand them, apply them for the moment only.

But doing so teaches you the process of handling facts or at least handling a certain subset of facts. I have learned a lot of maths in school which I actually never used. Still I learned and trained mathematical thinking.

I worked as a web programmer with Ruby on Rails, and while I don't write web sites at the moment, it heavily influenced my thinking about code and made me a better C++ coder (using more of the STL, for example).

The same goes for learning Racket. I never wrote any large program in it, but it gave me a new viewpoint to apply to some problem spaces.

In my experience, there is a huge disconnect between the media/public image of which technology is the New New Thing, and what's actually being used in the real world out there.

Take something like Visual C++/MFC in the desktop application space. While it might seem old and outdated, and probably not something that a new programmer should be learning right now for desktop development, there are still a heck of a lot of real world projects and jobs out there written in it that are being maintained - and probably will be maintained for years and decades to come. I was going to give COBOL as the example, but that would be speaking theoretically - I actually know the VC++/MFC example very well personally.

Essentially, it's not that technologies become useless and unused when they become "obsolete", it's more that they are no longer seen as the most up to date way of doing things and starting new projects. But the decommissioning of large real world software systems which ain't broke and don't need fixing happens much more slowly. Many of the Visual C++/MFC projects I've worked on (that started in the early 1990s) are still very much alive, and employing many many programmers (both in maintenance and new development), and don't seem to be going anywhere anytime soon. In fact, I'm sure most of the ones I'm thinking of will still be around in 2020, and longer.

Of course, this isn't even the main issue. The main issue is that a lot of the concepts are similar or related, and you're not "starting from scratch" when learning some new technology. For example: once you understand markup and structured-data formats and what they're about, it's very easy to learn new ones. So it doesn't matter so much that JSON is the new new thing and all you've used for years is XML. It's just a matter of learning new syntax - as opposed to being some non-programmer who knows nothing about what these formats even are, or the concepts behind the data they represent.
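
To make that concrete, here's a minimal sketch (my own illustration, not from the answer above) of the same record serialized as JSON and as XML - the structure is identical, only the surface syntax differs:

```python
# Hypothetical example: one record, two serializations.
import json
import xml.etree.ElementTree as ET

record = {"name": "example-project", "language": "C++", "years": 7}

# JSON: braces, quotes, and colons
print(json.dumps(record, indent=2))

# XML: the same data as nested elements
root = ET.Element("record")
for key, value in record.items():
    ET.SubElement(root, key).text = str(value)
print(ET.tostring(root, encoding="unicode"))
```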

TL;DR: 1) There is a lot of "obsolete" technology in use out there, but since it isn't the sexy new thing, you don't hear about it much - but it's far from worthless to those employed with it. 2) Programming concepts build on top of each other and evolve. Few things are truly something you have to learn completely from scratch and forget the old.

It was once closer to true -- long ago, you had little choice but to program at a relatively low level of abstraction, which meant knowing a huge number of details that were no longer relevant on a new platform.

Over time, however, more and more programming is done at higher and higher levels of abstraction. A higher level of abstraction translates more or less directly to less concern with details that are likely to change and become obsolete quickly.

There are obviously people who work on things like device drivers or tiny embedded systems who still have to work at a low level of abstraction. Outside of areas like that, however, there's relatively little excuse for such things. Yes, a lot of people do learn a lot of trivia they never need, but if you're really using such stuff in your code much, chances are pretty good that you're just not making very good decisions. Most such things can (and more importantly, should) generally be avoided.

It depends on what you spend your time learning. I learned Bourne shell and C programming in 1980. I still use them every day. On the other hand, the time I spent learning Compuserve menu structures is a complete loss, and it wasn't really terribly useful even at the time. Then there are in-between things like RS-232 cable pin-outs and serial protocols: useless today, but essential for about ten years of my life. Pick the technologies you devote a lot of time to carefully.

There's a nugget of truth or relevance in here but I think it's presented inaccurately.

A better way to present this would be

How much of the knowledge you use today did you have 18-24 months ago?

or

In 18-24 months' time, how much of the knowledge that you will be applying do you already know? How much will you need to learn from today in order to complete those tasks?

It might depend on which field you're working in, but I know that I'm constantly working on new technology. Every project seems to have a huge chunk of new stuff that I need to learn - new frameworks & patterns, new approaches to slightly varied problems, or just new tools that are (supposedly!) better than what we previously used.

If every six-month project requires just 12.5% new knowledge, then over two years a full 50% of the knowledge used will be 'new'.
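
A quick sanity check of that arithmetic (my own sketch; the no-overlap assumption is mine, not the answer's):

```python
# Four six-month projects over two years, each introducing 12.5% new material.
# Assumes the new material in each project doesn't overlap with the others.
new_fraction_per_project = 0.125
projects_in_two_years = 4
print(new_fraction_per_project * projects_in_two_years)  # 0.5, i.e. half of what's in use is "new"
```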

I've heard this phrase attributed to engineering fields, not programming. More specifically, I've heard, "By the time you receive a Bachelor's Degree in Engineering, your first two years of study will be based on old technology." (Or something to that effect.)

I don't think it applies to programming at all. The only possible way I could see it relating is when features are deprecated or removed from a programming language/library/whatever.

I think you can easily disprove the statement by playing around with the 'you' in 'half of everything you know'.

There is some given distribution of knowledge, some of which will become obsolete (regardless of rate). So, if a given person only holds knowledge from the half of this spectrum which will remain after 18-24 months, they break the statement.

Robert Harvey nailed this, but after thinking about it, I'm compelled to throw brevity to the wind and answer.

I have to add a disclaimer, I didn't go digging into Perl On Rails the moment that it was announced. I had a feeling that it worked well for the very localized use that it was designed for and made a note of it for future reference.

I also did not succumb to the over 50 permutations of the standard C library that appeared over the course of the last two decades. I wish I could link to them, but they appear to now be existentially challenged.

Insert a long rant here about believing everything that you read or hear.

When something new comes out, grab it and have a look. If you say 'ick', drop it. If you say 'wow', improve it. If you can't contemplate such a decision, go pick the brains of people who can.

Judge everything on technical merit alone. Worth your time means it saves you time while getting a thumbs up from the majority of your peers.

Now, I'll address your question directly:

Half of everything you know will be obsolete in 18 - 24 months, true or false?

You'll have to let us know in 18 - 24 months. Companies pay substantial sums to get people talking about how great their product is. We have to wade through not only start-up companies but established giants that shell out considerable sums of cash to:

Have [sic] respected bloggers regurgitate a sales pitch to their readers

Send out expensive branded gratis gear to gain brand placement where it might be noticed through bragging or use

Pay people to make sure you see a 'working solution' in Google's top 10 when researching a problem

Pay for 'awards' from the 'top ten directory' sites and pretend those are authoritative

A veritable plethora of other means to convince people to stop thinking and just follow the crowd

You could, of course, form your own decisions based on previous experience and your trials with something new. While doing that, eschew employers that have managers who hand out commandments based on their RSS reader.

I do, however have this amazing new bridge building library that is smart enough to switch between Brooklyn and London based on your locale. It's going to be huge, do you want to get in on the ground floor?

My answer is indeed intentionally sardonic and perhaps the anti-boolean, but really? For exception handling purposes, my answer is a resounding false.

If you think something is technically sound - embrace it, otherwise it's business as usual. C is my primary language, it works just as well as it did nearly two decades ago, while I get paid over twice as well as I did nearly two decades ago.

Simply said, if it's a hot fad or trend and you're a good programmer, you'll read about it, then go back to what you normally do or work with.

Unless, that is, there is something vital to what you do, or new practices that make sense for you.

Just because something is new, somewhat new, or somewhat old doesn't make it the must-use solution for anything.

I have a simple phrase that covers all of this:

"If it works, use it"

That means if this new technology is awesomely cool but doesn't make your work more productive, higher quality, or less error-prone, or solve technical problems like mobile or client/server needs, then it's best to read about it and then ignore it until you have a practical use for it.

I have seen and read about more people wasting time trying to find the hot new thing than actually using it, which usually ends up being a total waste of time and money.

It is important to always learn, and practice, and improve your craft and skills.

However, you should learn what is useful, or what gives you different perspectives on solving the problems you normally have.

But other than that, you should go back to the basics of being an awesome programmer.

If that were the case, only 5.39×10⁻⁶ of The Mythical Man-Month would be relevant today. As it is, there are very few key principles that Fred Brooks details that have dated significantly or proved to be fundamentally false.
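
For what it's worth, here's a guess at the arithmetic behind that figure (a sketch under my own assumptions: the book was roughly 35 years old when this was written, and the half-life is taken at the 24-month end of the claim):

```python
# Fraction surviving repeated 24-month halvings: 0.5 ** (age / half_life).
half_life_years = 2    # "half of everything you know ... in 24 months"
age_years = 35         # The Mythical Man-Month dates from 1975 (assumed ~35 years old here)
surviving = 0.5 ** (age_years / half_life_years)
print(f"{surviving:.3g}")  # ~5.39e-06
```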

I'm not so sure. Some of the things are outdated (does anybody really use the Chief Programmer Team nowadays?), a very few are outright wrong (his conclusion on information hiding is the one that comes to mind), and quite a few things have been established in the popular culture, and arguably are no more relevant than Sigmund Freud's arguments that there are parts of our mind we're unconscious of. It's still worth a read, and of course there's far more than five parts in a million (two letters?) that are relevant.
–
David ThornleyMar 9 '11 at 21:35

Here's a better version of the sentence: half of everything you learned today (or this week, or this month, or this year) will be obsolete in a year or two. That's true - you learn how to do something in version 5 of a tool, and when 6 comes out it does that automatically, or you learn how to do something in a language that doesn't catch on, so you never use it again. But the other half of what you learn each day stays with you, and grows, and is what makes a 20-years-of-experience developer better than a two-years-of-experience one.

The average technology platform sticks around for somewhere between 10 and 25 years, so this seems pretty unlikely to me, even if you entirely discount the fact that knowledge of patterns persists across technologies. If you are on any sort of major platform you can count on that stack being popular for AT LEAST 5 or 6 years before it even starts to fade away. I know programmers who have been coding in RPG for 30 years using almost identical hardware and software tools.