Writing code that will still run years from now

Programming languages change.
Libraries change.
Some code from 5, 10, or even 20 years ago might still run and produce the expected results, whereas some code from just 2 years ago might fail with a syntax error.
This is partly inevitable, since languages evolve (at least, most do).
Developers have a responsibility to maintain their code. But sometimes stability is an important requirement in production code: the code should simply run for 10 years without someone having to go through it every year to adapt it to language changes.
Or I might have small scripts, for example for scientific data analysis, that I need to revisit after not touching them for years.
For example, meteorological offices run a lot of operational Fortran code even for parts where speed is not essential, and code stability is one of the reasons.
I've heard that fear of instability is one of the objections they have against moving to Python (apart from language inertia, of course; and a move is only feasible for new code that doesn't depend on the old code).
Of course, one strategy for stable code is to freeze the entire operating system. But that is not always feasible.

I'm using Python as an example, but the issue is not limited to Python in particular.

Documents on Python compatibility issues

In the case of Python, there are several documents outlining policy for backward-incompatible changes.

PEP-5

There must be at least a one-year transition period between the
release of the transitional version of Python and the release
of the backwards incompatible version. Users will have at
least a year to test their programs and migrate them from use
of the deprecated construct to the alternative one.

Personally, I consider that one year is rather short. It means I might write some code, and 1½ years from now it won't run anymore.

PEP 291

PEP 291 contains an incomplete list of guidelines for things to avoid in order to maintain backward compatibility. However, it relates only to Python 2.x. As Python 2.7 is the final release in the 2.x series and receives bugfixes only, this PEP is now mainly of historical interest.

Along with this, there are several rules you can infer that are probably true most of the time: don't call anything whose name starts with "_", don't monkey-patch anything, don't use dynamic class replacement on objects from classes other than your own, don't depend on the depth of inheritance hierarchies (for example, no ".__bases__[0].__bases__[0]"), make sure your tests run without producing any DeprecationWarnings, and be mindful of potential namespace conflicts when adding attributes to classes that inherit from other libraries. I don't think all of these things are written down in one place, though.

In addition, there were some points about "mine fields" (new features likely to change) and "frozen areas" (very solid APIs virtually guaranteed not to change). Quoting Antoine Pitrou:

I think the "frozen area" should be defined positively (explicit public APIs and
explicitly guaranteed behaviour) rather than negatively (an explicit "mine
field"). Otherwise, we will forget to put some important things in the minefield
and get bitten later when we need to change those things in a
backwards-incompatible way.

There doesn't seem to be any conclusion from this thread, but it gets pretty close to the core of what I'm looking for. The thread is almost four years old, so perhaps the situation has changed or improved. What kind of code is likely to survive, and what kind of code is more fragile?

Porting guidelines

Useful compatibility

PEP 3151 introduced me to the concept of useful compatibility. In my own words, this boils down to the idea that language developers only need to take care to maintain compatibility for code that is itself carefully written. It doesn't really define useful compatibility, but I think it's similar to the ideas I quoted from the PEP 387 discussion above.

From the programmers' point of view

As a programmer, I know that Python will change in the future and that people — most notably myself — will try to run my code perhaps several years from now, on a Python version one, two, or perhaps three minor versions up. Not everything will be compatible, and in fact it's easy to come up with code that will fail (I once encountered code stating if sys.version[:3] != '2.3': print 'Wrong version, exiting'). What I'm looking for is a set of guidelines on what to do and what not to do to enhance the chances that my code will still run unaltered in the future.
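That version check is a good illustration of the problem: sys.version[:3] compares a string prefix, which breaks as soon as a version component gains a digit (for Python 3.10, sys.version[:3] is '3.1'). Comparing the sys.version_info tuple is the stable equivalent; a minimal sketch:

```python
import sys

# Fragile: sys.version is a string, so slicing it compares text.
# For Python 3.10 the slice sys.version[:3] is '3.1', which breaks
# any check written with two-component versions in mind.

# Robust: sys.version_info is a tuple of ints and compares numerically,
# so it keeps working no matter how many digits a component has.
if sys.version_info < (3, 6):
    sys.exit("this script requires Python 3.6 or newer")

print(tuple(sys.version_info[:2]))  # e.g. (3, 11)
```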

Are there any such guidelines? How do I write Python code that will still run in the future?

My question relates both to the Python core, to its standard library, but also to commonly used add-on libraries, in particular numpy, scipy, matplotlib.

EDIT: So far, two of the answers relate to Python 2 vs. Python 3. This is not what I mean. I know about tools to migrate from Python 2 to Python 3. My question relates to language changes yet to come. Even without a crystal ball, we can identify coding guidelines that are more stable than others. For example:

import module is more future-proof than from module import *, because the latter can break code if module grows one or more new functions/classes.

Using undocumented methods may be less future-proof than using documented methods, as something being undocumented may be a sign of something being not stable yet.

It's this kind of practical coding advice that I'm after. Since it's about going from the present to the future, we can limit ourselves to Python 3, because Python 2 is not going to change anymore.

4 Answers

This is an unsolved problem in our field. There's no way to be sure that your code will indefinitely work. Even if your code was truly perfect in the forwards-compatible sense (and if it is, please come work for my company! ;) ), if it runs on, uses, or is used by any other software which gets a bug or changes in any way, your code may not work.

So I can't give you a list of things to do that, if you follow them, will guarantee success. But what you can do is minimize the risk of future breakages and minimize their impacts. A more knowledgeable Pythonist would be able to give you advice more specific to Python, so I will have to be more general:

avoid writing code that exploits implementation details. Code to interfaces, not implementations. Code against multiple implementations of the same interface. For example, run your code in CPython, Jython, and IronPython and see what happens. This will give you some great feedback about your code. This might not be as helpful for Python 3 though -- last I heard, some implementations were still on Python 2.

write simple, clear code that is explicit about its assumptions

write modular, composable code. If some code must do something dangerous (in the future-proof sense), separate it so that even if it has to change, the rest of the code doesn't.
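This isolation can be as simple as a one-function wrapper module of your own (the module name fileio and the helper read_text here are hypothetical): every caller goes through the wrapper, so if the underlying API changes in a future release, only this one module needs updating.

```python
# fileio.py -- the only place in the project that touches the
# underlying file API. If that API changes in a future release,
# this module is the single point that needs adapting.

def read_text(path):
    """Project-wide helper: return the contents of a text file."""
    # Today this is a thin wrapper; tomorrow it can absorb an API
    # change (new encoding defaults, a renamed function, ...)
    # without any of its callers noticing.
    with open(path, encoding="utf-8") as fh:
        return fh.read()
```

Callers then write from fileio import read_text and never touch the wrapped API directly.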

have a specification of some form. This is similar to the points about unit tests, if you use tests as a spec, and interfaces, which can also be used as specs. (I mean interface in the general sense, not the Java keyword sense).
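One lightweight way to keep such a specification next to the code is a doctest: the expected behaviour sits in the docstring and is checked mechanically, so a future Python or library change that alters the behaviour fails a test instead of being absorbed silently. A minimal sketch with a hypothetical median() helper:

```python
def median(values):
    """Return the median of a non-empty sequence of numbers.

    The docstring doubles as an executable specification:

    >>> median([3, 1, 2])
    2
    >>> median([1, 2, 3, 4])
    2.5
    """
    ordered = sorted(values)
    mid = len(ordered) // 2
    if len(ordered) % 2:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2

if __name__ == "__main__":
    import doctest
    doctest.testmod()   # fails loudly if the behaviour ever drifts
```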

Doing any of these things may/will increase the amount of work you have to do. I think that makes sense -- a lot of these points can also be made for how to write good code, which is quite difficult (in my opinion). Sometimes you may need to violate some of these suggestions. That is perfectly acceptable, but be aware of the costs.

It's great that the Python team is thinking about this, and for sure they are far more talented and skilled than I will ever be. Still, I would estimate there's a 100% chance that somebody's code somewhere will stop working the way it's intended to when Python is upgraded.

It's called Configuration Management. If the system is never changed, it shouldn't break. So don't change the system. Worried about new Python releases? Don't upgrade. Worried about new device drivers? Don't upgrade. Worried about Windows patches? ...

For Python 2 --> Python 3, there is the 2to3 tool, which comes with the standard Python installation.

Based on that, soon after new versions are released, there should be similar tools that come with each new version. However, as Martijn stated, tools like these will only be released for major versions (like version 3.0) but not for minor versions (such as 3.2). However, between 3.0 and 3.2 (or any other minor versions), there shouldn't be any compatibility issues, so converting to version 3.0 should be fine.

No, 2to3 only helps you upgrade code across the major version gap; there are no libraries (needed) to upgrade code across minor versions.
– Martijn Pieters, Feb 7 '13 at 17:02

@MartijnPieters Minor versions shouldn't have compatibility issues, as there shouldn't be excessively large changes. If there are compatibility issues and large changes, a completely new version should be released.
– F3AR3DLEGEND, Feb 7 '13 at 17:05

I don't have a lot to add; "program for 2, and use 2to3" seems to be a common adage around the internet lately. However, there is something you should look at:

It's called six (pypi page). It's a Python library devoted to helping you write code that runs on both Python 2 and Python 3. I've seen it employed in a number of projects while perusing the net, but the names escape me at the moment.