Posted
by
timothy
on Tuesday January 22, 2013 @10:02AM
from the start-packing dept.

CowboyRobot writes "In 25 years, an odd thing will happen to some of the no doubt very large number of computing devices in our world: an old, well-known and well-understood bug will cause their calculation of time to fail. The problem springs from the use of a 32-bit signed integer to store a time value, as a number of seconds since 00:00:00 UTC on Thursday, 1 January 1970, a practice begun in early UNIX systems with the standard C library data structure time_t. On January 19, 2038, at 03:14:08 UTC that integer will overflow. It's not difficult to come up with cases where the problem could be real today. Imagine a mortgage amortization program projecting payments out into the future for a 30-year mortgage. Or imagine those phony programs politicians use to project government expenditures, or demographic software, and so on. It's too early for panic, but those of us in the early parts of our careers will be the ones who have to deal with the problem."

Stop warning people! The Unix 32-bit Epoch cleanup is my retirement plan! I'll make a killing fixing legacy software when the kids out of school only know how to point and click through their GUI IDEs and don't know the difference between a short, an int, a long, and a long long... and time_t is completely foreign to them.

The main problem is actually not whether the time_t in the OS is 32-bit or 64-bit; hardware and operating systems get upgraded, and very few systems shipping today have a 32-bit time_t. So this will not be the main problem (even if it will be a problem with some systems — for example, some nuclear reactors are managed by custom unices that are no longer maintained, and nobody dares to touch those systems — but it is solvable in the few cases where it matters)!

The BIG problem for more mainstream software is all the software using binary file formats: such software will completely break down if time is stored as a 32-bit UNIX timestamp. There may also be problems for text-based file formats if, for example, the reading code uses %d with sscanf to read the timestamp — but if these are compiled with a modern compiler, the compiler will warn that %d is being used with a 64-bit time_t integer. That case is thus solvable, though it will require recompiling the software and some minor patching. Binary file formats will however be a pain in the ass, so bad that software systems will probably have to be certified as Y2038-safe.

I worked for a DoD contractor in 1999. Our customer had a piece of ground support software for the A-10 that they wanted to make Y2K compliant. Of course, we had to write up a detailed description of the problem and what it might affect so the customer could justify the expenditure to the bean counters. Then we had to write up a preliminary design review to get approval at the high level for what we were going to do. Then we had a critical design review so they could sign off on the detailed implementation of how we were supposed to implement the changes. I based my CDR on the code I had already written to fix the problem. Then I had to put together a testing specification so that we could prove that our date fix didn't destroy the functionality of completely unrelated code. Then we had to go test it on site, do a test report and a final project report.

Nine month project. I don't remember for sure, but I'm pretty sure that the government spent half a million on the entire program. I was the sole technical actor. I changed something like 20 lines of Pascal code, which took me maybe a week or two. I think I spent more time figuring out how to put together their ancient tool chain than writing actual code. Your government dollars at work.

With the acceleration of development that has been occurring over even the last 10 years, I highly doubt there will be much to worry about 25 years from now.

You sound like a manager. Leave this on the back burner for the next 24.5 years, then drop everything because the sky is falling.

I'm glad this issue got mentioned, because frankly I don't ever think about the Unix epoch time overflowing. It's a good thing to give programmers a nudge for their current (and future) development:
"Hey use 64 bit time values"
"Oh yeah, sure."
25 years later, no problem.

I've seen some relatively new systems using 32-bit time values, even some using the same API as Unix. It's relatively common in embedded systems. Many of them just go ahead and use signed 32-bit, as there's no need for internal times to be compatible with external systems.

Ultimately I think the problem arises from overusing time_t. Originally this was something you used to attach a time to a file in a file system, or a time to a log entry, etc. A timestamp with a limited range on a file system is not a problem: that media is not going to be active 40 years later, and while the file may last longer, in such cases it's presumably copied to new media with a different method of storing times. So it's not a problem for file systems to do this. The problem is that this time system was expanded to more general-purpose use, and over time it has become practically a de facto standard to stick with time_t and its 1/1/1970 base date. People don't think about converting from system time into an external time when exporting data; they just assume time_t is valid everywhere. Programmers use whatever time system they find in the system's libraries without worrying about whether it is an appropriate use or not.

Many application areas already have to avoid using time_t anyway and are used to converting from one time system to another: astronomy, genealogy, medical records, historical data, etc.