Comments for Unix Date Bug
A blog covering security and security technology.

Comment from Olaf on 2008-08-14:
This is kind of strange - I had a problem in Cygwin, so I tested on a SUSE 11 machine too.
$ date -d "03/09/2008 02:59:59"
date: invalid date `03/09/2008 02:59:59'

This happens with any time between 02:00:00 and 02:59:59 on this particular date.
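The rejected times fall squarely in a daylight-saving "spring forward" gap: on 2008-03-09, clocks in US time zones jumped from 02:00 straight to 03:00, so local times 02:00:00 through 02:59:59 never existed, and a strict parser refuses them. A minimal sketch with Python's zoneinfo, assuming a zone with that transition (America/New_York here):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

tz = ZoneInfo("America/New_York")

# 02:30 on 2008-03-09 never happened in this zone: clocks jumped 02:00 -> 03:00.
local = datetime(2008, 3, 9, 2, 30, tzinfo=tz)

# Round-tripping through UTC exposes the gap: the nonexistent 02:30
# normalizes to 03:30.
roundtrip = local.astimezone(ZoneInfo("UTC")).astimezone(tz)
print(local.time(), "->", roundtrip.time())
```

GNU date takes the stricter route and rejects the string outright, which is what the error above shows.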

Comment from Jeff Silver on 2007-12-11:
As there's only a little of 2007 left, I guess this is my last chance to alert people to the "year 2008" bug, which I first thought of shortly after Y2K died down. Even after all the Y2K work, there are still systems around that use a 2-digit year (generally with some sort of check so that small numbers are 2000-based while larger ones are 1900-based).
The problem will arise if a program reads a 2-digit year with a routine that uses the leading-zero-implies-octal convention; e.g. '%i' in a C format string, or int(s, 0) in Python.
This is fine for years up to 07, but in 08 either an error will result, or the number will be interpreted as zero (=> 2000).
I have actually seen this sort of thing for real once, but it was a 2-digit month number rather than a year.
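Both failure modes can be sketched; the parser below is a toy model of C's base-0 number scanning ('%i' / strtol with base 0), not any particular libc:

```python
def scan_i(s):
    """Toy model of scanf's '%i' / strtol with base 0: a leading '0'
    selects octal, then the longest run of valid digits is consumed."""
    digits, base, i = "0123456789", 10, 0
    if s[:2].lower() == "0x" and len(s) > 2:
        digits, base, i = "0123456789abcdefABCDEF", 16, 2
    elif s.startswith("0"):
        digits, base, i = "01234567", 8, 1   # the leading '0' is consumed
    j = i
    while j < len(s) and s[j] in digits:
        j += 1
    return int(s[i:j] or "0", base), s[j:]

print(scan_i("07"))   # (7, '')  -- octal 07 happens to equal decimal 7
print(scan_i("08"))   # (0, '8') -- '8' is no octal digit: year 08 reads as 0

# Python's int() with base 0 takes the other branch and errors outright:
try:
    int("08", 0)
except ValueError:
    print("ValueError")
```

(Python 2's int(s, 0) really did accept "07" as octal; Python 3 rejects leading zeros with base 0 altogether, which is the "an error will result" branch.)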
Anyway, watch out for strange bank statements next year!
Comment from Rich Painter on 2006-07-17 (http://painterengineering.com):
I wrote on these subjects long ago...

Comment from roger on 2006-06-28:
@Ross Patterson:
> The problem, just like Y2K, isn't so much about the current date as it is about all the dates that are stored...

While true, most of the nontrivial, medium- or long-term stuff is now stored in databases, nearly all of which support ANSI date standards (which generally have either a Y10K problem or else no problems for many millennia).

@Clive Robinson:
> Folks, the bad news is, as David Mery pointed out, that if your implementation is 32-bit signed then 2016 gets you.

That's 2106, not 2016. Going unsigned gives you longer, but means you need 64-bit times in order to understand anything before 1970.
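Both rollover dates are easy to check (a quick sketch; Python's datetime is 64-bit-safe, so it can display values a 32-bit time_t cannot hold):

```python
from datetime import datetime, timezone

utc = timezone.utc
# Last second representable in a signed 32-bit time_t:
print(datetime.fromtimestamp(2**31 - 1, tz=utc))    # 2038-01-19 03:14:07+00:00
# Last second in an unsigned 32-bit time_t:
print(datetime.fromtimestamp(2**32 - 1, tz=utc))    # 2106-02-07 06:28:15+00:00
# ...but unsigned gives up the pre-epoch range that signed covers:
print(datetime.fromtimestamp(-1, tz=utc))           # 1969-12-31 23:59:59+00:00
```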

The reason for choosing it as an epoch is that 1601 is the first year of a 400-year Gregorian leap cycle, which keeps the leap-year arithmetic simple, plus it's reasonably close to the calendar reform, or something like that.

Comment from nbk2000 on 2006-06-27:
I've just recently been getting spam dated January 18th, 2038.

Spammers joke? Or corrupted *nix machine?

Comment from Terence Davis on 2006-06-27:
I have a good one, though not Y2K related. I was working for a company that had developed a system with two ways to view system events: first via the standard Windows NT GUI, and second via a custom-written character-mode interface, à la VT-100. During the short window between when daylight saving time starts (or ends) in the US and when it ends/starts in Europe, there was a fun bug where the extra hour was subtracted twice. It took me a while to reproduce it, then another while to figure out what was going on (I had to look at the Visual C++ runtime source code), and then an eternity to get Microsoft to own up to and confirm my findings. The eventual solution was to turn the time into a filesystem-based time stamp that used 64-bit measures of, I'm not kidding here, pico-seconds since 50,000 BC (or something equivalently esoteric), then convert back to a string. MS said they would address the libc.dll defect in the next release... yeah, right.
Comment from Christoph Zurnieden on 2006-06-27:
> Windows NT (XP, etc.) internally uses a 64-bit time counter, with an epoch starting in 1601 and counting in 100ns intervals.

That's nearly the start of the Gregorian calendar, which begins at October 15, 1582 (not quite a "UTC epoch": UTC proper only begins in the 20th century; see e.g. http://en.wikipedia.org/wiki/UTC for a short summary). Why does the Microsoft epoch differ? Handling time with all those different calendars is a hard job in itself; there was absolutely no need to arbitrarily complicate writing portable software further.

CZ

Comment from Sam on 2006-06-27:
Haven't seen it mentioned on this thread, and it's worth noting: Windows NT (XP, etc.) internally uses a 64-bit time counter, with an epoch starting on January 1, 1601 and counting in 100ns intervals. Most of the Win32 time functions use this (e.g. FILETIME/SYSTEMTIME).
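A sketch of the relationship between the two epochs (the conversion helper below is illustrative, not a real Win32 call; the 11644473600-second offset between the 1601 and 1970 epochs is the well-known constant):

```python
from datetime import datetime, timezone

# FILETIME counts 100 ns ticks from January 1, 1601 (UTC).
EPOCH_DIFF_SECONDS = int((datetime(1970, 1, 1, tzinfo=timezone.utc)
                          - datetime(1601, 1, 1, tzinfo=timezone.utc)).total_seconds())
print(EPOCH_DIFF_SECONDS)                      # 11644473600

def filetime_to_unix(ft_ticks):
    """Convert a FILETIME tick count to Unix seconds (illustrative helper)."""
    return ft_ticks / 10_000_000 - EPOCH_DIFF_SECONDS

print(filetime_to_unix(116444736000000000))    # 0.0 -- the Unix epoch
```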
Comment from Christoph Zurnieden on 2006-06-27:
> Well, guess what: most Unices have bugs in their time routines, especially those that have POSIX compatibility, and 2016 is the date that fails most often...

Yes, most Unices have bugs in their time routines, but this behaviour is no bug; it's "posixly correct":
POSIX extends the ISO-9899:1999 definition "The range and precision of times representable in clock_t and time_t are implementation-defined" [7.23.1] to "time_t and clock_t shall be integer or real-floating types." The keyword "integer" includes both signed and unsigned integers, and in the current POSIX standard time_t is at least 32 bits wide.
On the other hand: the C structure 'tm' has a member "tm_year" of type "int", counting years since 1900, and INT_MAX is at least +32767 [ISO-9899:1999 5.2.4.2.1], so failing before the year 34667 (1900 + 32767) must be a bug and must either be repaired or, as Digital did it, documented somewhere.
So it's a good question whether it is a bug, and the vague wording of the relevant specifications doesn't help much here.

CZ

Comment from Clive Robinson on 2006-06-27:
Folks, the bad news is, as David Mery pointed out, that if your implementation is 32-bit signed then 2016 gets you.

Well, guess what: most Unices have bugs in their time routines, especially those that have POSIX compatibility, and 2016 is the date that fails most often...

If you hunt around the Internet you will find a copy of Digital's man page for time, which actually talks about a roll-over bug in 9999; the author obviously had a sense of humour when they wrote it.

P.S. For those using Solaris, check the POSIX compatibility. I was working in a team that discovered it was not Y2K compliant (with POSIX), but Sun decided that was not a Y2K issue. On speaking to somebody there at the time (who has since moved on to bigger and better things), they found that it was a known issue (noted in the source code) that nobody could be bothered to fix...

Comment from Alex S on 2006-06-26:
That's the next generation's problem! I handled my Y2K... let them deal with theirs! *evilgrin* I'll be retired by then!
Comment from Erik V. Olson on 2006-06-26:

Oh, good. I've always wondered something. What does false(1) return if it crashes, and is A) a bug, B) a feature and/or C) POSIX compliant?

Good question. sleep(1) is bound by POSIX to support 2^31 - 1 seconds [1]:
"As with all other utilities that take integral operands and do not specify subranges of allowed values, sleep is required by this volume of IEEE Std 1003.1-2001 to deal with time requests of up to 2147483647 seconds."

Because of the restriction of sleep(3):
"Application writers should note that the type of the argument seconds and the return value of sleep() is unsigned. That means that a Strictly Conforming POSIX System Interfaces Application cannot pass a value greater than the minimum guaranteed value for {UINT_MAX}, which the ISO C standard sets as 65535, and any application passing a larger value is restricting its portability. A different type was considered, but historical implementations, including those with a 16-bit int type, consistently use either unsigned or int." sleep(1) may "[...]have to make multiple calls to the delay mechanism of the underlying operating system if its argument range is less than this.". Also because of the unsigned return the negative value caused by the integer overflow let sleep(3) return 0. That means that 'sleep(1) 1000000000' should have failed in the morning of december, 31st 2037 and not before, therefore the sleep(1) utility used was not posix-compliant.

But I must admit: it was nonetheless one of the funnier WTFs ;-)

CZ

[1] All citations from: "The Open Group Base Specifications Issue 6, IEEE Std 1003.1, 2004". No direct links; you have to register first, sorry.

Comment from GSM Weenie on 2006-06-25:
I actually hit this bug in 1998... One of our GSM operator customers had a strange setup for their pre-payment cards. Each was valid for one year from date of purchase, but if you bought two at once the incentive for pre-purchasing was that they would be valid for two years. Someone found that the 3rd would be valid for 3 years, and so on... The operator was happy with this - they got more cash up front and didn't mind people buying in advance - after all, they were buying a cash card, not minutes. So if rates went up, no-one lost out.

Anyhow, buying 5 in 1998 meant you had valid pre-payment funds until 2003 (or until the funds were used up, whichever happened first). You can see where this is going.

Someone bought 40 in 1998, meaning their expiry date was now 2038. When they bought number 41, their expiry date was suddenly 1970, and the system did not handle it too gracefully.

I wasn't involved in the fix, but I suspect it was a combination of checking that the expected expiry date was later than the previous one, and limiting purchases to 5 at once... I hope to be retired by the time it comes up again!
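A sketch of the arithmetic, assuming, as the story suggests, an expiry held in a 32-bit signed time_t and extended by whole 365-day years per card (the dates and year length are illustrative):

```python
from datetime import datetime, timezone

YEAR = 365 * 86400                 # simplified billing year (assumption)
INT32_MAX = 2**31 - 1

purchase = int(datetime(1998, 1, 5, tzinfo=timezone.utc).timestamp())
expiry40 = purchase + 40 * YEAR    # a 2038 expiry: still fits in 32 bits
expiry41 = purchase + 41 * YEAR    # card number 41 pushes past the limit

print(expiry40 <= INT32_MAX)       # True
print(expiry41 <= INT32_MAX)       # False

# Reinterpreted as signed 32-bit, the wrapped value is negative (pre-1970);
# a system that clamps negatives to zero would then display 1970.
wrapped = expiry41 - 2**32
print(datetime.fromtimestamp(max(wrapped, 0), tz=timezone.utc).year)   # 1970
```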

Comment from another_bruce on 2006-06-24:
i always wanted to start the world's first y10k consulting firm. it's only 8000 years away, not too soon to prepare!
Comment from Anonymous on 2006-06-23 (http://cr.yp.to/libtai.html):

Dan Bernstein has his own 64-bit Unix time software in the public domain, called libtai.

Comment from Glenn Willen on 2006-06-23:
> Why should `sleep 1000000000` overflow?
> Isn't counting down from any arbitrary
> number down to zero (or the other
> direction) quite independent of the current
> date?

No, presumably it works by recording what time it is now, adding the appropriate amount to calculate the time it should wake, then going to sleep until that time. That way it doesn't have to use CPU cycles while it's waiting. But if the calculation of wake-time overflows, watch out.

Comment from Torbjorn Kristoffersen on 2006-06-23 (http://sgt@rasterburn.org):
> the format will change and i hope we will be by then 128 bit
> Posted by: Watches at June 23, 2006 03:40 PM

Um... 128-bit? You do realize that while a 32-bit signed integer lasts for approximately 68 years, a 64-bit signed integer will last for 292471208677 years!

A 128-bit signed integer will last for 5395141535403007094485264577495 years.

Clearly you can see 128-bits would be completely unnecessary as the earth is probably destroyed after 292471208677 years, as well as the sun :)
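The figures can be checked with the same 365-day year the comment evidently used as a divisor:

```python
SECONDS_PER_YEAR = 365 * 24 * 60 * 60    # the simplified 365-day year

print(2**31 // SECONDS_PER_YEAR)     # 68            -- signed 32-bit seconds
print(2**63 // SECONDS_PER_YEAR)     # 292471208677  -- signed 64-bit seconds
print(2**127 // SECONDS_PER_YEAR)    # the 128-bit figure quoted above
```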

Comment from Woo on 2006-06-23:
I still don't get the original problem...
Why should `sleep 1000000000` overflow? Isn't counting down from any arbitrary number down to zero (or the other direction) quite independent of the current date?! (no, I haven't looked up the implementation of the sleep command.. feel free to enlighten me ;) )
Comment from Jungsonn on 2006-06-23 (http://www.jungsonnstudios.com/blog/):
Ah that ol' Unix epoch time...
We can use Swatch time instead.
Comment from Jungsonn on 2006-06-23 (http://www.jungsonnstudios.com/blog/):
@Lucy

LOL - does M$ still exist?

Comment from David Mery on 2006-06-23 (http://gizmonaut.net):
There are many interesting dates; C's time_t failure will happen in January 2038 or in 2106, depending on whether you're using signed or unsigned integers.

You know, I remember thinking about this problem when I was a 17-year-old dev, and laughing about how 2038 was going to be much worse than 2000, as I happily compiled my code with 32-bit time quantities.

There was once a COBOL programmer in the mid to late 1900s. For
the sake of this story, we'll call him Jack. After years of being
taken for granted and treated as a technological dinosaur by all
the UNIX programmers and Client/Server programmers and Web site
developers, Jack was finally getting some respect. He'd become a
private consultant specializing in Year 2000 conversion. He was
working short-term assignments for prestigious companies,
traveling all over the world on different assignments. He was
working 70 and 80 and even 90 hour weeks, but it was worth it.

Several years of this relentless, mind-numbing work had taken its
toll on Jack. He had problems sleeping and began having anxiety
dreams about the year 2000. It had reached a point where even the
thought of the year 2000 made him nearly violent. He must have
suffered some sort of breakdown, because all he could think about
was how he could avoid the year 2000 and all that came with it.

Jack decided to contact a company that specialized in cryogenics.
He made a deal to have himself frozen until March 15, 2000. This
was a very expensive process and totally automated. He was
thrilled. The next thing he would know is he'd wake up in the
year 2000; after the New Year celebrations and computer debacles;
after the leap day. Nothing else to worry about except getting on
with his life.

He was put into his cryogenic receptacle, the technicians set the
revive date, he was given injections to slow his heartbeat to a
bare minimum, and that was that.

The next thing that Jack saw was an enormous and very modern room
filled with excited people. They were all shouting, "I can't
believe it!" and "It's a miracle!" and "He's alive!" There were
cameras (unlike any he'd ever seen) and equipment that looked
like it came out of a science fiction movie.

Someone who was obviously a spokesperson for the group stepped
forward. Jack couldn't contain his enthusiasm. "It is over?" he
asked. "Is 2000 already here? Are all the millennial parties and
promotions and crises all over and done with?"

The spokesman explained that there had been a problem with the
programming of the timer on Jack's cryogenic receptacle, it
hadn't been year 2000 compliant. It was actually 8,000 years
later, not the year 2000. But the spokesman told Jack that he
shouldn't get excited; someone important wanted to speak to him.

Suddenly, a wall-sized projection screen displayed the image of a
man who looked very much like Bill Gates. This man was Prime
Minister of Earth. He told Jack not to be upset. That this was a
wonderful time to be alive. That there was world peace and no
more starvation. That the space program had been reinstated and
there were colonies on the moon and on Mars. That technology had
advanced to such a degree that everyone had virtual reality
interfaces which allowed them to contact anyone else on the
planet, or to watch any entertainment, or to hear any music
recorded anywhere.

"Well," said the Prime Minister, "The year 10,000 is just around
the corner, and it says in your files that you know COBOL."

Comment from Woody on 2006-06-23:
Luckily, 64-bit time (for system time) gives us enough leeway to actually be free from this problem. Java's internal time is 64-bit, and is milliseconds since the epoch (the standard 1970 Unix epoch). It's good for an insane amount of time: about 292 million years each way for a signed value, if I did my math correctly.

2^63 / (365*24*60*60*1000) ≈ 292 million
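As a quick check of the formula above (Java's System.currentTimeMillis() returns a signed long, so the one-sided range is 2^63 milliseconds):

```python
MILLIS_PER_YEAR = 365 * 24 * 60 * 60 * 1000   # simplified 365-day year

# Signed 64-bit milliseconds reach about 292 million years on either
# side of the epoch; the full unsigned span would be twice that.
print(2**63 // MILLIS_PER_YEAR)    # 292471208
```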
Comment from Anonymous on 2006-06-23:
What does this have to do with squids?
Comment from Stu Savory on 2006-06-23 (http://www.savory.de/blog.htm):
My funniest Y2K story (I had duty that night) :

When the date rolled over, the fire alarm in (all the branch offices of) a certain bank, which I shall leave nameless to protect the guilty, triggered. So, for safety's sake, the security SW unlocked all the doors and even opened all the sliding doors :-) Luckily, the bank had the foresight to have staff on site in every branch at the rollover.

One would hope, but consider the number of backbone systems running *nix that management (and maybe even some of the engineering staff) is terrified of rebooting for fear it will never again come back. With any luck, this will include something harmless that nonetheless causes some interesting fireworks. :)

Yup, it's happened in the past. In the 1970s, IBM mainframes shifted their epoch from 1960-01-01 to 1900-01-01. The cleanup work was huge. The dominant date format was yyddd (remember Y2K?), but there were plenty of these "number of days since the epoch" dates lying around too.

The problem, just like Y2K, isn't so much about the current date as it is about all the dates that are stored in existing data and the processing of them. In 2037 we're gonna be partying like it was 1999 all over again.

With any luck, we'll be just as successful the second time around and everyone will ask "Where's the Earth-shattering kaboom? There's supposed to be an Earth-shattering kaboom!" again.

Comment from Anonymous on 2006-06-23:
>Another solution with the Unix date for systems unable to be moved to 64 bit time would be to move the Unix Epoch up to say Jan 1, 2001 which would move the day of reckoning to 2069

oh oh can you imagine how many systems THAT fix would break!!!

Comment from Stephen Smoogen on 2006-06-23:
Another solution, for systems unable to be moved to 64-bit time, would be to move the Unix epoch up to, say, Jan 1, 2001, which would move the day of reckoning to 2069.
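A quick check of where the shifted rollover would land:

```python
from datetime import datetime, timedelta, timezone

new_epoch = datetime(2001, 1, 1, tzinfo=timezone.utc)
# A signed 32-bit counter of seconds runs out 2**31 - 1 seconds later:
rollover = new_epoch + timedelta(seconds=2**31 - 1)
print(rollover)   # 2069-01-19 03:14:07+00:00
```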
Comment from radiantmatrix on 2006-06-23 (http://radiantmatrix.org):
@Victor:

The patch is ready, and some UNIX-likes (including Linux, IIRC) already use 64-bit dates with a mere compile-time option. Fortunately, extending to 64 bits will only cause problems with applications that depend on the value "wrapping around", which is pretty rare.

Comment from Michael Lee on 2006-06-23:
Just as Y2K provided for the retirement of a bunch of COBOL programmers, the 2038 bug (time to think of a sexy name for it now) will provide it for C programmers.
Comment from Victor Bogado on 2006-06-23:
When are we going to start using 64 bits for this? I find the Unix date very clever and simple. It does not rely on local standards or calendar details such as leap seconds.

Even with today's 32-bit systems we could update the date to 64 bits (it is a long long in the Linux C ABI) and create a deprecated API to read and set the 32-bit value, one that would emit a compiler warning.