Date: Fri, 14 Aug 2015 09:26:59 -0700
From: Paul Eggert <eggert at cs.ucla.edu>
Message-ID: <55CE16D3.3030803 at cs.ucla.edu>
| Microsoft file times are unsigned 64-bit quantities that count the number
| of 100ns intervals
that's 0.1 us. Amazing.
| since 1601-01-01 00:00:00 universal time.
That's "recent enough" that it is probably not material.
| If his system is using this format, that'd explain why it overflows
| for the Big Bang
Yes.
| though it wouldn't explain why the result was dated 1970.
It could, depending upon how the conversion is done - we know that in
tzdata files, 0 == 1970-01-01.
So, take the big bang value 0xF800000000000000 and multiply by
10*1000*1000; the result is 0x93D1CC0000000000000000. Truncate that to
64 bits (leaving 0), then add the constant conversion factor to adjust
the unix epoch based time to the windows one.
Then print that, and you get 1970, just as you would have if you'd started
with a true 1970-01-01 timestamp (ie: 0).
This seems very likely to be the problem. The bug is that whatever is
doing the conversion isn't range checking the input - if the unix time_t
value is smaller (or bigger) than their format can represent, they should
either generate an error, or clamp it to the earliest (or latest) time
that the format can represent.
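A range-checked converter of the kind suggested might look like the
following - a minimal sketch, assuming an error-return style; the
function name and error convention are made up, but the representable
range falls directly out of the format (100ns ticks, unsigned 64 bits,
epoch 1601):

```c
/* Range-checked conversion: reject time_t values the Microsoft
 * file-time format cannot represent, instead of silently wrapping.
 * (Hypothetical interface; clamping instead of erroring would also do.) */
#include <stdint.h>

#define EPOCH_DIFF_SECS 11644473600LL /* seconds, 1601-01-01 .. 1970-01-01 */

/* Returns 0 on success with the result in *ft, -1 if t is out of range. */
int unix_to_filetime_checked(int64_t t, uint64_t *ft)
{
    if (t < -EPOCH_DIFF_SECS)
        return -1;                    /* before 1601-01-01: not representable */
    uint64_t secs_since_1601 = (uint64_t)(t + EPOCH_DIFF_SECS);
    if (secs_since_1601 > UINT64_MAX / 10000000ULL)
        return -1;                    /* too far in the future to fit */
    *ft = secs_since_1601 * 10000000ULL;
    return 0;
}
```

With this, the big-bang input yields an error rather than a bogus 1970
date, while time_t 0 still converts to the expected 1970-01-01 file time.

kre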
kre