Eliminating leap-seconds is like defining pi to be 3.1. Is there anyone in the world who thinks this is a good idea?

This would mean that atomic midnight and solar midnight would drift apart from each other by seconds per decade. That's madness.

I've been hearing about this for years, but I mostly disregarded it because it was so ludicrous on its face that I was sure it was just some reporter not knowing what he was talking about. But it sounds like it's actually for real.

A proposal to fundamentally redefine UTC will come to a conclusive vote in January 2012 at the Radiocommunications Assembly of the ITU-R in Geneva. It would halt the intercalary adjustments known as leap seconds that maintain UTC as a form of Universal Time, and eliminate the requirement that time services transmit the varying residuals (DUT1) between Coordinated Universal Time and Universal Time of day. If the proposal is approved, UTC would not keep pace with Earth rotation and the value of DUT1 would become unconstrained.

Adverse impacts from redefining UTC have not been extensively researched and documented. The implications extend from technical infrastructure to legal, historical, logistical, sociological and economic domains. Affected technologies may include (but are not limited to) applications in astronomy, astrodynamics and celestial mechanics, geodesy, ground-to-space satellite communications, navigation, remote sensing and space surveillance.

40 Responses:

Eliminating leap-seconds is like defining pi to be 3.1. Is there anyone in the world who thinks this is a good idea?

Not really. pi is a fixed value and we can work it out to whatever precision we need. OTOH, there is no "length of an Earth day", as each one is different. However, for many applications it is useful to have one value we can use as the "length of a standard day". Provided we don't pretend that a standard day is an Earth day (as those who would redefine pi do), remember that they are different, and keep an accurate track of the difference between "standard time" and "earth time", I don't see why it's automatically a bad idea.

This would mean that atomic midnight and solar midnight would drift apart from each other by seconds per decade. That's madness.

I think you misunderstand what's going on here. In UTC, seconds are of a fixed, non-varying length, but the number of seconds in a given year varies, in order to keep UTC in sync with observed solar time (since the length of the solar day varies, tending to lengthen as the rotation of the Earth slows down over millennia). They are proposing to eliminate this, basically making UTC equivalent to TAI (plus a fixed offset), instead of UTC being a compromise between TAI and UT1.

If you have a definition of "standard day" that is not "the length of an Earth day, as it is observed" then after a while midnight moves, and you have exactly the same problem that necessitated replacement of the Julian calendar with the Gregorian.

You might as well declare that a "day" should be metric, and made up of 10 hours of 100 minutes of 100 seconds. That's just not a useful unit for anything or anyone.

So? If midnight gets too far out from the position of the sun/stars (say, 5 minutes out, a couple of centuries from now) you could just alter all the timezones by 5 minutes to be e.g. UTC-0455, UTC+0605, etc...

It's not about the amount. It's where you acknowledge the difference between TAI and local time.

I fail to see the point of UTC. Local time is defined as a difference from UTC, which is sort of constant, except for those unpredictable leap seconds, which means UTC is itself defined as a difference from TAI. The important practical difference is that everyone who works with dates and times remembers to deal with varying timezone differences from a baseline, but many fail to deal correctly with the UTC differences and those extra seconds that appear/disappear.

Why not cut out the middleman, ignore UTC/leap seconds, and just define local time as a difference from TAI, allowing for sub-hour adjustments in timezone differences as necessary? Using timezone differences with a resolution of seconds seems like overkill, given the width of a timezone.

Nobody fails to deal with leap seconds because everyone gets their UTC updates from ntpd or GPS, and it's already baked in.

Whereas, it's a rare piece of software that deals with timezones/DST properly. Also the timezone and DST rules are in constant flux and are hard to update, so everyone's tables are always slightly out of date. Client-side timezone handling is a minefield even when the developers know what they're doing!

You say this is "eliminating the middleman"; I say it's pushing a hard problem downstream instead of handling it once at the source. The upstream approach has been demonstrably working fine for decades.

I swear I've seen a lot more code that assumes the "seconds" part of a time can never be 60, or that adding 86400 seconds to a timestamp will always give you the same time on the next day, or even the next day at all (no: 00:00:00 + 86400 can be 23:59:60 on the same day), than code that gets timezone handling wrong. It's not that hard, and people generally think about DST changes at least a couple of times per year, unlike leap seconds, which most people never think about at all.
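For the curious, that first bug is easy to demonstrate in Python: the C-derived `time.strptime` accepts a leap-second timestamp, while `datetime`, which can only represent seconds 0..59, rejects the very same instant. (2008-12-31 23:59:60 was a real leap second.)

```python
import time
from datetime import datetime

# The C library's struct tm allows tm_sec up to 61, precisely so that
# leap-second timestamps can be represented; Python's time.strptime
# inherits that, so this parses fine:
parsed = time.strptime("2008-12-31 23:59:60", "%Y-%m-%d %H:%M:%S")

# datetime, by contrast, can only hold seconds 0..59, so the same
# (perfectly real) timestamp is rejected outright:
rejected = False
try:
    datetime.strptime("2008-12-31 23:59:60", "%Y-%m-%d %H:%M:%S")
except ValueError:
    rejected = True
```

Any code path that round-trips timestamps through `datetime` will choke on a leap second it receives from a source that represents them honestly.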

But you've probably seen more code than me, so I can't really argue with you on that.

One thing though...

DST rules are in constant flux and are hard to update, so everyone's tables are always slightly out of date.

Really? Aren't DST rules automatically updated on a regular basis alongside general updates/patches/security fixes on every OS out there?

It's wishful thinking to believe that all systems have all the latest patches, or even patches from the last 5 years. And then there are monkey-wrenches like the fact that every installation of Java has its own copy of zoneinfo -- and the mess that is Java often requires that a single system have multiple different versions of Java installed, which implies that they can't be auto-upgraded. Steve's links below contain more examples of this kind of thing.

Code that assumes there are only 60 seconds in a minute is just objectively buggy. Whereas systems that merely have out-of-date zoneinfo files might be not running the latest OS updates for any number of completely sane and non-negotiable reasons.

And that's not even counting appliances! I have a personal, concrete example of this: I own a clock that gets its time from GPS. Well, I bought it before the last time the DST rules changed, which means that every 6 months, I had to re-set the time manually, because it was going by the old rules. To fix this, I had to mail it back to the guy who built it and have him replace the firmware. That's what happens when you have a downstream solution instead of upstream.
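To make the firmware problem concrete, here's a Python sketch of why baked-in rules fail: the US moved the start of DST from the first Sunday in April to the second Sunday in March in 2007, so a device with the old rule burned into ROM is wrong for roughly three weeks every spring. The helper name is mine, not from any real firmware.

```python
from datetime import date, timedelta

def nth_weekday(year, month, weekday, n):
    """Date of the n-th given weekday (Mon=0 .. Sun=6) of a month."""
    first = date(year, month, 1)
    offset = (weekday - first.weekday()) % 7
    return first + timedelta(days=offset + 7 * (n - 1))

year = 2011
old_rule = nth_weekday(year, 4, 6, 1)   # pre-2007 US rule: first Sunday in April
new_rule = nth_weekday(year, 3, 6, 2)   # post-2007 US rule: second Sunday in March
```

For 2011 the old rule fires on April 3 and the new one on March 13, so a device with the old rule shows the wrong time for those three weeks.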

Code that assumes there are only 60 seconds in a minute is just objectively buggy.

I totally agree. I was just pointing out that I'd seen that bug more often than timezone-related ones. (And I've seen plenty of those.)

My main point is that maintaining local time as a difference from TAI is simpler overall than maintaining local time as a difference from UTC, which is itself maintained as a difference from TAI. Plus, the timezone update problems you report are a red herring here, and will continue to happen anyway, no matter what the resolution of the leap-second issue is. Further, if leap second changes are folded into timezone data at multi-century intervals, they can be scheduled decades in advance, giving people plenty of time to accommodate/prepare for them and update whatever might need updating.
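A minimal sketch of what "local time as a difference from TAI" might look like: one sorted per-zone table whose entries fold the timezone/DST offset and the accumulated leap seconds into a single number, so conversion is a single lookup. All table values here are made up for illustration.

```python
from bisect import bisect_right

# Each entry: (TAI instant at which it takes effect, total offset from
# TAI in seconds). The offset folds together the zone's UTC offset
# (including DST) and the leap seconds accumulated so far.
# Numbers below are illustrative, not real data.
OFFSET_TABLE = [
    (0,          -18000 - 10),   # UTC-5 zone, 10 leap seconds accumulated
    (1000000000, -18000 - 24),   # same zone later, 24 leap seconds
    (2000000000, -14400 - 34),   # DST in effect, 34 leap seconds
]

def tai_to_local(tai_seconds):
    """Find the table entry in effect at tai_seconds and apply it."""
    i = bisect_right(OFFSET_TABLE, (tai_seconds, float("inf"))) - 1
    return tai_seconds + OFFSET_TABLE[i][1]
```

The appeal of this design is that there is exactly one table to keep up to date per zone, instead of a timezone database plus a separate leap-second table applied at a different layer.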

There's also the fun that not every app-development layer even has access to zoneinfo. Try writing a scheduling app that's timezone/DST-compliant but runs entirely in the browser, and you'll find you have no DST support at all. The browser can tell you whether a time in your local time zone is in DST (or will be), but whether DST applies when setting a time in some other time zone is anybody's guess.

Using UTC alone doesn't help at all on this front, because just storing a GMT offset means you can't resolve the time back again from GMT. UTC-5 might be Eastern, Central during DST, or several South American options (or Indiana during the DST months... maybe...), so an all-JavaScript client has to have downloaded all of the DST rules and preserve the user-selected time zone in full, separate from the actual resultant UTC time they selected.
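Python's `zoneinfo` module makes the ambiguity easy to demonstrate (the zone choices are just examples): two zones that share an offset in January have different offsets in July, so a bare offset can never be reliably mapped back to a zone.

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo  # Python 3.9+, uses the system tz database

ny = ZoneInfo("America/New_York")
bogota = ZoneInfo("America/Bogota")

jan = datetime(2011, 1, 15, 12, 0)
jul = datetime(2011, 7, 15, 12, 0)

# In January both zones sit at UTC-5; a bare "-0500" can't tell them apart.
jan_ny = jan.replace(tzinfo=ny).utcoffset()
jan_bog = jan.replace(tzinfo=bogota).utcoffset()

# By July they've diverged: New York observes DST, Bogota doesn't.
jul_ny = jul.replace(tzinfo=ny).utcoffset()
jul_bog = jul.replace(tzinfo=bogota).utcoffset()
```

This is exactly why you must store the zone identifier (e.g. "America/New_York"), not the offset, alongside any user-selected time.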

Then there are countries that can't make up their minds about DST and (in the case of Albania) actually vote on it in legislation on a year-by-year basis. Businesses follow EU practice (which usually means Berlin), but state-run services have to follow the law, which may be different from one year to the next.

GPS does not run on UTC, so getting UTC updates from GPS is whack. GPS doesn't know about leap seconds.

Daylight saving is another problem. David Prerau's 'Saving the Daylight' presents it as an amazing accomplishment, which might have something to do with having made his living describing the benefits of daylight saving, rather than considering the computing cost and problems introduced.

How hard is it to get up an hour later and go to work an hour later six months of the year? Why do we have to be fooled by setting the clocks back? What latitudes does this behaviour even make sense at?

The nav message doesn't provide a complete history of leap seconds, which makes computing time differences across epochs rather difficult. More importantly, it doesn't tell you about an upcoming leap second in advance; there are interesting problems in correcting after the fact. GPS alone is not enough; time is a database problem.

Reusing "UTC" to suddenly mean yet another thing, a weird UTC/TAI hybrid that retains a 34-second offset for eternity (or, realistically, until 2020, when someone comes up with the next Great Idea)? That's practically brand necrophilia. Call it UTI or something and leave my Zulu time alone. (Or, well, don't leave it alone: keep adding leap seconds to it, even if you think that in a perfect world you wouldn't have made that decision based on what you know today.) You can't change your mind every decade.

Aren't there other conventions to destructively redefine first? Electrons still carry negative charge, "phosphate" means different things in organic and inorganic chemistry, Pluto's a planet, short-day plants are really long-night plants, people are calling beetles "bugs" with nary a punishment, and thymidine breaks the base-nucleoside naming rule. Once you're done with those, fly to the UK to convince all the ffordes to capitalize their names, arrive at a consensus on what aspirin, acetaminophen, and adrenaline are called, and rent a car to start a grassroots conversion campaign towards driving on the right (bonus points if you do so at 60% of the actual speed limit, explaining that you refuse to accept imperial units).

And a golden shower should totally count as cleaning yourself! (I, uhm, appear to have picked a urine theme. Apparently the official suggestion is "TI" rather than "UTI" for the let's-piss-all-over-accepted-standards idea.)

Seriously, that's not how adjectives work in English or indeed in any language that's not legalese or mathematics: a skew field isn't a field, it's a non-commutative field (which isn't a field either, except when it's a commutative non-commutative field), your car's previous owner isn't its owner, vegetarian turkey's not turkey, and a dwarf planet isn't a planet.

It depends on the adjective. And in the case of "dwarf", it is as I said: a dwarf mammoth is a mammoth, a dwarf hippo is a hippo, and a dwarf human is a human. So a dwarf planet is a planet, and thus we have 13 known planets in our solar system (and several candidates).

"Aren't DST rules automatically updated on a regular basis alongside general updates/patches/security fixes on every OS out there?"

The OSes on my VCR and PDA haven't been updated in a long, long time. And in the case of the VCR, updating would mean finding a new ROM for a long-discontinued model. (Worse, I have two VCRs and a DVD recorder, and I can never remember which of them switch to DST on the currently-correct date, which switch on a no-longer-correct date, and which only switch manually.) I've got a little battery-powered time-and-weather gizmo that changes on the wrong date as well, IIRC.

The definition of "second" is fine. The duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium 133 atom seems legit to me.

This would mean that atomic midnight and solar midnight would drift apart from each other by seconds per decade.

But atomic midnight and solar midnight can already differ by more than an hour due to timezones. If we can cope with that kind of difference, we can surely cope with the odd second or two. A second's difference is about the same as moving 460 metres east or west at the equator.
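Running the numbers on that comparison (for the equator): a second of *time* corresponds to several hundred metres of rotation, while a figure of roughly 30 metres belongs to a second of *arc*; the two are easy to confuse.

```python
# How far does a point on the equator travel in one second of rotation?
circumference_m = 40_075_000            # Earth's equatorial circumference, metres
metres_per_time_second = circumference_m / 86400          # seconds in a day

# For comparison, one second of arc of longitude at the equator:
metres_per_arc_second = circumference_m / (360 * 60 * 60)  # arcseconds in a circle
```

That works out to roughly 464 metres per second of time versus roughly 31 metres per second of arc.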

Last I checked he just wanted developers to use TAI and do the time-is-a-database-problem conversions at the moment of necessity (which is pretty trivial with UNIX-type filters, as far as logging goes).

It wasn't hard dealing with leap seconds in code. I've done it and the code worked fine.

What was really painful was to determine whether the data being input was already adjusted for leap seconds or not. Nobody ever documents that, and just saying "UTC" is not sufficient information.

As for GPS my recollection (probably fuzzy) is that GPS sends one packet with the time, and another (less often) with the number of seconds of adjustment. I forget whether the "time" is already adjusted or not.

I do think it would be cool if we displayed all "current time" as seconds since the epoch.

GPS time is NOT adjusted for leap seconds, but it's a different offset than TAI, because GPS time zero point is (according to leapsecond.com) Jan 6 1980. The GPS satellites broadcast the current GPS-to-UTC offset to make it trivial to compute current UTC from GPS, but as Lloyd points out above, there's no history available so trying to get anything but current time is nontrivial.
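A sketch of the current-time conversion that the broadcast offset enables. The constants are the well-known GPS epoch expressed as a Unix timestamp and the GPS-to-UTC offset as of 2011; the function name is mine, for illustration.

```python
GPS_EPOCH_UNIX = 315964800   # Jan 6 1980 00:00:00 UTC as a Unix timestamp
GPS_UTC_OFFSET = 15          # leap seconds since the GPS epoch, as of 2011
                             # (this is the value the satellites broadcast)

def gps_to_unix_utc(gps_seconds):
    """Seconds-since-GPS-epoch -> Unix time (UTC), valid for CURRENT times only.

    For historical timestamps you'd need the full leap-second table,
    which the nav message does not carry. Hence: time is a database problem.
    """
    return GPS_EPOCH_UNIX + gps_seconds - GPS_UTC_OFFSET
```

Note the asymmetry: the single broadcast offset makes "what is UTC right now?" trivial, while "what was UTC at this past GPS timestamp?" requires data the signal doesn't provide.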

/agree on all points. As I recall, at the time I was fooling with code, the adjustment was either 13 or 14 from the Unix epoch. As mentioned elsewhere, the total official from-zero adjustment is 34.

I remember thinking about incorporating history (which is available) into the code, but never got around to it: most of the data was recent enough it didn't matter in practice. It was a fun thing to learn about, though.

John Herschel proposed a correction to the Gregorian calendar, making years that are multiples of 4000 not leap years, thus reducing the average length of the calendar year from 365.2425 days to 365.24225. Although this is closer to the mean tropical year of 365.24219 days, his proposal has never been adopted because the Gregorian calendar is based on the mean time between vernal equinoxes (currently 365.2424 days).
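The arithmetic behind those figures is easy to check: each term of the leap-year rule contributes a fraction of a day to the mean calendar-year length.

```python
# Mean calendar-year length implied by each set of leap-year rules:
julian    = 365 + 1/4                   # leap year every 4 years: 365.25
gregorian = julian - 1/100 + 1/400      # drop centuries, keep multiples of 400: 365.2425
herschel  = gregorian - 1/4000          # Herschel: multiples of 4000 not leap: 365.24225
```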

The insertion of leap seconds is unpredictable, occurs with as little as six months' notice, and is an unnecessary pain in the ass to deal with when doing time calculations.

Who the hell cares if civil time is off by a few seconds, or even a few minutes, compared to the unpredictable and ever-changing rotation of an arbitrary ball of rock? I sure as hell don't, especially since it makes my life EASIER, and you don't seem to articulate any practical problem it will cause you, you're just whining that it doesn't match historical practice.

Guess what? Sometimes historical practice sucks, and we can do better.

All most of us are trying to do is deal with scheduling and coordination amongst humans. We couldn't care less if the third rock from the sun agrees with our scheduling, it has no impact.

Those few specialists for whom the precise duration of Earth's rotation has an _actual_ impact already use far more accurate measurements than ordinary civil users of time, and this change would have virtually no impact on them.

Poul-Henning Kamp has written about the proposed change in CACM/ACM Queue: http://queue.acm.org/detail.cfm?id=1967009
His text is actually somewhat balanced. Lots of discussion there too, including this important pointer:

Hours. They'd shove in leap hours to keep things in sync. One every friggazillion years.

Right now leap seconds are AWFUL, and you cannot predict for sure whether there will be one even a year in advance; you only get six months. There are people who need to put UTC handling in hardware, hardware that must be certified and then sent into production, and then god knows if it will EVER see a firmware upgrade, or even THE network, EVER. (After Stuxnet? Much more unlikely.)

Right now those things use GPS time. GPS time is ALREADY flipping leap seconds the finger (they ain't gonna cope with that level of craziness) and IS drifting with no backup plan. The UTC resolution is about having a backup plan: count the leap seconds until they make up an hour, and THEN act upon it. (Massive update.)

It's possibly gonna happen in 3600 years, so every contractor out there is ok with that :)

Don't act like it's Muhammad cancelling a couple of months to support peace. (Yes, he did; there was a dumb rationale behind it that somehow worked.)

I still think we should switch to a calendar with 13 months of 28 days each plus 1 Midyear Day. At least then that much would be consistent. As it is, there's really no such thing as 'a month' as a unit of measurement, since the definition of it varies from month to month. And that's the problem with leap seconds, they add unnecessary complexity, inconsistency, and unpredictability into the equations. Units of measurement should be consistent.

Imagine the problems if the length of a foot (or meter) varied every time you measured something. Carpenters would be building chaotic contraptions, trains would have to have extremely flexible axles to make up for tracks that wobble back and forth at varying distances (despite always being the same number of inches apart), and real estate pricing would be all messed up. "Let's see how many square feet that office is today. Last time we measured, it was 1200, but it was 1400 the time before that, it could be 1000 today."

Leap seconds are bad because they hit at the lowest common level, throwing off all greater measurements (minutes, hours, days, weeks, months, years). None of our time measurements can then be expected to have any level of accuracy or consistency. Leap millennia would be less of a problem, but even so, there just doesn't seem to be any practical need for them. So what if someday the sun will be highest at 00:00 instead of 12:00? Just gradually shift your sleeping patterns to match as the centuries go by. Or not; we have things like candles and electric lights now.