
Mortimer.CA writes "As discussed on Slashdot previously, there is a proposal to remove leap seconds from UTC (nee 'Greenwich' time). It will be put to a vote to ITU member states during 2008, and if 70% agree, the leap second will be eliminated by 2013. There is some debate as to whether this change is a good or bad idea. The proposal calls for a 'leap-hour' in about 600 years, which nobody seems to believe is a good idea. One philosophical point opponents make is that the 'official' time on Earth should match the time of the sun and heavens."

We can ignore the problem then too. Eventually, morning and evening will be on different days. We might just gain or lose a whole day. Heck, we can ignore the problem forever. We'll be off by a year, then a decade...


Ok guys. We're in California, it's midday 26th June while I have snow falling on my face and I can't see shit because it's new moon.

It has always seemed to me that there should be computer epoch time, and then a conversion from that epoch into a time that makes sense for the user. So, computer time units could be fixed to the vibrations of your favorite atom, and human time could be fixed to the orbit and spin of your favorite planet. All systems would do a conversion between the time systems at display, and different systems could do different conversions. Application programmers could remain oblivious to the conversions if all time were stored in a universal fixed format independent of any particular planet, orbit, or galaxy.

Basically, you compute what time of day it is based on your clock ticks and the orbit and spin of your planet. You don't need to model the entire orbital mechanics of your planet... if you think about it, that's what all "time of day" systems do now: highly simplified models of the Earth in space. We know that the Earth will be inside the zone of space we call "November", and we know it will be turned to the position we call 6am UTC, when the clock ticks out this number or any number in this modulo. As we become more demanding of time and more exacting about the position of the planet in space, we need to make more sophisticated orbital models... or allow for heuristic adjustments to existing lookup-table-based models.

Time, as in time-space, has nothing to do with any of this, and it is the passage of time in space that a computer should be worried about keeping track of inside itself... not where the sun is. If you want "where is the sun?" you should use a conversion or algorithm to calculate it, and the "time" inside the computer should be seen as the number of clock cycles that computer has experienced. Using clock ticks alone, your computer can probably do a fair job of guessing where the sun is... but that's not what computer time is about.

Of course, these ideas neglect relativity. Eventually we'll have to deal with relativity and clock ticks. I suppose you would have to decide on a set of arbitrary points in the cosmos and call their inertial frames of reference "fixed", then use them to compute temporal differentials via a kind of relativistic triangulation... say, clocks in three star systems that transmit their time beats out to the universe; based on the time you read from each at your point in space, you can triangulate your position and time-shift due to relativistic effects. But I think I may be getting a few centuries ahead of myself.

And, it doesn't matter what I think anyway. It's not like anybody in a position to influence these decisions and ideas reads Slashdot. If you started now you could probably get all the digital clocks in the world to work on these principles in about a hundred years.
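A minimal sketch of that storage-vs-display split, in Python. Unix seconds stand in for the "atomic" tick count, and the stored value is just an illustrative number:

```python
import datetime

# Store all timestamps as a fixed, planet-independent count: here,
# seconds since the Unix epoch stand in for an "atomic" tick count.
stored = 1_214_400_000  # an opaque number; storage never touches display logic

# Conversion to human time happens only at the display layer, and
# different systems are free to convert differently.
utc_view = datetime.datetime.fromtimestamp(stored, tz=datetime.timezone.utc)
local_view = utc_view.astimezone()  # whatever the viewer's planet/zone dictates

print(utc_view.isoformat())  # 2008-06-25T13:20:00+00:00
```

The application only ever handles the opaque count; the conversion policy lives entirely in the display layer.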

I thought of this issue years ago, and had actually sat down and done the math at one point... basically, to solve the time discrepancy, just slightly lengthen the second. Everything lines up.
Of course, every book, piece of software, scientific instrument, piece of medical equipment... well, basically everything in human civilization... would need to be re-built, re-calibrated, re-programmed, re-manufactured, etc. If nothing else, we'd stimulate the living hell out of the world's economy.

Of course, the real problem is that the rotation of the Earth is not constant (the leap seconds are mostly driven by fluid motions in the core).

Originally, back in the 1960's, instead of the leap seconds, they (the BIH at the time) adjusted the rate of the UTC seconds with respect to TAI. This was widely viewed as not a good thing once it was tried and was dropped, IIRC in 1972.

The first AC reply to your idea is correct, but I have a feeling you still might not understand his point. The "leap seconds" we are talking about are not, like the extra days in leap years, always added to the length of the day. Sometimes they are subtracted. I am not an expert, but the "exact second" calculation you want to make, averaged over a long enough period of time, seems to me to depend on the motions of every sizeable object in the Solar System and probably also (or maybe even more strongly)

Yay, nothing like reliving the thrill of Y2K. Except that we don't have to. One second in 600 years is about 1/18921600000, or roughly 0.000000005%. In a day, the difference between the two timescales will produce an offset of 1/220000th of a second, or about 5 microseconds. With the possible exception of atomic clocks, no analog or digital device is this precise.

Since any "precise" timekeeping requires periodic synchronization with the world's atomic clocks and astronomical observatories, we'd only need to chang
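For what it's worth, a quick sanity check of the figures above (pure arithmetic, nothing authoritative):

```python
# One second of drift accumulated over 600 years, as a fractional rate.
SECONDS_PER_YEAR = 365.25 * 86400
drift_rate = 1 / (600 * SECONDS_PER_YEAR)   # ~5.3e-11, i.e. ~0.000000005%

# Spread over a single day, that rate amounts to a few microseconds.
offset_per_day = drift_rate * 86400         # = 1 / (600 * 365.25) seconds
print(f"{offset_per_day * 1e6:.2f} microseconds per day")
```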

You're off by a factor of 3600. It's "leap hours" that are being proposed; we already have leap seconds. Of course, I'm not sure the math from TFA makes much sense anyway, as I don't recall having an average of 3 or 6 leap seconds every year.

Besides, the value of units of measurement lies in their consistency. Changing the second is worse than leap years or leap seconds or leap hours, because any time someone needs a precise measurement, they turn to the second.

How about going the other way... leap microseconds. Many times during the day. Then hardly anybody will notice.

Actually it sounds like a good idea. As someone else suggested, the difference due to leap seconds is so small that only atomic clocks are precise enough to need to take them into account. And since we're all synced on atomic clocks anyways we could just make that happen transparently upstream.

One event every 10 years does not cause lots of disruption, and being a minute out of sync with solar time is not large enough to be a problem. You'd notice an hour's difference if you're in a northerly latitude and have Daylight Saving Time...

One event every 10 years does not cause lots of disruption, and being a minute out of sync with solar time is not large enough to be a problem.

Except for all the millions of cron jobs that run at a minute granularity. If the same minute occurs twice, should the job run twice? If a minute is skipped, should the job not run at all, or run a minute early, or a minute late?


This is the same problem as the witching hour every year when switching to and from daylight savings time. The remedy for that is to ensure you don't schedule jobs for those hours, or get vendor assurance of what, exactly, will happen for jobs scheduled at the start, middle or end of the witching hours.

Nope. cron, like all Unix services, runs on UTC and doesn't give a crap about daylight savings time.

Cron may think in UTC, but the crontab is in the system's local timezone.

Worse, different systems have different implementations. There's bsd, sysv and vixie's implementations, plus numerous variations, and all seem to do their own stuff.

An example: you have four boxes in the Europe/Paris time zone: one Solaris box, one AIX box, one HPUX box and one RHEL box, with daily jobs scheduled at 01:00, 01:30 and 02:00. Let's call them job1, job2 and job3. Which of the three jobs will run on each box on March 30, 2008? Which of the three jobs will run on each box on October 26, 2008? Which of the three jobs will run twice on October 26, 2008?

If anyone (except perhaps Arthur D. Olson) can answer that without investigating, I'd be very surprised.

Sometimes the vendors themselves can't say for sure, due to the time adjustment occurring in a different process, and depending on availability of interrupts and CPU time on the system, the cron interrupt may see either the old time or the new time when it wakes. One of the above vendors thus recommends that jobs scheduled for the start/end of the witching hour are moved one minute outside it.
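For comparison only, here is how Python's zoneinfo module (which is not any vendor's cron) treats a time inside that spring-forward gap; in Europe/Paris the gap runs from 02:00 CET straight to 03:00 CEST, and per PEP 495 a fold=0 time inside it gets the pre-transition offset:

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

paris = ZoneInfo("Europe/Paris")

# On 2008-03-30, Paris clocks jump from 02:00 CET to 03:00 CEST, so a
# daily job scheduled at 02:30 local time names a wall-clock instant
# that never actually occurs.
gap_time = datetime(2008, 3, 30, 2, 30, tzinfo=paris)

# With fold=0 (the default), the pre-transition offset (+01:00) applies,
# which effectively maps the phantom 02:30 to 03:30 CEST.
assert gap_time.utcoffset() == timedelta(hours=1)
print(gap_time.astimezone(ZoneInfo("UTC")))  # 2008-03-30 01:30:00+00:00
```

Actual cron implementations are free to behave completely differently, which is rather the point of the question above.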

Anyhow, the parent to your post deserves to have the "+1 Informative" stripped, because it's plain misinformation.

Some safety critical real time systems such as radar trackers need an accurate time reference to be able to work at all. They don't care about the time of day but do care a lot about each hour, minute and second being exactly the same length.

I think we need two references. One time reference which never, ever changes, and another which tracks the diurnal cycle. For the latter, leap minutes would be fine.

Sorry, no, 12:00 hours at zenith won't work even at the exact middle of each time zone, among other things because the earth doesn't move around the sun in a perfect circle, but has an elliptical orbit, and also because the earth's tilt varies (we wobble).

I live at 5 degrees east. Thus, I know that because I'm at GMT+1, the sun will be exactly in the south at 12:40 PM. If we change to the "leap hour" strategy, I'll have to remember what the offset is now, and that offset will change all the time...

I live at 5 degrees east. Thus, I know that because I'm at GMT+1, the sun will be exactly in the south at 12:40 PM...

...Except the exact time the meridian passes under the sun varies throughout the year, since the Earth's orbit isn't circular.

Don't get me wrong - I think removing the leap second is just silly but your point is rather bogus.

I suggested [sun.com] that everyone on the ITU committee should be asked to read David Ewing Duncan's book "Calendar - Humanity's Epic Struggle to Determine a True and Accurate Year." [davidewingduncan.net] Ponder the fact that it has taken thousands of years of struggles, scientific advancement and setbacks to get human time synchronized with astronomical time. Great rifts developed in societies and wars were fought over the accurate calculation of time. (Check out the Irish/Roman/Orthodox rift over the calculation of Easter). Now wi

Actually, the leap second makes the most sense to me. But a leap hour in 600 years, when we add an entire day about every four years, is absurd. If we had to abandon the leap second, it should only be replaced by the leap minute. Likely few people would notice the time being off by as much as a minute (just don't use that sextant any more, or if you do, wear two watches or set yours to heavenly time). But time being off by as much as an hour would pretty much muck things up (think of the effect of daylight saving

The leap second is required because the earth's spin is slowing down in a complex, non-linear way.

Changing the length of the second simply won't work, in a couple of hundred years we'll be right back to where we started again. See http://en.wikipedia.org/wiki/Leap_second [wikipedia.org] for details.

The leap hour is a daft idea; why change something that isn't broken, even if it's a tad inconvenient?

Anyone who has to do navigation, the driving force behind many of the improvements in timekeeping, would disagree with you. Units of measurement do not exist in a vacuum. They are invented to solve real problems.

It is broken. Leap seconds are lost moments in time, depending on the time system you use. Linux time [wikipedia.org] is a good example: every time there is a leap second, Linux time deviates further from UTC.

In this day and age, do we really have to keep lining up our time system to astronomy events, rather than realizing that time is actually linear, and so should our time system be? Over time our time system will not be perfectly synchronized to every event that happens to occur in the universe, nor should we try to force

Why have a leap hour in 600 years time? Surely it would be easier for all countries to just change their local time offset to UTC by 1 hour. So, for example, instead of Pacific time being UTC-0800/UTC-0700, it would become UTC-0700/UTC-0600. (Or maybe 0900/0800)

I don't really care what they do with leap seconds, but IMO their time would be better spent abolishing that routine-breaking, parent-killing, accident-causing abomination which is Daylight Savings Time.

The only benefits I can see are slightly later barbecues in summer and a six-monthly reminder to check the smoke detector batteries about the house.

DST is set by local governments. This is an entirely different thing, an international standards body messing around with time, instead.

BTW: I'm of the opinion that it's not DST that should be abolished, but non-DST. Non-DST time is a good mathematical division of the day, centred equally around 12:00 (±30 mins). Unfortunately, as a society, we seem to have decided to centre our actual lives around 13:00 instead. Switching permanently to DST would fix this.

Unfortunately, as a society, we seem to have decided to centre our actual lives around 13:00 instead. Switching permanently to DST would fix this.

Until our society decided to center our lives around 14:00.

For the past couple summers, I've been protesting DST by simply not changing any of my clocks. It takes a bit to get used to, but once you learn to translate times, it works out. And as someone who doesn't mind getting up earlier in the morning (though I do like to sleep in when possible), it does help y

It's actually even worse. You might think of the "9-5" workday when saying that the center is 13:00. But in reality, it's more like 15:00 (most people aren't awake for long _before_ going to work, but for a long time after)...

We could just fire off some nukes every six months or year to control the orbital speed of the earth around the sun.

Congratulations, you completely failed to understand the fundamental difference between a day and a year! A feat accomplished by few to this day!

What defines the day is the rotation speed of the Earth around itself, not the orbital speed around the Sun. Besides, as some other people pointed out, this whole leap second thing is irregular, or if you prefer, one step forward, one step back, because the speed of rotation of the Earth varies slightly.

Run computers on TAI (International Atomic Time). Keep it constantly flowing, and never add or remove seconds, as per the definition. Then simply calculate UTC in software from a published leap offset between the two, which compensates for the leap seconds:

UTC = TAI - leapseconds

Then define all the timezones off of UTC as normal. All this basically does, is make the calculations for the timezones into a few hours plus or minus a few seconds. This makes a lot more sense, because then you actually have a fundamental time (TAI) which doesn't have discontinuities, but if you want to consider your astronomical orientation, you look at UTC or your local time. We don't need to redefine these types of time, because these already exist. We just need to use them more intelligently.
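A toy version of that arrangement, with a made-up leap-second table (a real system would load the published IERS list; only the first entry here, TAI-UTC = 10 s in 1972, reflects history, and the rest is invented purely for illustration):

```python
# (TAI time at which the offset takes effect, TAI-UTC in seconds).
LEAP_TABLE = [
    (0,          10),   # table start: TAI-UTC = 10 s (as in 1972)
    (63_072_000, 11),   # hypothetical later insertion, for illustration only
]

def tai_to_utc(tai: int) -> int:
    """UTC = TAI - leapseconds, using the latest table entry at or before tai."""
    offset = 0
    for effective, tai_minus_utc in LEAP_TABLE:
        if tai >= effective:
            offset = tai_minus_utc
    return tai - offset

print(tai_to_utc(100))  # 90
```

Time zones then hang off the UTC value exactly as they do today; only the system clock itself stays discontinuity-free.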

Run computers on TAI (International Atomic Time). Keep it constantly flowing, and never add or remove seconds, as per the definition. Then simply calculate UTC in software from a published leap offset between the two, which compensates for the leap seconds:

UTC = TAI - leapseconds

Then define all the timezones off of UTC as normal.

This is basically what they do in one area I have experience in where keeping precise track of time is important: spacecraft navigation. Ephemeris Time [wikipedia.org] (not actually obsolete, as the article claims), generally referenced as the number of seconds since January 1st, 2000, 12:00:00 TT, is the "official" time that you work with when computing the positions of heavenly bodies (and spacecraft). The transformation from ET to UTC (the human-readable time) changes when leap seconds are added. When using UTC to com

Interestingly enough, that is exactly the relationship between GPS time and TAI. It's defined the other way, obviously, but GPS time does not have leap seconds, and the GPS signal includes the size of the correction needed so that receivers can display UTC time.
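Concretely: GPS time was set equal to UTC at the GPS epoch (1980-01-06), when TAI-UTC was 19 s, and like TAI it never inserts leap seconds, so TAI - GPS stays 19 s forever. A sketch of the conversions a receiver performs:

```python
TAI_MINUS_GPS = 19  # constant since the GPS epoch

def gps_to_tai(gps_seconds: float) -> float:
    return gps_seconds + TAI_MINUS_GPS

def gps_to_utc(gps_seconds: float, tai_minus_utc: int) -> float:
    # tai_minus_utc is the cumulative leap-second count; the GPS signal
    # broadcasts the equivalent GPS-UTC correction (14 s during 2008,
    # when TAI-UTC was 33 s).
    return gps_seconds + TAI_MINUS_GPS - tai_minus_utc
```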

If one little (leap) second is worth all the fuss, where's the uproar to finally rid us of the dangerous practice of needlessly, senselessly changing almost all clocks in existence (in an age where every other gadget has one) twice a year by one whole whopping hour [wikipedia.org], with all the trouble that entails?

I bet it would be a considerable challenge to find 12 watches synchronized within 30 seconds of each other. So we're worried about seconds of mismatch between sundials and the only computer on earth that isn't connected to the internet? I agree with the article. Leave UTC time alone and synchronize to GPS time instead. The rest of the world will go on being happy having their watch within a couple minutes of the "official time."

Inherently, those who want to get rid of leap seconds also want to get rid of time zones (at least they indirectly do).

Having our clocks NOT agree with astronomical time completely eliminates all the benefits of time zones.

Whether you actively think about it or not, our sense of direction is substantially driven by the combination of our clocks and the Sun. We use it as a reference all the time (why do you think it's harder to find your way in a new area when it's dark?). Even if there are no other defining features, there's still the Sun to tell us which way is North (or South), and our clocks give us a reference to roughly where the Sun should be. Subtly change someone's clock, and you'll see them having slightly more difficulty with their (otherwise good) sense of direction.

Seems to me, the only argument here is that there are a few groups who _really_ just happen to need TAI time, but they see that it's just much easier to access sources of UTC time, and so want to redefine UTC (eliminating leap seconds) so that it is monotonic, and strictly corresponds with TAI at all times. Did I miss anything?

As much as we play around with daylight savings time, more often than not local earth time and the relative position of the sun overhead don't match anyway. More importantly, it has been even longer since most people cared. The philosophical questions are now moot, the scientific and engineering questions have workarounds (no one measures anything serious in local time, they just convert to it), and all that is left is the question of whether or not we need to expend the effort to adjust our clocks every time they are just one second off from some fully imaginary standard.

The question is what do you want to do with the time of day. Should it be astronomically based? This is not a trivial question.

Many electric grids are required to be timed with accuracy of better than 10 milliseconds. Remote Telemetry Units need to record events with a time stamp that might mean something to an operations control center. The problem is what do you do with leap seconds?

The POSIX standard time epoch doesn't include leap seconds. So you're left with a terrible morass of a problem. Do you do what the NTP daemon does, slewing the clock at some known rate? The problem with that is that while events remain in sequence, the time between events is not accurate. Do you simply include a second 59th second? The problem there is that events will be recorded out of order and can't be sorted back.

And yet, many also have legal requirements to adhere to a UTC based time standard.

Ladies and Gentlemen, the problem isn't the leap-second concept. The problem is our damnable entrenched software standards. We're trying to fix this problem by creating another.

The basic idea is not to demand that the year be an integral number of days. The New Year will be "born" at varying times of the day. I clearly remember my mom cooking up the New Year's feast and then waiting patiently for the new year to be born, which would shift by about 6 hours every year. The Hindu calendar will state that the next new year, "Sowmiyan" or "Sadharanan" (there are 60 named years), will be born at 1:06 PM or 7:36 AM or whatever. A typical South Indian New Year will begin on April 14 for about three years (like 7AM, 1PM, 7PM) and on April 15 (1AM) for a year, and then the leap year in the western calendar will bring it back to April 14.

"It is inappropriate to require that a time
represented as seconds since the Epoch precisely represent the number
of seconds between the referenced time and the Epoch." - IEEE Standard 1003.1b-1993 (POSIX) Section B.2.2.2

For UT1, eliminate the concept of hours, days, etc. Time will be told by the second only. Maybe even call it something else like a "chron". You can talk about hectochrons, millichrons, kilochrons, etc. In fact, start the counting of "chrons" at January 1, 1970.

Now, if you use chrons, there is no more link between days or years, and no more leap seconds. Systems like GPS or space travel, which get thrown off by leap seconds but don't really depend on the concept of a "day" or "year", can use chrons. People who depend on astronomical time can use seconds and live with leap seconds. To each their own. And converting between the two units is really quite simple.
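A sketch of the chron idea in Python, with the caveat that `fromtimestamp` applies the POSIX convention, which itself skips leap seconds, so this is only an approximation of a true SI-second count:

```python
import datetime

def chrons_to_calendar(chrons: int) -> str:
    """Convert a bare second count since 1970-01-01 into a calendar date.
    Systems that never need days or years just keep the raw count."""
    dt = datetime.datetime.fromtimestamp(chrons, tz=datetime.timezone.utc)
    return dt.strftime("%Y-%m-%d %H:%M:%S")

print(chrons_to_calendar(0))              # 1970-01-01 00:00:00
print(chrons_to_calendar(1_000_000_000))  # 2001-09-09 01:46:40
```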

The real silliness of the whole proposal is that these scientists actually think their decision will eliminate the leap second. Astronomers will simply ignore the whole thing and go back to GMT. So will all the governments which means all the atomic clocks will still use leap seconds. UTC will simply disappear, and we're back to square one.

If leap seconds come too often, and leap hours allow the time to diverge too much, how about leap minutes? Official time doesn't deviate from solar time by much, and yet we only need one every hundred years or so.

Of course, this doesn't fix the real problem: that the Earth's rotation is gradually slowing, so any system based on a foundation with a fixed number of fixed-length seconds will always become gradually more unwieldy.

Decimalisation is over-rated anyway. People think it makes all the sums easier. Does in some ways, but for everyday life, other systems are easier. We used to have 120 pence to the pound in the UK. Much simpler when divvying up the bill at restaurants. Try dividing 100 by three people, four people, six people. Now try it with 120 or multiples thereof. But what about five and ten? Yeah - much harder with 120 (sarcasm).

Anyway, this ignoring of the leap second sounds like the usual case of wishful thinking.

There were 240 pence to the old (pre-decimalisation) pound, comprising 20 shillings each worth 12 (old) pence. Do you remember guineas, crowns, half-crowns, shillings, tanners (6-penny piece), threepenny bits, pennies, half-pennies, farthings (a quarter penny)? I do. I suspect that I am quite a bit older than you, and I cannot ever remember there being 120 pence to the pound. So either please provide a citation or confess that you are mistaken/talking bollocks. :-)
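The divisibility point is easy to check; 240 splits evenly among far more group sizes than 100 does:

```python
def even_splits(total: int, up_to: int = 12) -> list[int]:
    """Group sizes up to `up_to` among which `total` divides evenly."""
    return [n for n in range(2, up_to + 1) if total % n == 0]

print(even_splits(100))  # [2, 4, 5, 10]
print(even_splits(240))  # [2, 3, 4, 5, 6, 8, 10, 12]
```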

But the main thrust of your post was correct with regards to dividing sums of money easily. Or at least it was until the education system decided that mathematics and mental arithmetic were not the most important subjects in life. I'm not sure how some of today's young people could cope with such problems.


12 is a nice number, but I will not support it for a standard until we grow another pair of opposable thumbs.

Young people today are nothing compared to what is to come: e-ink restaurant bills that calculate the price for everyone, and even take into account whether you had 2 drinks or 3.

Give it another 50 years, and what we call basic math will be indistinguishable from magic for large parts of the population.

Among these was Levenshulme's Tina Farrel, a 23-year-old who admitted "she had left school without a maths GCSE". She explained: "On one of my cards it said I had to find temperatures lower than -8. The numbers I uncovered were -6 and -7 so I thought I had won, and so did the woman in the shop. But when she scanned the card the machine said I hadn't.

There are two people, Tina Farrel and a sales assistant that need to be darwinised.

There are two people, Tina Farrel and a sales assistant that need to be darwinised.

Personally, I think the people who judge other people fit to be "darwinised", especially based on a page-long Web article, are the ones we could do without, rather than the people whose worst known flaw is that they can't count below zero.

I count by placing my hands palm down just above a surface (very much like I'm about to start typing). Then I move my fingers up and down. If a finger is touching the surface, that's a 1 bit. If it is not, it's a 0. Lately I've been not using my thumbs, thus giving me one byte, conveniently broken into two nybbles for hex conversion.
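That scheme, simulated (the finger ordering and bit convention here are my own arbitrary choices):

```python
def fingers_to_hex(fingers: list[int]) -> tuple[int, str]:
    """fingers: 8 bits, most significant first; 1 = touching the surface.
    Eight fingers (thumbs ignored) give one byte, i.e. two hex nybbles."""
    value = 0
    for bit in fingers:
        value = (value << 1) | bit
    return value, f"{value:02X}"

# One hand's fingers down, the other's up: high nybble F, low nybble 0.
print(fingers_to_hex([1, 1, 1, 1, 0, 0, 0, 0]))  # (240, 'F0')
```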

The Chinese didn't "invent" decimal time. Phrases like "in the 1/10000th part of a chand" and words like paramchand (not an accurate transliteration; chand = second) are very common in Sanskrit texts. Add the fact that the decimal system itself was invented in India, and it follows that decimal time was "invented" in India.

Why am I using double quotes around "invented"? Because no one can invent time. As a human, you want to divide time to keep track of it, and you can only do that using the numeral system you know. Indians knew the decimal system, so they divided it into factors of 10; the Sumerians used a sexagesimal system, so they divided it into factors of 60.

It is not the division that bears any importance in invention; it is the device one can use to measure. If you don't have clocks that can measure the 1/10000th part of a second, it means nothing to write it down. The ancient Chinese are no different.

The smallest unit is the "Moment", and then the "While" (or, less used, the "Whilst"). A while is about 14.4 moments. Then you have the "long while", which is 13.8 whiles, then the "time", and "long time"...

For example, it took me a while and three moments to write this comment. I'm not a quick typer...

Is it the time from perihelion to the next perihelion? Is it the time from zenith on the shortest day to zenith on the shortest day next year? Is it the time for a star within our galaxy to be in the same position again? Is it the time for a star outside our galaxy to be in the same position again?

The earth's orbit rotates, and the solar system rotates, in a galaxy that rotates. And speculation is that the universe rotates too.

Just remember that you're standing on a planet that's evolving
And revolving at nine hundred miles an hour,
That's orbiting at nineteen miles a second, so it's reckoned,
A sun that is the source of all our power.
The sun and you and me and all the stars that we can see
Are moving at a million miles a day
In an outer spiral arm, at forty thousand miles an hour,
Of the galaxy we call the 'Milky Way'.

Our galaxy itself contains a hundred billion stars.
It's a hundred thousand light years side to side.
It bulges in the middle, sixteen thousand light years thick,
But out by us, it's just three thousand light years wide.
We're thirty thousand light years from galactic central point.
We go 'round every two hundred million years,
And our galaxy is only one of millions of billions
In this amazing and expanding universe.

The universe itself keeps on expanding and expanding
In all of the directions it can whizz
As fast as it can go, at the speed of light, you know,
Twelve million miles a minute, and that's the fastest speed there is.
So remember, when you're feeling very small and insecure,
How amazingly unlikely is your birth,
And pray that there's intelligent life somewhere up in space,
'Cause there's bugger all down here on Earth.

The real question isn't what is a year, but "what is a day". Measurements were taken of the length of the "mean solar day", which is the average time between noons, which itself varies over the course of a year due to the elliptical shape of the Earth's orbit. (Because we're closer to the Sun during the Northern Hemisphere winter, we're revolving faster but rotating at the same speed, so the time between true astronomical noons is slightly longer than in the summer.)

Leap years are to deal with correcting the length of the year, which isn't an integral number of days. Leap seconds [wikipedia.org] are to deal with the fact that the length of a day changes slowly and at a variable rate. It's not the same problem at all.

Or, why don't we just redefine the second to deal with all of this in the first place?

Because a non-constant second would make most of physics a serious pain. Basing such a fundamental unit on the ever-changing motion of a ball of rock in space seems rather silly, too.

The underlying cause isn't that we end up with a fraction of a second left over due to the Earth's rotation time not being an integer multiple of a second, but because the Earth's rotation is slowing down.