Posted
by
Soulskill
on Tuesday July 09, 2013 @04:37PM
from the about-time dept.

bmahersciwriter writes "The new type of clock, called an optical lattice clock, could replace the cesium fountain clocks used as the standard for timekeeping. Indeed, it could redefine the second. The cesium fountain is predicted to keep time within one second over 100 million years. While other atomic clocks are better than that, researchers suspect the optical lattice is better still and could one day replace the standard."

Ah now, back in the day, the IBM PeeCee (which had an 8088) kept time at a rate of 65536 ticks per hour. How this divides into a second is left as an exercise for the interested reader.
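For anyone who doesn't feel like doing the exercise by hand, here is one way to work it out. This is a sketch using the commonly cited figures for the original PC's 8253 programmable interval timer; note that "65536 ticks per hour" is itself an approximation of the actual tick rate:

```python
# The original IBM PC drove its 8253 timer chip at 1,193,182 Hz (derived
# from the 4.77 MHz system clock) and let it count down from 65536,
# firing one timer tick ("interrupt 8") per rollover.
PIT_CLOCK_HZ = 1_193_182
DIVISOR = 65536

ticks_per_second = PIT_CLOCK_HZ / DIVISOR   # ~18.2065 ticks/s
ticks_per_hour = ticks_per_second * 3600    # ~65543.4, close to 65536

print(f"{ticks_per_second:.4f} ticks/s, {ticks_per_hour:.1f} ticks/h")
```

So a second is a little over 18 ticks, and "65536 ticks per hour" is off by about 7 ticks per hour, which is part of why DOS-era clocks drifted.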

The second question for the interested reader is how on Earth anyone could achieve any sort of calculation at all on a 6502 what with it being quite the most pathetic excuse for a CPU ever devised by person-kind.

The second question for the interested reader is how on Earth anyone could achieve any sort of calculation at all on a 6502 what with it being quite the most pathetic excuse for a CPU ever devised by person-kind.

I hope you're kidding. The 6502 helped make the personal computer revolution possible. For a time, it was the lowest-priced workable 8-bit CPU available, making computing available to the masses. The original Atari game console (the 2600) was based on the 6502, as was the Apple II computer, which really launched the small computer revolution, and the 6510 (a slightly enhanced version of the chip) powered the extremely popular Commodore 64 computer. True, pricier CPUs like the Z-80 and others were a bit more capable.

My first computer was a VIC-20 (6502), and I wrote my first program on it when I was about 7 using a book my dad got me. I had the cassette tape drive, a modem (with nowhere to call), and some cartridges. Man, I thought that thing was magic. My friends had Ataris and stuff, but I could actually make my computer do stuff...

The cesium fountain is predicted to keep time within one second over 100 million years. While other atomic clocks are better than that, researchers suspect the optical lattice is better still and could one day replace the standard.

It took me a couple of extra readings, too. The tricky unstated part: while other atomic clocks are better than cesium clocks, they are not the standard.

TFA doesn't explain why trapped-ion clocks (the "better clocks" mentioned in TFS) aren't used to define the standard. Presumably, that's just the glacial pace of international standards setting, and perhaps a trapped-ion clock standard is working its way through the system but has not yet become the new standard. That's just my guess, though.

The better clocks would have to be sufficiently better to justify the change. The current cesium fountain clocks are accurate to about one second in a hundred million years. There are better ones, but it's difficult to improve much upon that, at least not enough to justify switching the standard.

Which is what I suspect is going on here. The current clock is likely to already be more accurate than the means of conveying the standard to other time keeping devices.

Cesium fountain clocks last about 20 years, some a bit longer, before they need to get parts replaced for maintenance. The standard is transferred to a backup during this time, then transferred back. Improving on the lifetime of the individual clocks would help more than improving on the accuracy of the clocks.

Behind all this accuracy is the assumption that the atomic constants, such as Planck's "constant," are truly constant, because all atomic behavior is governed by them. There is evidence that for long periods of time these so-called "constants" have drifted and still are drifting, because they are related to the size of the universe. All the equations for atomic behavior contain a time element as part of these so-called "constants." The equations for gravity, on the other hand, do not contain any terms referring to time.

There is evidence that for long periods of time these so-called "constants" have drifted and still are drifting, because they are related to the size of the universe

There is no good evidence of that and a lot of evidence to the contrary, everywhere from astrophysical data to on going work in the labs. And it is not like everyone is assuming so, people are actually checking and running experiments from such things. I've seen some of them first hand considering former colleagues of mine ended up on such a project.

The equations for gravity, on the other hand, do not contain any terms referring to time.

Then you must be about 100 years behind the times, as gravity is closely tied to the rate of time passage, both in theory and as thoroughly demonstrated by experiment.

Nothing in nature is as constant as change, especially over long periods of time. What reason can you give that the so-called "constants" of physics have remained the same over millions or even billions of years?

The history of science has shown over and over again that mainstream scientists, the vast majority of them, have often been wrong in their interpretations of experiments and observations. Modern cosmology and astrophysics is no exception to this. In 1929 Edwin Hubble measured the red shift for the first time. That measurement was puzzling, and he, as well as others, sought an explanation. The Doppler shift of sound commonly experienced was the basis of the interpretation that applied this principle to electromagnetic waves.

Why have milk and honey been associated with good and abundance for thousands of years in many cultures? Can you show me some articles based on real research that show that milk and honey are NOT healthful foods?

In nature everything always changes over time, especially over millions or even billions of years. There is some evidence that the so-called "red shift" observed from distant galaxies is caused by fundamental changes within the atom, because Planck's constant was considerably smaller at one time in the past.

That's not what I meant, but kind replies are always nice. I understand the function, but just like most of us don't spend our off hours learning string theory and pondering just how small Planck's constant is (at least, rarely), I doubt many of us keep abreast of experiments that use this type of precision or what truly interesting things will come of it, at LEAST what truly interesting things justify the seemingly inflated giddiness of the article. Hence my dissatisfaction.

Look, not for nothing, but this summary really is exceptionally poorly written. "...while other atomic clocks are better than that, researchers suspect the optical lattice is better still..."? Is this the 10th grade? "Better than that" is how we summarize a new type of clock? Why don't we just throw style out the window and write that it's very, very, very, very, very, very, very good? Additionally, can we possibly have even a quick sentence explaining why the optical lattice is "better still"?

The new (less than a decade old) optical lattice clocks (OLCs), in which 10,000 atoms of strontium-87 are trapped in (what else) an optical lattice, have been shown to be better (within 1.5x10^-16) than the current world standard cesium fountain clocks (within 3x10^-16), but haven't yet beaten the best clocks, which measure emissions from single ions trapped in an electromagnetic field (within 1x10^-17). But researchers are hopeful that OLCs will eventually emerge as the new standard, because 10,000 atoms beat 1 atom for measurement statistics and because the other two technologies measure frequencies in the microwave spectrum, while the optical lattice clock measures in the visible spectrum. Statistics and higher frequencies should eventually win out as the technology matures.
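As a sanity check, those fractional uncertainties can be converted into "one second of error every N years," which lines up with the "1 second per 100 million years" figure quoted for cesium fountains. A rough sketch, using only the numbers given above:

```python
# Convert fractional frequency uncertainty into accumulated timing error.
# A clock with fractional uncertainty f drifts by roughly f seconds per
# elapsed second, so it takes 1 / (f * seconds_per_year) years to
# accumulate one second of error.
SECONDS_PER_YEAR = 365.25 * 24 * 3600   # ~3.156e7 s

clocks = {
    "cesium fountain":    3e-16,
    "optical lattice":    1.5e-16,
    "single trapped ion": 1e-17,
}

for name, frac in clocks.items():
    years_per_second = 1 / (frac * SECONDS_PER_YEAR)
    print(f"{name}: ~1 s of error every {years_per_second:.2e} years")
```

For the cesium fountain this works out to roughly 1.06x10^8 years, i.e. about 100 million years per second, matching the summary.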

Ion clocks are also based on optical transitions.
So your new summary is incorrect.
Also, it's not really "statistics" that win out; it's the ability to probe your ensemble (either 1 ion or many atoms) with lower quantum projection noise. For example, every time your laser has excited your ion to a quantum superposition of ground and excited state, all you get back is one bit of information: 1 for the excited state, 0 for the ground state. So to discover where your laser (aka clock oscillator) is detuned with respect to the atomic transition, you need many repeated measurements.

Not all ion clocks are optical. Linear ion trap microwave clocks based on Yb and Hg were developed in the 90s. Some Hg clocks operated as part of the NASA Deep Space Network for a number of years and there's currently an active project to develop a highly miniaturised Yb clock. So the summary should say that lattice clocks can beat single ion clocks on QPN and fountains because optical beats microwave.
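The projection-noise argument above can be sketched numerically. This is a toy model assuming ideal projective measurements on uncorrelated atoms, where the standard error of the estimated excitation probability after one interrogation cycle is sqrt(p(1-p)/N):

```python
import math

def projection_noise(p: float, n_atoms: int) -> float:
    """Standard deviation of the estimated excitation fraction after one
    cycle, for n_atoms uncorrelated atoms each excited with probability p.
    (Toy model of quantum projection noise: each atom yields one bit.)"""
    return math.sqrt(p * (1 - p) / n_atoms)

# Probing at the half-excitation point (p = 0.5), where the clock signal
# is steepest and projection noise is largest:
single_ion = projection_noise(0.5, 1)       # one bit per cycle
lattice = projection_noise(0.5, 10_000)     # 10,000 bits per cycle

print(f"single ion: {single_ion}")
print(f"lattice:    {lattice}")
print(f"improvement: ~{single_ion / lattice:.0f}x per cycle")
```

So, under this idealized model, 10,000 atoms buy roughly a sqrt(10000) = 100-fold reduction in projection noise per interrogation cycle compared with a single ion.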

Between this and the WWVB anniversary it's been a good run the last few days for time nuts.

"A man with one clock knows what time it is. A man with two clocks is never sure. But I would add further: a man with three clocks is more sure than a man with two clocks." Quoted from one of the quintessential time nuts at http://www.leapsecond.com/

What about the 4-dimensional "time crystal" that not only has a perfectly repeating lattice structure in the 3 spatial dimensions, but also in the 4th, time dimension? If it truly has perfect repetition in the 4th dimension, shouldn't that be the "perfect" timepiece?