In April, one of the open source movement's first and biggest success stories, the Network Time Protocol, will reach a decision point. At 30 years old, will NTP continue as the pre-eminent time synchronization system for Mac, Windows, and Linux computers and for most networked servers?

Or will this protocol go into a decline marked by drastically slowed development, fewer bug fixes, and greater security risks for the computers that use it? The question hinges to a surprising degree on the personal finances of a 59-year-old technologist in Talent, Ore., named Harlan Stenn.

Atomic clocks work by funneling super-cooled cesium atoms down a tube and exposing them to precisely tuned radio waves. If the frequency is just right, the atoms resonate and change their energy state.
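To make the resonance idea concrete, here is a toy model (not from the article): the probability that an atom changes state peaks sharply when the probing signal matches cesium's hyperfine transition frequency of 9,192,631,770 Hz, the frequency that defines the SI second, and falls off as the signal detunes. The Lorentzian lineshape and the 50 Hz linewidth are illustrative assumptions, not the real physics of any particular clock.

```python
# Toy model of atomic resonance. The chance that a cesium atom changes
# energy state peaks sharply when the probing radio wave matches the
# hyperfine transition frequency that defines the SI second.
# The Lorentzian lineshape and 50 Hz half-width are assumptions for illustration.

F_CESIUM = 9_192_631_770.0  # Hz, the defined cesium-133 transition frequency
LINEWIDTH = 50.0            # Hz, assumed half-width of the resonance

def excitation_probability(probe_hz: float) -> float:
    """Fraction of atoms that change state when probed at probe_hz."""
    detuning = probe_hz - F_CESIUM
    return 1.0 / (1.0 + (detuning / LINEWIDTH) ** 2)

if __name__ == "__main__":
    for offset in (-200, -50, 0, 50, 200):
        p = excitation_probability(F_CESIUM + offset)
        print(f"detuning {offset:+5d} Hz -> excitation probability {p:.3f}")
```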

Here lies an apparent paradox: to get a precise frequency out, the clock needs a precisely tuned signal going in. How can more precision be obtained from less? The answer is feedback: a quartz oscillator generates the radio signal, and the fraction of atoms that change state reveals how far that signal has drifted off resonance, so the clock can continuously steer the oscillator back onto the atomic transition, as the sketch below illustrates.
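A minimal sketch of that feedback loop, under the same toy assumptions as the model above: a drifting "quartz" oscillator probes the atoms slightly below and slightly above its current frequency, and if more atoms respond on one side than the other, the oscillator is off resonance in that direction and gets nudged back. The starting offset, drift magnitude, probe spacing, and servo gain are all invented for illustration.

```python
import random

# Locking a drifting oscillator to the atomic resonance with a two-sided
# probe. All numbers below are illustrative, not real clock parameters.

F_CESIUM = 9_192_631_770.0  # Hz, defined cesium-133 transition frequency
LINEWIDTH = 50.0            # Hz, assumed resonance half-width

def excitation_probability(probe_hz: float) -> float:
    detuning = probe_hz - F_CESIUM
    return 1.0 / (1.0 + (detuning / LINEWIDTH) ** 2)

def run_servo(steps: int = 15) -> None:
    freq = F_CESIUM + 120.0            # quartz starts 120 Hz off resonance (assumed)
    gain = 50.0                        # servo gain, hand-tuned for this demo
    for step in range(steps):
        freq += random.uniform(-2, 2)  # imprecise input: the quartz drifts
        # Probe the atoms just below and just above the current frequency.
        low = excitation_probability(freq - LINEWIDTH)
        high = excitation_probability(freq + LINEWIDTH)
        # If the high-side probe excites more atoms, we are below resonance
        # (and vice versa); the difference is the error signal.
        error = high - low
        freq += gain * error           # steer the oscillator back on frequency
        print(f"step {step:2d}: offset from resonance {freq - F_CESIUM:+8.2f} Hz")

if __name__ == "__main__":
    run_servo()
```

This mirrors, in spirit, how a cesium standard disciplines a quartz local oscillator: the long-term precision comes from the atoms, while the quartz only has to hold steady over a single probe cycle.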