MIT Technology Review reports that Battelle field-tested a prototype millimeter-wave communication system earlier in 2008. The team sent a 10.6-gigabit-per-second signal between antennas 800 meters apart, and more recently the researchers demonstrated a 20-gigabit-per-second signal in the lab.

Whereas Wi-Fi and cellular networks operate on frequencies of 2.4 to 5.0 gigahertz, millimeter-wave technology exploits a region from about 60 to 100 gigahertz. Much of the millimeter region is unlicensed and open for use; it has been neglected only because of the difficulty and expense of generating a millimeter-wave signal, encoding information on it, and then decoding it at the other end. Usually, data is encoded by first generating a low-frequency wave of around 10 gigahertz, then converting it into a higher-frequency signal. The drawback is that encoding data on a 10-gigahertz signal limits the data rate to about one gigabit per second.
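The conversion step described above is classic heterodyne mixing: multiplying the low-frequency carrier by a local oscillator shifts it to the sum and difference frequencies. A minimal numerical sketch (the 90-gigahertz oscillator frequency here is an illustrative assumption, not a figure from the article):

```python
import numpy as np

# Illustrative sketch of up-conversion by mixing, not Battelle's setup.
# Frequencies are in GHz; the sample rate is chosen so that each tone
# falls exactly on an FFT bin (resolution = fs / n = 1 GHz).
fs = 1000.0   # sample rate, GHz (i.e. 1 THz, well above Nyquist)
n = 1000
t = np.arange(n) / fs

f_data = 10.0   # low-frequency carrier carrying the data, GHz
f_lo = 90.0     # assumed local-oscillator frequency, GHz

# cos(a) * cos(b) = 1/2 [cos(a+b) + cos(a-b)]:
mixed = np.cos(2 * np.pi * f_data * t) * np.cos(2 * np.pi * f_lo * t)

spectrum = np.abs(np.fft.rfft(mixed))
freqs = np.fft.rfftfreq(n, d=1 / fs)
peaks = freqs[spectrum > spectrum.max() / 2]
print(peaks)  # sum and difference components: 80 GHz and 100 GHz
```

The spectrum shows the signal's energy relocated to 80 and 100 gigahertz, which is how a cheap-to-modulate 10-gigahertz wave ends up occupying the millimeter band.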

The Battelle team bettered this by more than a factor of 10 using off-the-shelf optical telecommunications components. The researchers modulated data onto two low-frequency laser beams, then combined them. When the two beams combine, they create an interference pattern that acts as a 100-gigahertz signal. "It looks as though we have a laser beam that has a 100-gigahertz frequency," Ridgway says.
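The interference pattern is a beat note: two optical carriers whose frequencies differ by 100 gigahertz produce an intensity envelope oscillating at exactly that difference. A back-of-the-envelope sketch (the 1550-nanometer telecom wavelength is an assumption for illustration; the article does not give the lasers' wavelengths):

```python
# Beat-note arithmetic for two lasers spaced 100 GHz apart.
# The 1550 nm starting wavelength is an assumed telecom C-band value.
c = 299_792_458.0      # speed of light, m/s

lambda1 = 1550.0e-9    # assumed wavelength of laser 1, m
delta_f = 100.0e9      # target beat frequency, Hz

f1 = c / lambda1       # optical frequency of laser 1, ~193 THz
f2 = f1 - delta_f      # laser 2, detuned by 100 GHz
lambda2 = c / f2

# The required wavelength spacing is well under a nanometer:
delta_lambda = lambda2 - lambda1
print(f"laser spacing: {delta_lambda * 1e9:.3f} nm")  # roughly 0.8 nm

# The combined field's intensity oscillates at the difference frequency:
beat = abs(f1 - f2)
print(f"beat frequency: {beat / 1e9:.0f} GHz")
```

The striking part is the leverage: a sub-nanometer offset between two ordinary telecom lasers yields a 100-gigahertz carrier, with no dedicated millimeter-wave source required.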

"We want to expand our photonics business to include all communications in the range of 100 nanometers on a chip all the way up to 100 meters between systems," the HP executive added. "In the near term we want to connect boards and blades with photonic interconnects. In the long term we want to build on-chip photonic connections, which we think will break the core-to-memory bottleneck."