In this age of instant telecommunications and social media, one of the reasons why people still attend professional conferences is to run into old friends and catch up with them. I certainly experienced this at CLEO:2011 as I ran into OSA staff members, OPN contributors and past OSA presidents at various sessions and the conference reception.

It’s also gratifying to see people I’ve written about for OPN in the past and learn about their current work. For example, last year I wrote in the Scatterings column about a three-dimensional, near-infrared “invisibility cloak” created by a team at the Karlsruhe Institute of Technology (KIT) in Germany. This week at CLEO, Joachim Fischer of KIT reported that his group has pushed the technology to the edge of the visible realm by making a 3-D cloak that works for 700-nm light.

To build their near-IR cloak, the KIT researchers used a fabrication technique called direct laser writing to build a tiny “woodpile” photonic crystal. Because a visible-light cloak would require even finer detail, the team incorporated stimulated-emission-depletion (STED) fluorescence microscopy into the laser-writing fabrication process. The resulting cloak worked not just with monochromatic light from a Ti:sapphire laser, but also with a white-light source passed through a red filter.

“Seeing the cloaking action with one’s own eyes is an amazing experience,” the KIT team wrote in the CLEO proceedings. We couldn’t agree more. Their paper, with Fischer as the lead author, is now available in the “Early Posting” section of Optics Letters.

This blog can’t possibly cover everything that has been happening at CLEO this week. If you hunger for more information, point your browser to the CLEO social media hub and drink in the postings. The OSA booth at CLEO has a Legislative Action Center station where attendees can express their views about U.S. science funding issues, and you don’t have to be onsite to use that website, either. Finally, the CLEO:2011 proceedings will be published on OSA’s Optics InfoBase in the near future.

This year’s CLEO conference features such a wide array of interesting scientific findings and technological applications that it’s hard to know where to begin this blog post. So I’ll just dive right in.

The Dawn of “Nuclear Photonics”

Ever heard of “nuclear photonics”? It may sound like a bit of an oxymoron, since photonic inventions and techniques, such as laser spectroscopy, are associated with physics on the atomic level. However, if the folks at Lawrence Livermore National Laboratory (U.S.A.) have their way, super-high-energy beams with laser origins could solve some extremely practical national-security problems.

According to Livermore scientist Chris Barty, researchers at the lab are learning how to make tunable gamma-ray beams by Compton scattering of laser beams off relativistic electrons. The Livermore people call these “mono-energetic gamma rays,” or “MEGa-rays.”
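The scaling behind Barty's scheme is easy to check. For head-on inverse Compton backscatter, the scattered photon energy is roughly 4γ² times the laser photon energy, where γ is the electron Lorentz factor. A quick sketch (the 532-nm drive wavelength is an assumption for illustration, not a figure from the talk):

```python
import math

# Head-on inverse Compton backscatter: E_gamma ~ 4 * gamma^2 * E_laser
E_laser_eV = 2.33            # assumed frequency-doubled 532-nm drive laser
E_gamma_eV = 2.0e6           # target 2-MeV "MEGa-ray" photon

gamma = math.sqrt(E_gamma_eV / (4 * E_laser_eV))   # required electron Lorentz factor
E_electron_MeV = gamma * 0.511                     # electron beam energy, ~240 MeV
```

A few hundred MeV is compact-linac territory, which is part of why a truck-sized source is plausible.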

At the 2-MeV photon energy range, MEGa-ray beams would be at least 15 orders of magnitude brighter than synchrotron light, which has its maximum brightness between 10 and 100 keV. Such brilliant beams have the energy to probe not just atoms, but the nuclei within those atoms.

Nuclear resonance fluorescence (NRF) is analogous to the more familiar atomic resonance fluorescence, but it depends on the number of protons and neutrons in the nucleus, so that it can ferret out the spectral signature of isotopes. The narrowband MEGa-rays could selectively excite NRF transitions, and, with the appropriate detector, could provide precise assays of the isotopic content of, and isotopic distribution within, bulk material.

Although no NRF imaging has been done yet, simulations indicate that MEGa-rays could someday help detect highly enriched uranium in the 48 million cargo containers that enter the United States annually, Barty said.

Today, two U.S. laboratories and one in Japan have second-generation MEGa-ray sources for proof-of-principle experiments, Barty said. The next step is to miniaturize the technology – it needs to be able to fit into a truck to be practical for homeland security applications. Livermore is building a nuclear photonics lab for creating a next-generation source that combines compact X-band linac technology from the SLAC National Accelerator Laboratory with Livermore’s high-power diode-pumped lasers.

Second Plenary Session

CLEO traditionally has two plenary sessions, and the 2011 conference was no exception. While Monday night’s plenary talks told of technological applications, the Wednesday morning speakers addressed fundamental science.

Mordechai (Moti) Segev of Israel outlined the pioneering work that he and his colleagues have done in Anderson localization of light. A fellow CLEO blogger, James Van Howe, summed up his talk better than I could have done. I liked how Segev, instead of ending his speech with a list of “conclusions,” listed the possibilities for future research in his field. These open questions include localization in honeycomb lattices, localization with entangled photons, sub-wavelength localization of light and solitons in disordered media.

Likewise, Susumu Noda of Japan presented a thoroughly detailed account of photonic crystal theory and experiments as they have developed over the past 20 years. Although photonic crystals occur in nature – as in the scales on the wings of a beautiful blue butterfly – human-made crystals were still in the microwave regime in the early 1990s. Progress has indeed come very rapidly.

The weather outside the Baltimore (U.S.A.) Convention Center has been varying wildly, from warm and summery to cool and rainy. Indoors, however, the atmosphere of the CLEO:2011 conference was steadily abuzz with exciting applications of the latest photonics technologies.

Ultraviolet LEDs Can Disinfect Water

Although CLEO is primarily a laser conference, some tracks focused on other photonics technologies, such as photovoltaics and quantum computing. Following a joint symposium on semiconductor ultraviolet (UV) lasers and LEDs, a session reviewed several practical applications of UV LEDs.

One task for which these devices are particularly suited is the removal of harmful germs and other contaminants from drinking water. Gordon Knight, a research manager at Trojan Technologies (Canada), explained that UV light penetrates the cell membranes of bacteria, viruses and protozoa and permanently alters their DNA so the critters can’t reproduce and infect humans. UV rays can also break down organic contaminant molecules, as long as the molecular absorption spectrum matches the output of the UV sources.

Water treatment specialists are primarily interested in the UV-C spectrum (200 to 280 nm), in which the peak absorption spectrum of germ DNA falls, Knight said. The industry’s workhorse has been the low-pressure mercury arc lamp, which has a strong emission peak at 254 nm. However, solid-state UV sources could be more energy-efficient and could maintain their steady output for five times longer than the mercury lamps.

Although some technical challenges remain in the development of UV-C LEDs--namely, cost and the need to boost individual chip output above 5 mW--Knight is confident that these sources will provide efficient instant-on operation for future water treatment devices, both in municipal plants and perhaps even in household-sized systems.

IARPA: An Opportunity, Not a Misspelling

You’ve heard of DARPA, but what about IARPA? The Intelligence Advanced Research Projects Agency, a new branch of the U.S. government’s spy agencies, recently started searching for “high-risk, high-payoff” research programs to boost America’s intelligence-gathering efforts.

According to IARPA official Michael C. King, the agency is especially interested in significant advances in techniques to gather biometric data from distant, moving human subjects. Successful proposals require not just a good idea, but also a capable leader to guide the research project. One U.S. team followed King’s talk with a discussion of their own technique for so-called “standoff biometric identification” of people. According to Brian C. Redman of Lockheed Martin (U.S.A.), Fourier transform profilometry involves projecting fringes from an 808-nm laser onto the subject, capturing the reflected pattern, computing its two-dimensional fast Fourier transform, then performing an inverse transform and merging the result with the original data. The laser pulses are eye-safe and, with a duration of 100 microseconds, short enough to freeze motion at a brisk walking speed of 1.5 m/s. The near-infrared light can even “see” through most sunglasses, Redman said.
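Redman's pipeline is a variant of classic Fourier-transform profilometry. A minimal NumPy sketch of the textbook version, with a synthetic fringe pattern standing in for the 808-nm laser illumination (all parameters are illustrative, not Lockheed's):

```python
import numpy as np

def ftp_phase(image, f0):
    """Recover the surface-induced phase from a fringe image via
    Fourier-transform profilometry: FFT -> sideband filter -> inverse FFT."""
    n = image.shape[1]
    F = np.fft.fft(image, axis=1)                   # 1-D FFT along the fringe direction
    freqs = np.fft.fftfreq(n)
    F *= (freqs > f0 / 2) & (freqs < 3 * f0 / 2)    # keep only the +f0 sideband
    analytic = np.fft.ifft(F, axis=1)               # complex field carrying the phase
    carrier = np.exp(2j * np.pi * f0 * np.arange(n))
    return np.angle(analytic * np.conj(carrier))    # strip the carrier ramp

# Synthetic subject: a Gaussian bump deforming a projected fringe pattern
n = 256
X, Y = np.meshgrid(np.arange(n), np.arange(n))
f0 = 32 / n                                         # carrier: 32 fringe cycles per frame
bump = 2.0 * np.exp(-((X - 128)**2 + (Y - 128)**2) / (2 * 30**2))  # phase, radians
fringes = 1 + np.cos(2 * np.pi * f0 * X + bump)
recovered = ftp_phase(fringes, f0)                  # ~= bump, up to edge effects
```

Because the whole profile comes from one camera frame and one FFT, the method suits exactly the kind of single-shot, motion-frozen capture Redman described.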

Every scientific advancement has a story behind it. Telecommunications fibers and optical coherence tomography (OCT) are no different. Donald Keck and James Fujimoto--the first two CLEO:2011 plenary speakers--did a great job of telling those true tales.

Donald Keck, a retired Corning Inc. (U.S.A.) scientist who participated in the development of the first low-loss optical fiber, attributed the telecom boom to a “syzygy” of rapid-fire technological developments four decades ago. In addition to that first practical fiber, the earliest computer-network experiments, the room-temperature laser chip and the computer chip all appeared between 1969 and 1971.

Evoking the original notion of the laser as a “solution looking for a problem,” Keck drew chuckles by reminding the audience of schemes for laser cutting of trees, laser-made nipples for baby bottles and Arthur Schawlow’s “laser eraser” for typists. Early proposals for laser telecommunications--by sending light beams down 2-in.-wide coaxial cables--were not much more practical.

Fortunately, British government researchers asked Corning for help in creating glass fibers with attenuation below 20 dB/km, at a time (1966) when the best silica fiber suffered from signal loss of 1,000 dB/km. Drawing upon glass research from the 1930s to the 1950s, Keck and his Corning colleagues started tracking down and eliminating the sources of optical loss in fiber.

Their initial fiber-drawing equipment was crude--including a household vacuum cleaner--but effective. When Keck tested the first fiber with a loss of only 17 dB/km, he was so impressed that he wrote in his lab notebook, “Whoopee!” However, in 1970 an Applied Physics Letters reviewer initially rejected the Corning team’s paper because, Keck said, “it lacked believability.”

Today’s single-mode fibers fulfill Keck’s 1972 prediction of operation with losses of 0.2 dB/km or less at the 1,550-nm wavelength. Progress in telecommunications has come rapidly, especially after the 1984 court-ordered breakup of the old Bell-System AT&T, which created “a lot of fiber-hungry ‘baby Bells,’” Keck said. With the development of fiber that can bend around sharper corners without introducing losses, the industry is poised to use fiber in ways traditionally associated with copper wire.
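The decibel figures in Keck's story become vivid when converted into transmitted power fractions via P_out/P_in = 10^(-dB/10):

```python
def surviving_fraction(loss_db_per_km, km):
    """Fraction of launched power remaining after `km` kilometers of fiber."""
    return 10 ** (-loss_db_per_km * km / 10)

print(surviving_fraction(1000, 1))   # 1966-era fiber: 1e-100 -- effectively opaque
print(surviving_fraction(17, 1))     # Corning's first low-loss fiber: ~2% per km
print(surviving_fraction(0.2, 100))  # modern fiber: ~1% survives a full 100 km
```

That last line is why modern long-haul links can run between amplifier stations spaced many tens of kilometers apart.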

OCT: Joining Optics and Clinical Science

OCT is a method of imaging using echoes of light--the optical analogue of ultrasound, said James Fujimoto of the Massachusetts Institute of Technology (U.S.A.). In terms of resolution and tissue penetration, OCT bridges the gap between ultrasound and confocal microscopy.

Although Michel A. Duguay and A.T. Mattick first suggested the technique in a 1971 Applied Optics article, the first demonstration of OCT, performed on a cadaver eye, was published two decades later, according to Fujimoto. Since then, progress has come rapidly, with the technique’s extension to living tissue and the commercial development of OCT equipment for clinical use. Today, spectral domain interferometric techniques have improved both the speed and sensitivity of OCT. High-speed CCD cameras and volumetric data-rendering techniques have added to OCT’s ability to track dynamic processes such as capillary blood flow.
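OCT's axial resolution is set by the source's coherence length: for a Gaussian spectrum, δz = (2 ln 2 / π) · λ₀²/Δλ. A quick check with typical ophthalmic-OCT source numbers (the 840-nm center and 50-nm bandwidth are assumed values for illustration, not figures from the talk):

```python
import math

def oct_axial_resolution_um(center_nm, bandwidth_nm):
    """Axial resolution (in air, micrometers) of an OCT system
    with a Gaussian-spectrum source."""
    return (2 * math.log(2) / math.pi) * center_nm**2 / bandwidth_nm / 1000

print(round(oct_axial_resolution_um(840, 50), 1))   # ~6 um, between
                                                    # ultrasound and confocal microscopy
```

The inverse dependence on bandwidth is why broadband and swept sources, rather than raw laser power, drive OCT resolution.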

OCT is now moving beyond ophthalmic procedures into the world of intravascular imaging, where the technique can identify unstable arterial plaques and guide the medical treatment of those dangerous blood-flow blockers.

Fujimoto said that there has been a huge increase in intravascular OCT procedures in the last three years. The development of tiny fiber-optic catheters and the Fourier-domain mode-locked (FDML) laser have helped make this possible.

Finally, Fujimoto drew the audience’s attention to one of this CLEO’s postdeadline papers, which reports a record imaging speed for OCT using a swept single-mode vertical-cavity surface-emitting laser (VCSEL). OCT promises to be an exciting technological field to watch in the near future.

CLEO:2011, the annual laser conference organized by OSA and other scientific societies, got off to a strong start today with a full range of sessions on pure laser physics and applications of the technology. Here are a few highlights from Monday’s sessions:

Could lasers make your automobile burn gasoline more efficiently than spark plugs do? Researchers from Japan and Romania have developed all-ceramic micro-lasers that could zap the insides of car-engine cylinders with multiple sub-nanosecond pulses to ignite a lean mixture of fuel and air. The speaker for the group, Takunori Taira of Japan’s Institute for Molecular Science, declined to predict when laser-powered cars will hit the market. The team’s research has been getting a lot of press, including a mention by a New York Times blogger.

Although General Electric Corp. may be more famous for manufacturing light bulbs, appliances and jet engines, the company has made several important contributions to laser technology. In his talk on laser materials processing, Marshall G. Jones of GE Global Research (Niskayuna, N.Y., U.S.A.) recalled that Robert N. Hall of GE invented the semiconductor injection laser--forerunner of the innards of all laser printers and CD players--and Joseph P. Chernoch devised the face-pumped laser in 1972. Today, GE is more concerned with laser drilling, welding and cladding techniques for manufacturing components of locomotives and turbines. Its scientists are developing fiber lasers that meld the high stability of Nd:YAG lasers with the high efficiency and sharp focusing ability of CO2 lasers.

It’s taking a while, but eventually the European Space Agency will launch Aeolus, a wind-lidar satellite containing the first ultraviolet laser to fly in space. Dutch physicist Martin Endemann explained that the transmitter assembly for the 355-nm laser must last for 5 billion shots over 39 months in orbit without significant degradation or coating damage. His team found that the system needed to contain low levels (0.2 mbar) of oxygen in order to keep the UV optical components from darkening. The laser should be ready to be installed in the satellite next year, pending a successful extended vacuum-chamber test this fall.
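The lifetime spec implies the laser's firing rate: 5 billion shots spread over 39 months works out to roughly 50 Hz of continuous operation (a consistency check, not a figure from the talk):

```python
shots = 5e9                          # required shot lifetime
seconds = 39 * 30.44 * 86400         # 39 months, at an average 30.44 days/month
rep_rate_hz = shots / seconds
print(round(rep_rate_hz))            # ~49 Hz of nonstop firing for the full mission
```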

Two Japanese groups demonstrated Thursday night at the OFC/NFOEC 2011 postdeadline paper sessions that they can send more than 100 terabits per second (Tbps) through one hair-thin optical fiber.

Qian et al. from NEC Laboratories America reached 101.7 Tbps over standard single-mode fiber using pilot-based phase noise mitigation. The team sent 370 wavelengths each with data rates of 294 Gbps over 165 km of standard single-mode fiber to achieve the results. The team said it achieved spectral efficiency of 11 bits/s/Hz, which it considered the highest reported spectral efficiency to date for wavelength-division multiplexing transmission.

A separate team, Sakaguchi et al. from Sumitomo Electric Industries in Japan, demonstrated 109 Tbps using spatial division multiplexed signals over a seven-core fiber. The Sumitomo group sent 97 colors through each of the cores at data rates of 172 Gbps (two 86 Gbps QPSK signals). The team sent the data over 16.8 km of fiber.
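The headline numbers from both postdeadline papers check out as simple products of channel counts and line rates; the gap between raw and net capacity is consistent with the usual FEC and framing overhead (the overhead percentages are my inference, not the authors' figures):

```python
# NEC: 370 wavelengths x 294 Gbps over standard single-mode fiber
nec_raw_gbps = 370 * 294                     # = 108,780 Gbps raw line rate
nec_overhead = 1 - 101_700 / nec_raw_gbps    # ~6.5% lost to FEC and framing

# Sumitomo: 7 cores x 97 wavelengths x 172 Gbps per wavelength
sumitomo_raw_gbps = 7 * 97 * 172             # = 116,788 Gbps raw line rate
sumitomo_overhead = 1 - 109_000 / sumitomo_raw_gbps   # ~6.7%

# NEC's 11-bit/s/Hz spectral efficiency is consistent with a 25-GHz WDM grid
net_per_channel_gbps = 101_700 / 370         # ~274.9 Gbps net per wavelength
spectral_eff = net_per_channel_gbps / 25     # ~11.0 bits/s/Hz
```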

The 34 postdeadline papers came from a variety of sources, including Oracle Labs, IBM, NEC Labs America, Hewlett-Packard, ZTE, the University of Southampton, the Technical University of Denmark, the Heinrich Hertz Institute, Alcatel-Lucent Bell Labs, AT&T Labs, TE Subcom, Sumitomo Electric Industries, the University of Melbourne, Karlsruhe Institute of Technology, the University of California-San Diego, the Technical University of Berlin, Infinera, Alcatel-Lucent, the Technical University of Carolo-Wilhelmina zu Braunschweig, Nokia-Siemens Networks and NTT Photonics Labs.

C. David Chaffee (cdcfiber@aol.com) owns Chaffee Fiber Optics, a Baltimore-based firm that specializes in analyzing developments in fiber optics and publishing on the state of the industry.

Andrew Bach, keynote speaker for the Service Provider Summit at OFC/NFOEC Wednesday morning, presented the daunting challenge the New York Stock Exchange (NYSE) is facing--having to transmit millions of trades in microsecond timeframes.

Bach painted the picture of an ever-growing network demanding the newest communications technologies for the Exchange to simply keep up with the ballooning number of trades at real-time speeds. “A delay of five or six microseconds could cost several hundred thousand dollars,” he noted.

Bach made it clear that the NYSE is living in a terabit world. “Get me one terabit pipes, please,” he said. The Exchange currently uses 2.5 Tbps, and the demand for more is going up quickly.

What are the volume levels like? There are between 100,000 and 400,000 messages delivered every second on the NYSE itself. By the end of the decade, Bach said the exchange expects to grow to 10 million messages every second.

What delay does exist stems largely from the exchange’s need to store the information in its New Jersey data center, according to Bach--a record that is critical for any investigations that might come in the future.

The NYSE is “now a heavy consumer of dark fiber; we are lighting it ourselves,” Bach said. An important advance has been the Exchange's ability to operate its data center remotely--a capability that proved essential recently, when blizzards in the area kept people from being physically present at the center while the markets stayed open.

The Exchange currently runs 6,000 routers and switches, 200,000 ports at 1 Gbps or below, and 10,000 10-Gbps ports, according to Bach.

What are the challenges? “As the bandwidth keeps going up and up, the jitter has to be brought down to near zero if not zero,” according to Bach. “We have far too many switches waiting in case something breaks. If I can’t start putting in one-terabit links, it’s just not worth it.”

Likewise, latency needs to keep shrinking. It is one reason why the data are not encrypted for security purposes. As Bach pointed out, however, the data are no longer of use after a second or two anyway, as they become stale after the newest trade.

Something else on Bach's wish list? Hollow core fibers, which he believes will also speed trades.


“We are on the way to the gigabit society,” said OFC/NFOEC 2011 keynote speaker Bruno Orth Tuesday morning at the plenary session. Orth defines the gigabit society as a mobile broadband photonic network that is all IP. “The price for WDM has gone down tremendously over the past decade,” said Orth. Router performance is much better than Moore's law would estimate.

New networking models are needed to deal with the economics of fiber to the home, Orth said. “The first 20 percent of those receiving it are not the problem,” he observed. “The last 20 percent account for up to 50 percent of the networking cost. Therefore, we need a new model for FTTH infrastructure.”

A helpful exercise for service providers, one used at Deutsche Telekom, is to assume that all your customers use smart phones, that they all keep their full content in the cloud, or that they all use VOIP and roam freely, according to Orth. He also raised the growing fear that smart phones have the potential to stress or even crash the network.

“We are engaged in optics in a way we have never been before,” said Alan Gara, IBM Fellow and Blue Gene Chief Architect. “All interconnects in the new IBM supercomputers will be optical by 2018,” according to Gara. “Without optics we will not be able to continue to build systems,” he continued. “The optical boundary will continue to move in.” The only way IBM will be able to achieve its next gen supercomputing goals will be through optics, he said.

Kristin Rinne of AT&T Labs said there has been an 8,000 percent increase in mobile broadband traffic over the last four years, noting that the application behind much of the growth is video. “There is an awful lot of wireline in the wireless network,” said Rinne, who cited Dell’Oro report figures projecting that $8 billion will be spent on fiber and microwave mobile backhaul upgrades in the next five years.

The first keynote speaker at OSA's Executive Forum 2011, Basil Alwan, set an optimistic tone for both the Forum and OFC/NFOEC 2011, observing that the optical transport industry now is “really breathtaking.” He even went so far as to say “it is an honor to be associated with this industry at this time.” Alwan is president of Alcatel-Lucent’s IP Division and head of portfolio strategy for its Networks Group.

What's so exciting? Well, for one thing the size of the network continues to grow as does the number of its endpoints, Alwan told a crowded Marriott conference room Monday morning. Caching also is a high growth storage application Alwan is excited about. “We are thrilled with the progress in 100 G,” he continued, “we are really pulling the rabbit out of the hat with 100 G.”

Alwan went on to say that 400 G is “achievable and practical,” and that a terabit “will be necessary.”

Perhaps giving insights into what Alcatel-Lucent's optical research involvement will be going forward, Alwan said “it is not an option to leave anything on the table any more. We can't ignore any options. Each one may be a silver bullet.”

In the first panel following Alwan's talk, Verizon's Glenn Wellbrock agreed with Alwan's assessment of the coming need for 100 G. The popular Wellbrock, who is speaking on some six panels this week, said optical transport routes between New York and Chicago “deserve 100 G, probably several 100 G lines.”

A question that dominated the panel was how rapidly the network operator should move to IP optical using packet transport. While Google was seen as having the newer, next gen network, Wellbrock observed that any Google search or phone call is going through a Verizon or AT&T network at some point.

And Google Senior Network Architect Bikash Koley, also a member of the first panel, was deferential, observing that “I think everyone would agree that we are moving to packet-based services.”

However, Koley also noted that “the biggest challenge for us is that a lot of the optical transport equipment that has been designed we don't need.” He acknowledged that “the way to overcome this” is for manufacturers to know what Google needs.

When we caught up with him afterwards, Koley said his comments related to the larger Google core optical transport network, not the 1-Gbps-to-the-residence network the company has promised to bring to one or more communities. However, he did say Google was “surprised” by the high number of vendors that responded to the 1 Gbps to the home solicitation once it was offered.


Can you do quantum mechanics with everyday-sized objects? And could such macroscopic quantum objects help detect the most elusive predictions of Einstein’s theory of general relativity?

Yes and yes, according to Nergis Mavalvala, a physics professor at the Massachusetts Institute of Technology (U.S.A.). She and her colleagues are using lasers for cooling and trapping gram-sized and even kilogram-sized interferometer mirrors--just like optical traps for cooling atoms. In a few years, instruments operating at the standard quantum limit will start hunting for gravitational waves.

“If you could do astrophysics with a gravitational wave, it’s like turning on a new sense,” Mavalvala said. “You’ve had eyes all along, and suddenly you have ears and you turn on hearing. It’s bound to provide some very different information.”

Albert Einstein’s theory of general relativity predicted the existence of gravitational waves traveling at the speed of light. Such waves, if they exist, would stretch and squeeze spacetime transverse to the direction of propagation. The amplitude of a gravitational wave, also known as strain or h, is a dimensionless quantity defined as the change in length per length, similar to tidal forces. In other words, gravitational waves stretch and squeeze spacetime by fractional amounts proportional to the distance between two objects.

Gravitational waves, if they exist, would have frequencies of 10 kHz or less and would interact only weakly with matter. Einstein himself grew pessimistic when his own calculations showed how difficult such waves would be to detect. Two decades after his death, however, radio astronomers provided the first clue to their existence.

In 1974, University of Massachusetts (U.S.A.) researchers Russell Hulse and Joseph Taylor Jr. found a binary pulsar consisting of two neutron stars orbiting each other at 0.15 percent of light speed. Since the pulsars contain more than the mass of the sun packed within a 10-km radius, they have extremely strong gravitational fields.

Taylor and Hulse showed that their orbits are shrinking at exactly the rate that Einstein’s theory would predict for the emission of gravitational waves from the system. Further observations up to the present day continue to confirm Einstein’s theory. The binary pulsar’s energy loss is widely accepted as evidence for gravitational waves, and led to the 1993 Nobel Prize in Physics for Hulse and Taylor, Mavalvala said.

Direct detection of gravitational waves on Earth, however, is incredibly difficult because the strain on an interferometer would be on the order of 10⁻²¹. So the two interferometers of the LIGO project, located in Hanford, Wash., and Livingston, La. (U.S.A.), have 4-km-long arms.
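That strain figure translates into an almost absurd displacement requirement. With h defined as the change in length per length:

```python
h = 1e-21                    # strain amplitude a ground-based detector must resolve
arm_m = 4_000                # LIGO arm length in meters
delta_L = h * arm_m          # required displacement sensitivity: 4e-18 m
proton_diameter_m = 1.7e-15  # approximate proton diameter
print(delta_L / proton_diameter_m)   # ~0.002 -- a few thousandths of a proton width
```

Measuring arm-length changes that small is exactly why quantum noise, rather than engineering noise alone, becomes the limiting factor.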

Scientists could measure gravitational waves by measuring the phase shifts of light in an interferometer. However, external forces also push the mirrors around much more than a gravitational wave can push them around, and laser light has fluctuations in phase and amplitude, so both sources of noise must be reduced. The LIGO team designed their interferometers with optical cavities in each arm to increase the instruments’ sensitivity to mirror displacement and thus to gravitational waves.

Advanced LIGO: More sensitivity, more issues

The first-generation detectors known as Initial LIGO, which are now being removed, led to much interesting astrophysical research, but no positive detections of gravitational waves (yet). Construction of the next-generation Advanced LIGO detectors, designed to be 10 times more sensitive than their predecessors, began in October.

But with Advanced LIGO, there’s a catch: The detectors will bump up against the standard quantum limit. According to Mavalvala, the team has to get around the quantum limitations by injecting squeezed states of light, with more precise measurements of phase at the expense of knowledge about the amplitude of the light, in order to reduce the quantum shot noise limit.

To deal with the other type of quantum noise--radiation pressure--the Advanced LIGO scientists are using optomechanical coupling to trap and cool macroscopic mirrors down to very low quantum states--the way lasers trap and cool atoms.

Initially Mavalvala and colleagues tested a 1-gram mirror suspended from a pair of specially made glass fibers (designed with fewer impurities and flaws than commercial optical fibers). Starting at room temperature, they shifted the mirror’s oscillator frequency from 10 Hz in the mechanical regime to 500 Hz, and cooled it to just under 1 mK. In other words, the cooled mirror has 35,000 quanta, instead of 10⁹ quanta in its normal state.

“We are not yet in the quantum regime, a factor of 5 or 35,000 quanta is not yet quantum, but it’s getting close,” Mavalvala said.

In the most recent experiment, the team took one of the 2.7-kg mirrors from the Initial LIGO experiment, with a resonant frequency of just under 1 Hz at room temperature (and containing about 10²⁶ atoms), and shifted its resonance out to about 150 Hz and cooled it down to 1.4 μK--corresponding to only 200 quanta. (When that number equals 1, the kg-scale object will have reached its quantum ground state.)
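The quoted quanta numbers follow from the mean thermal phonon occupation of a harmonic oscillator, n ≈ k_B T / (ħω). A quick sketch using the temperatures and shifted frequencies above:

```python
import math

def mean_phonon_number(temp_k, freq_hz):
    """Mean thermal occupation n ~ k_B * T / (hbar * omega),
    valid in the classical limit k_B * T >> hbar * omega."""
    k_B, hbar = 1.380649e-23, 1.054571817e-34
    return k_B * temp_k / (hbar * 2 * math.pi * freq_hz)

n_gram = mean_phonon_number(1e-3, 500)    # ~4e4: the scale of the quoted 35,000
n_kg   = mean_phonon_number(1.4e-6, 150)  # ~200: matching the quoted figure
```

The formula also shows the experimental strategy: shifting the resonance to higher frequency and lowering the effective temperature both push n toward the ground state at n = 1.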

Advanced LIGO, slated to begin around 2014, will operate at the standard quantum limit. Scientists expect that the project will detect signals that could be gravitational waves several times per year, Mavalvala said.