The world's largest international conference on optical communications will take place March 6-10 at the Los Angeles Convention Center.

The Optical Fiber Communication Conference and Exposition/National Fiber Optic Engineers Conference (OFC/NFOEC) is the premier telecom meeting where experts from industry and academia share their results, experiences, and insights on the future of electronic and wireless communication and optical technologies. More than 10,000 attendees and an exhibit with 500 companies are expected.

CONFERENCE HIGHLIGHTS

Plenary Session keynote speakers: http://www.ofcnfoec.org/Home/Program/Plenary-Session.aspx
Special symposia: http://www.ofcnfoec.org/Home/Program/Special-Symposia.aspx
Agenda of talks and abstracts: http://www.ofcnfoec.org/Home/Program/Agenda-and-Abstracts.aspx
Workshops & panels: http://www.ofcnfoec.org/home/Program/Workshops-and-Panels.aspx
Tutorials: http://www.ofcnfoec.org/home/Program/Tutorial-Speakers.aspx

SCIENTIFIC HIGHLIGHTS

The conference features a comprehensive technical program with talks covering the latest research on all aspects of optical communication. Some of the highlights are outlined below.

The Internet, of course, is a global phenomenon. Its most ubiquitous application is not called the World Wide Web for nothing. Global reach means Internet traffic often must cross the ocean in fiber optic cables. As these cables serve more users, each requiring more bandwidth, traffic demands invariably rise. TE SubCom researcher Dmitri Foursa says that meeting these demands might be possible in a way that doesn't require pumping up line rates, or bandwidth, the usual approach.

Line rates describe the amount of data that can be pushed through a fiber optic cable per second in each channel. Today the rate for many undersea cables is 10 Gb/s. Higher rates are on the way. Some cables that span continents now have 40 Gb/s rates, while industry R&D labs have demonstrated 100 Gb/s rates and higher.

Foursa studies spectral efficiency, a subtler metric of fiber optic cable capacity. Basically, the term is a measure of how many information-carrying channels can be packed into a given slice of bandwidth. The comparison is inexact, but you'd have something like improved spectral efficiency if you could tune in 30 stations as you hit "scan" on your car radio instead of the usual 15. The reason you are usually stuck with 15 or so is simple: it becomes increasingly difficult to faithfully receive a signal as the distance between transmitter and receiver grows. A similar problem plagues transmission of signals along undersea cables.
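The arithmetic behind the term is simple to sketch. The snippet below uses illustrative numbers only (a 40 Gb/s channel on standard 50 GHz and 25 GHz grids, and a hypothetical 4,000 GHz optical band), not figures from Foursa's paper: spectral efficiency is the channel data rate divided by the channel spacing, and packing channels closer together raises the capacity of the same band.

```python
# Back-of-the-envelope spectral-efficiency arithmetic.
# All numbers are illustrative, not taken from the work described above.

def spectral_efficiency(line_rate_gbps, channel_spacing_ghz):
    """Bits per second carried per hertz of optical bandwidth."""
    return line_rate_gbps / channel_spacing_ghz  # (Gb/s) / GHz = b/s/Hz

def total_capacity_gbps(line_rate_gbps, channel_spacing_ghz, band_ghz):
    """Capacity of an optical band filled with evenly spaced channels."""
    channels = int(band_ghz // channel_spacing_ghz)
    return channels * line_rate_gbps

# A 40 Gb/s channel on a 50 GHz grid:
se = spectral_efficiency(40, 50)           # 0.8 b/s/Hz
cap = total_capacity_gbps(40, 50, 4000)    # 80 channels -> 3200 Gb/s

# Halving the channel spacing doubles both the spectral efficiency
# and the capacity of the same band -- if the receiver can still
# pick the channels apart after a transoceanic run:
se2 = spectral_efficiency(40, 25)          # 1.6 b/s/Hz
cap2 = total_capacity_gbps(40, 25, 4000)   # 160 channels -> 6400 Gb/s

print(se, cap, se2, cap2)
```

The catch, as the radio analogy suggests, is the last comment: tighter packing is exactly what gets harder to receive faithfully over long distances, which is the problem Foursa's work addresses.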

Foursa's accomplishment, described in a paper co-authored with several TE SubCom colleagues and to be presented at OFC/NFOEC, is to more than double the distance over which transmission of high spectral efficiency channels is possible at the highest commercially deployed submarine line rate, 40 Gb/s.

In terms of high spectral efficiency over transoceanic distances, previous demonstrations at 40 Gb/s fell short of those achieved at 100 Gb/s. So Foursa's work, in effect, fills in a gap in the technology roadmap for undersea fiber optic cables.

"The point is 40 Gb/s systems are being built and upgraded now, hence the need to optimize the use of bandwidth," says Seymour Shapiro, TE SubCom CTO and vice president of research and development. "Systems operating at 100 Gb/s are somewhat down the road, although new transoceanic builds will be capable of supporting 100 Gb/s when the terminal equipment becomes available."

Foursa notes that his work was mostly done to generate applied knowledge relevant to his industry and might never make it into cables strung across the ocean by companies like TE SubCom. But as is true of all new cable capacity, Foursa's innovation could give options either to meet future traffic needs or cut costs to serve existing traffic. Then there's the technical novelty of it. "This demonstration shows that we can pretty much go 18,000 kilometers at a pretty good spectral efficiency," he says, noting in his paper that such a distance is sufficient to cross the Pacific Ocean, the world's largest. "And that hasn't been achieved with any other system before."

The revolutionary online social network Facebook now has more than 500 million active users, which is roughly double the count from just two years ago. With each new member comes more status updates, photo albums, and Web links. More than 30 billion pieces of content are shared each month on the site. How all that information is stored in warehouse-sized data centers is a constantly evolving process.

To keep up with the growing "human" network on Facebook, the company is scaling up its own network of computers and other hardware. "It's like painting the Golden Gate Bridge," says Donald Lee, a Facebook network engineer. "By the time you are done upgrading all parts of the network, you have to return to where you started and begin scaling everything again." In the data centers, more bandwidth is needed between servers, but Lee says that traditional network switching hardware has fallen behind, and novel switching technologies need to mature before they can be used in a real-world data center.

What may be needed are new rules for how network routing and switching are done in a large data center. In his presentation, Lee plans to discuss the current requirements of a large data center, while laying out the role he thinks optical fiber innovations can play in future data center scale-ups. He will also provide some concrete visuals of cloud computing, in which applications like those offered on the Facebook website run remotely on a set of shared servers.

As optical networks begin accepting more traffic, they will need a more efficient way to move data from place to place. One option is to use a single dedicated optical bus, called a light-trail, which has some of the same advantages that a subway line has over multiple intersecting city streets. The "tracks" for a light-trail would be essentially permanent, so fewer resources would be needed for deciding how to route data. Recent simulations show that light-trail communication can outperform other network options for certain applications.

Many city-wide optical networks are complicated webs connecting multiple nodes together. Current data transport techniques navigate data from one node to another by setting up temporary light-paths that the optical signal can follow. These strategies have worked fine at data rates of 1 Gb/s, but the speed limit is set to increase to 10 Gb/s. The light-trail approach, conceived in 2003 for intra-city data transport, instead sets up a one-way channel that continuously connects multiple nodes. When one of these nodes seeks to send data, it is allotted a specific time slot, as well as a small section of the available bandwidth over which to broadcast. A short burst of optical data is generated by the sending node and travels over the pre-determined light-trail to all the downstream nodes, including the destination node. One advantage of a constantly open channel is that intermediate nodes can obtain a time slot and use the same light-trail without having to reconfigure the network, explains Arun Somani of Iowa State University.
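The scheduling idea can be sketched in a few lines. This is an illustration of the concept only, not Somani's actual protocol: nodes sit in order along a one-way trail, any node can send to any node downstream of it, and a sender just needs a free time slot rather than a network reconfiguration.

```python
# Minimal sketch of time-slot allocation on a one-way light-trail.
# Node indices increase downstream; a sender can reach only nodes
# with higher indices. Illustrative, not the conference protocol.

class LightTrail:
    def __init__(self, num_nodes, num_slots):
        self.num_nodes = num_nodes
        self.slots = [None] * num_slots   # slot index -> (src, dst) or None

    def request(self, src, dst):
        """Grant the first free time slot for a downstream transmission."""
        if not (0 <= src < dst < self.num_nodes):
            raise ValueError("light-trail is one-way: src must be upstream of dst")
        for i, owner in enumerate(self.slots):
            if owner is None:
                self.slots[i] = (src, dst)
                return i                  # granted slot index
        return None                       # trail is fully booked

    def release(self, slot):
        """Free a slot once its burst has been sent."""
        self.slots[slot] = None

trail = LightTrail(num_nodes=5, num_slots=3)
print(trail.request(0, 4))  # slot 0: end-to-end transmission
print(trail.request(2, 3))  # slot 1: intermediate nodes reuse the same trail
```

Note that the second request succeeds without touching the network itself; only the slot table changes, which is the saving the light-trail approach promises.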

Besides city networks, the light-trail approach may be appealing for data centers and cloud computing. To examine this broader potential, Somani and Ashwin Gumaste of the Indian Institute of Technology in Mumbai performed network simulations comparing light-trails to other management protocols. The researchers looked at efficiency, energy consumption and response time – all of which are important parameters to network operators. The results that will be presented show that light-trails performed better at high-traffic rates than all the alternatives.

Cable TV won't be "cable" for much longer. The eventual transition to all-optical-fiber networks means there will no longer be a coaxial cable running to each customer's house. But getting the full potential from the optics will require replacing the signal-producing devices. Some cable operators want to continue sending radio frequency over glass, or RFoG, as a way to upgrade to fiber while postponing a complete overhaul.

Currently, a lot of homes get their cable TV and Internet access over a hybrid fiber-coaxial (HFC) network, in which the signal travels over optical fiber from the cable company to a neighborhood node and then switches to a coaxial cable for the last mile to the house. The light transmitted through the fiber is modulated by a radio frequency (RF) signal that carries both video and Internet data. In order to convert this optical signal into an electrical signal for the coaxial cable, the HFC nodes require active components that have high energy and maintenance costs. Therefore, the push is toward passive optical networks (PONs) that can reduce overhead by extending fiber all the way to the user.

A true PON system will boost data rates by encoding the signal in optical rather than radio frequencies. But this will require cable companies to invest in new hardware and new management practices in order to generate the optical data stream. The intermediate solution, RFoG, is to continue transmitting the same RF signal on a fully fiber network. It wouldn't give all the advantages of PON, but RFoG would offer a modest increase in data rates over current HFC hook-ups. "RFoG provides cable operators a way to break into fiber to the home without having to disrupt their operational procedures," says Jim Farmer from Enablence – an optical communication company based in Ottawa.

RFoG has been tested in several field trials, but wider adoption is likely now that a standard was recently adopted. Farmer will report on these developments and how RFoG could make for an easier transition to PON in the coming years.

The Netherlands-based SURFnet is among the most advanced research and education networks in the world. The network is similar to Internet2 in the United States, which brings together select networks of universities and industrial research centers, and is a potential boon to anyone dealing with vast amounts of data and large computational problems. It's also something of a mystery to everyday users of the Internet, something that Cees de Laat aims to remedy in his OFC/NFOEC talk on eScience applications on SURFnet.

De Laat is a professor at the University of Amsterdam whose research focuses in part on SURFnet innovation. What chiefly distinguishes the Dutch SURFnet network is its so-called hybrid nature, a quality shared with other national research networks like Internet2. On a hybrid network, users can access the underlying network architecture and circuitry. For most people interacting with the Internet via a browser, these underlying resources are static and occasionally swamped by traffic, a fact that's annoyingly obvious to anyone who has dealt with choppy Web videos or dropped Skype calls.

SURFnet's hybrid architecture is already helping astronomers and filmmakers. This is thanks to two applications de Laat says might well be prototypes for addressing the inevitable increase in data intensity and increase in demand for network services.

One application he will discuss is the Software Correlator Architecture Research and Implementation for the e-VLBI (SCARIe) project. SCARIe takes observational data from telescopes mostly around Europe pointed at a similar part of the sky. This data is transported via SURFnet to a central location, where it is stitched together, or correlated, to build a high-resolution radio map of the sky.

"In earlier days, radio astronomers would put their data on hard disks or tapes and ship it to each other; this meant correlation always took weeks or months," de Laat says. "Now they can do it in almost real time."
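The correlation step de Laat describes can be illustrated with a toy example (fabricated signals, pure Python): a correlator slides one telescope's data stream against another's and finds the delay at which they best line up, and it is these delays, across many telescope pairs, that let astronomers synthesize a high-resolution map.

```python
# Toy correlator: find the relative delay between two telescopes'
# recordings of the same source. Illustrative only -- real VLBI
# correlation handles vastly larger streams across many channels.

def cross_correlate(a, b, max_lag):
    """Return the lag (in samples) by which b is delayed relative to a."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        # Overlap a[i] with b[i + lag] and sum the products.
        score = sum(a[i] * b[i + lag]
                    for i in range(len(a))
                    if 0 <= i + lag < len(b))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

signal  = [0, 1, 3, 1, 0, -1, 0, 2, 0]
delayed = [0, 0, 0, 1, 3, 1, 0, -1, 0]   # same signal, two samples later

print(cross_correlate(signal, delayed, max_lag=4))  # 2
```

Shipping disks meant this search happened weeks after observation; streaming the raw samples over SURFnet lets it happen almost as the data arrives.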

The second application is CineGrid, which aims to help those in the entertainment industry take full advantage of advances in parallel computing and photonic networking. These advances are especially important because the digital tools used to create films generate increasingly huge digital files that need to be shared around the world for tasks such as dubbing audio, adding computer generated animation, correcting color and so on.

Less than four years ago, a UCLA research team led by Daniel Solli was trying to discern the initial conditions and mechanisms behind rare, unusually steep and large waves in an optical system: a nonlinear optical fiber operating near the threshold of soliton-fission supercontinuum generation. A giant flash, far larger than expected, would suddenly appear, seemingly for no reason. Using experiments and simulations, Solli's team determined that a slight bit of random noise with just the right characteristics could set off a nonlinear chain reaction that created the super-sized solitons.

Soon thereafter Solli recalled reading about anomalously tall solitary ocean waves—up to 100 feet high—that were the stuff of sailors' lore: battering and sometimes sinking even large ships at sea. Could the phenomena that created these rare gargantuan waves in both light and ocean water be related? And if so, might optical experiments yield insights that could help mariners predict or avoid oceanic "rogue" waves?

These conjectures set off a big wave of their own – a sudden surge of international, cooperative research involving experts in such diverse fields as physics, mathematics, hydrodynamics and oceanic engineering. In his talk, one of these scientists, physicist Goery Genty of the Tampere University of Technology in Finland, details the fascinating threads of inquiry and discovery inspired by "optical rogue waves."

"One surprising finding," Genty says, "is that there appears to be a developing consensus that the initial proposal of solitons as rogue waves is probably not as valid as we thought. But we now know that the pre-soliton stage of fiber propagation corresponds very closely with the deep water environment. We can test optically hydrodynamic predictions that are difficult to assess in the natural environment."

Since 1975, the Optical Fiber Communication Conference and Exposition (OFC) has provided an annual backdrop for the optical communications field to network and share research and innovations. In 2005, OFC joined forces with the National Fiber Optic Engineers Conference (NFOEC), creating the largest and most comprehensive international event for optical communications. By combining an exposition of more than 500 companies with a unique program of peer-reviewed technical programming and special focused educational sessions, OFC/NFOEC provides an unparalleled opportunity to reach every audience, from service providers to optical equipment manufacturers and beyond.

OFC/NFOEC, www.ofcnfoec.org, is managed by the Optical Society (OSA) and co-sponsored by OSA, the Institute of Electrical and Electronics Engineers/Communications Society (IEEE/ComSoc) and the IEEE Photonics Society. Acting as a non-financial technical co-sponsor is Telcordia Technologies, Inc.
