How optical processing can solve some of the world’s most complex problems.

There’s a famous puzzle in computer science called the traveling-salesman problem, in which the goal is to find the shortest route that visits every city in a given territory.

It’s simple enough with three or four cities. But every city the salesman adds to his territory creates a new potential link to every destination already in the set. With four locations, there are only three possible round-trip routes. With five, there are 12. With 15 locations, there are more than 43 billion possible routes.

By the time you reach 30 cities, there are more possible routes than there are stars in the universe. Before long, there are so many possibilities that calculating the answer requires enormous computational resources.
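The combinatorial explosion is easy to verify for yourself. The sketch below (plain Python, unrelated to the Labs hardware) counts distinct round trips, fixing the starting city and treating a route and its reversal as the same tour, and enumerates them explicitly for a small example:

```python
from itertools import permutations
from math import factorial

def route_count(n_cities):
    """Distinct round trips through n cities: fix the starting city and
    treat a route and its reversal as the same tour, giving (n-1)!/2."""
    return factorial(n_cities - 1) // 2

def enumerate_routes(cities):
    """List every tour explicitly -- feasible only for a handful of cities."""
    first, *rest = cities
    tours = set()
    for perm in permutations(rest):
        tour = (first, *perm)
        reverse = tour[0:1] + tour[:0:-1]  # same cycle, walked backward
        tours.add(min(tour, reverse))      # canonical form for the pair
    return tours

print(route_count(5))   # 12
print(route_count(15))  # 43589145600 -- tens of billions
print(len(enumerate_routes(["A", "B", "C", "D", "E"])))  # 12
```

At 30 cities, `route_count(30)` is roughly 4.4 × 10³⁰, which is where the stars-in-the-universe comparison comes from.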

Real-world logistics planners don’t even try to find the best route. Instead, they use various algorithms to find routes that are good enough.
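One classic “good enough” strategy is the nearest-neighbor heuristic: from wherever you are, always drive to the closest city you haven’t visited yet. A minimal sketch, using hypothetical city coordinates for illustration:

```python
import math

def nearest_neighbor_tour(coords):
    """Greedy 'good enough' route: from the current city, always visit
    the closest city not yet seen. Fast, but rarely optimal."""
    start = next(iter(coords))
    tour = [start]
    unvisited = set(coords) - {start}
    while unvisited:
        here = tour[-1]
        nearest = min(unvisited,
                      key=lambda c: math.dist(coords[here], coords[c]))
        tour.append(nearest)
        unvisited.remove(nearest)
    return tour

# Hypothetical cities laid out along a line:
cities = {"A": (0, 0), "B": (1, 0), "C": (3, 0), "D": (6, 0)}
print(nearest_neighbor_tour(cities))  # ['A', 'B', 'C', 'D']
```

The heuristic runs in polynomial time where exact enumeration is factorial, which is the trade-off logistics planners accept.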

Ray Beausoleil wants to solve the traveling-salesman problem. And he wants to solve it quickly. Over the last decade, Beausoleil, Senior Fellow at Hewlett Packard Labs, has delved into quantum mechanics and carved computer chips from diamonds in an attempt to do so.

Part of the socket-alignment system for The Machine's photonics (Photo: Michael Blumenfeld)

“It’s a question of how long you can wait for your answer, and how much computing power it’s going to cost you.”

Ray Beausoleil, Hewlett Packard Labs

Beausoleil is convinced that problems like the traveling salesman conundrum won’t be solved with more powerful processors or better algorithms, but rather with a new approach. “I want to discover an entirely new way to compute,” he says.

Beausoleil, tall with a baritone voice and a graying goatee, is developing new computing technologies based on photonics, the science of light waves. He helped invent the first optical mouse in 2001, then spent more than a decade researching ways to build a quantum computer before deciding that harnessing light was a better way forward.

Three years ago, Beausoleil and his team at Labs started designing circuits that can process information using photons (light) instead of electrical charges. Much of the world’s data already travels in the form of light, over fiber-optic cables. Those light signals must be converted into electrical signals in order for computers to process them. Computing with light could save a lot of time and energy.

The components necessary to build an optical computing system don’t yet exist in a commercially viable form. Working as part of a Defense Advanced Research Projects Agency program called MESO, Labs is developing optical circuitry and testing new materials like gallium arsenide in place of today’s preferred semiconductor, silicon.

“We’re trying to solve a broader set of problems and we have some mind-boggling demos of things that we can do in the lab just by controlling light,” says Beausoleil. “The hard part is, we’re sort of building this field from scratch.”

The benefits of photonic computing extend well beyond mimicking with light what computers do today with electricity. Photons can do things electrons can’t. Beausoleil and his team are trying to harness the differences to solve problems like the traveling salesman, regarded by computer scientists as “NP-hard,” an insider term that effectively means very, very difficult to solve. (NP-hardness, where NP stands for nondeterministic polynomial time, describes, among other things, problems for which the time required to find a solution is believed to grow faster than any polynomial in the size of the input.)

“There are thousands of problems of this nature, in many different areas of life,” says Kenneth W. Regan, an associate professor at the University at Buffalo in New York who researches computational complexity. A system that could quickly find accurate answers to NP-hard problems, he says, could be applied to medical diagnostics, genomics, fault detection in complex mechanical systems, analysis of social networks, fraud detection, and much more.

Starting small

The essential building block of modern computing is the transistor, a tiny switch activated by electricity. A transistor has two possible states, on or off, which form the basis for the binary system of 1s and 0s that computers use to process and store data. A circuit is a set of transistors that work together to accomplish a task. One type of circuit, a “logic gate,” produces an output that depends on the combination of its inputs, the elementary operation of computation.

The Labs team’s first challenge was to create a set of components that could compute using light waves instead of electricity.

One result is an optical version of a component called an AND gate, which performs a simple computation based on two or more signals. The signals come together into a single input. Based on how the signals interact with one another, a tiny component called a micro-ring resonator determines whether their combined input warrants producing an output signal.

This interaction, called interference, is similar to the way waves cross in water. “If you drop two rocks into a pond, based on how the ripples move and cross each other, you can calculate not only where the rocks were dropped, but also when they were dropped relative to each other,” Beausoleil says.
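The gate behavior this enables can be illustrated with a toy numerical model (an assumption-laden sketch, not the Labs device): treat each input as a wave, superpose them, and let a simple intensity threshold stand in for the micro-ring resonator’s decision. Two in-phase “on” signals interfere constructively and clear the threshold; either one alone does not, which is exactly an AND gate:

```python
import math

def combined_intensity(a1, a2, phase):
    """Intensity of two superposed waves with amplitudes a1 and a2 and a
    relative phase in radians: |a1 + a2 * e^(i*phase)|^2."""
    re = a1 + a2 * math.cos(phase)
    im = a2 * math.sin(phase)
    return re * re + im * im

def optical_and(in1, in2, threshold=2.0):
    """Toy AND gate: each logical input is a wave of amplitude 1 (on) or
    0 (off), combined in phase. Only two 'on' inputs interfere
    constructively enough (intensity 4.0) to clear the threshold."""
    return combined_intensity(float(in1), float(in2), 0.0) > threshold

print(optical_and(1, 1))  # True
print(optical_and(1, 0))  # False
# Out-of-phase waves cancel -- destructive interference:
print(round(combined_intensity(1.0, 1.0, math.pi), 6))  # 0.0
```

The threshold value and amplitudes here are arbitrary choices for the sketch; the point is that the logic emerges from how the waves add, not from switching electrons.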

Using interference, an optical-processing circuit could use the interaction between two signals to calculate information about the source of each signal. For example, Beausoleil says, the interference between optical signals from two or more shop floor machines could indicate that one of the machines is not operating correctly.

Quantum leaps

Beausoleil’s initial efforts to solve the traveling-salesman problem focused on exploiting the mind-bending properties of quantum mechanics to build a quantum computer.

In brief, subatomic particles behave in strange ways. They exhibit a property called entanglement, in which two particles can affect one another’s behavior even if they aren’t physically connected. Particles also exhibit coherence, which allows them to interact with each other in a way similar to how waves interact.

In theory, a quantum computer could harness these properties to execute multiple computations simultaneously. “Quantum computing promises exponentially greater computation power for a range of tasks,” says Jeremy O’Brien, director of the Centre for Quantum Photonics at the University of Bristol in England.

Beausoleil spent about a decade working on quantum computing, during which he experimented with chips made from diamond. His team found it too difficult to produce enough diamond chips with identical qualities, making it impossible to manufacture them in a repeatable and predictable way. This particular approach to building a quantum computer wouldn’t scale.

Beausoleil concluded that photonic technology offered a faster and more practical route to success than building an actual quantum computer. “When somebody does manage to create a true quantum system with entanglement, that’s going to be awesome,” Beausoleil says. “Right now, we’re leaving that one on the table and just trying to take advantage of coherence.”

“Just writing down the quantum state of a molecule becomes exponentially more difficult as the molecule grows in size,” he says. “Using a quantum computer, you could simulate molecules and calculate their qualities in minutes, which would take years on conventional computers today.”

Practical science

One lesson of Beausoleil’s quantum research is that something that works in theory is of no use if it can’t also be built. So while his current team is discovering new ways to compute with light, its focus is squarely on creating real systems that can be manufactured.

The idea of “photonic circuits” has been around since the 1970s, but no one has yet figured out how to produce them at scale. By late 2014, the Labs team had created schematic diagrams of the primary logic gates and other components they wanted to build. They’d also written various software algorithms to help account for variations that normally occur in fabricating new components.

“You know that working with current fabrication tools, you won’t get exactly what you’ve designed, so you have to use tuning algorithms to make sure that the optical component does what you want it to do,” says Labs researcher Thomas Van Vaerenbergh.

So far, they’ve succeeded in building an optical circuit with about 1,000 components, says Dave Kielpinski, a former senior researcher at Labs. Getting to this point involved a lot of trial and error.

The team started by fabricating its most basic components, such as AND, NOT, and OR logic gates. It then moved on to more complex elements.

Optical computing is an emerging field with experimental components, and the challenge is to raise the yield of working components to a commercially viable level. The Labs approach is to fabricate a round of devices, test them to identify flaws, then determine the necessary changes to the chip design or manufacturing method.

Eventually, the researchers borrowed a method from electronics manufacturing to reduce the defect rate. Called overprovisioning, it builds duplicate components in a single circuit in case one or more versions fail during operation. “Where you might only need one unit to do the job, you include six of them,” Van Vaerenbergh says.
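Assuming independent failures, the payoff of overprovisioning is easy to estimate: the chance that at least one of k duplicates works is 1 - (1 - p)^k. A back-of-envelope sketch (the 60 percent single-unit yield is an illustrative assumption, not a Labs figure):

```python
def working_probability(p_single, duplicates):
    """Chance that at least one of `duplicates` independent copies of a
    component works, if each copy works with probability p_single."""
    return 1.0 - (1.0 - p_single) ** duplicates

# With a hypothetical 60 percent single-unit yield, including six
# copies (as in the example above) makes total failure rare:
print(round(working_probability(0.60, 1), 4))  # 0.6
print(round(working_probability(0.60, 6), 4))  # 0.9959
```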

The next challenge is to build hundreds or thousands of working circuits. “That means the sweating, bleeding, cursing of working with external fabrication companies to make these devices reliable enough, with a high enough yield” of viable product, Beausoleil says.

Return of the traveling salesman

Labs researchers are also exploring new applications for their optical circuits. For example, they’ve designed a system called an energy minimization computer that changes the state of an optical circuit to find the configuration that consumes the least amount of energy. This concept applies perfectly to solving NP-hard problems like the traveling salesman.

A conventional computer can guarantee the best route for an NP-hard optimization problem only by checking the combinations one after another; it can’t certify an answer as optimal until it has effectively checked every possibility. The usual workaround is to break the problem into components that can be computed at the same time, and to calculate approximate answers rather than exact ones.

Even using that method, finding a solution “is very computationally expensive,” Beausoleil says. “It’s a question of how long you can wait for your answer, and how much computing power it’s going to cost you.”

The energy minimization computer takes a different approach. The first step is mapping the problem onto the physical system—with a node for each city, for example. When the system is turned on, light runs through all the nodes at once, and every node works at the same time to find the most efficient connection path.

“You’ll see the system flicker through different possibilities and then quickly settle on the best answer it can,” Beausoleil says. “Because we can’t fabricate the system perfectly yet, the imperfections prevent us from finding the absolute perfect answer. But our goal is to get a better answer than we could with an ordinary computer.”
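A sequential software cousin of this settling behavior is simulated annealing, which also “flickers” through candidate configurations before settling on a low-energy one; here, tour length plays the role of energy. This is only a conceptual analogue of the optical system, not a description of how it works internally:

```python
import math, random

def tour_length(tour, coords):
    """Total length of a closed tour through the given coordinates."""
    return sum(math.dist(coords[tour[i]], coords[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def anneal_tsp(coords, steps=20000, t0=1.0, seed=0):
    """Simulated annealing: random 2-opt moves are accepted when they
    shorten the tour, or occasionally even when they lengthen it, with
    a probability that shrinks as the 'temperature' cools toward zero."""
    rng = random.Random(seed)
    tour = list(coords)
    rng.shuffle(tour)
    energy = tour_length(tour, coords)
    for step in range(steps):
        t = t0 * (1.0 - step / steps) + 1e-9  # linear cooling schedule
        i, j = sorted(rng.sample(range(len(tour)), 2))
        # 2-opt move: reverse the segment between positions i and j.
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
        e = tour_length(cand, coords)
        if e < energy or rng.random() < math.exp((energy - e) / t):
            tour, energy = cand, e
    return tour, energy
```

On eight cities placed around a circle, the annealer should settle on the perimeter tour, the known optimum for points in convex position; like the optical system, it trades a guarantee of the perfect answer for a good one found quickly.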

This technology won’t replace general-purpose electronic computers, because lots of problems aren’t NP-hard problems. But a photonic system-on-a-chip could be used as an accelerator running alongside a CPU in a conventional computer. Emerging computing platforms like The Machine, a next-generation system under development at Hewlett Packard Labs, offer even more intriguing possibilities. The Machine will hold huge amounts of data in memory and allow users to plug in different processors as needed, depending on the type of computation they want to perform.

“Our goal is to get a better answer than we could with an ordinary computer.”

Ray Beausoleil, Hewlett Packard Labs

Vision quest

Further research may uncover other uses for light-based computing. For example, optical circuits may be useful for preprocessing visual data. Human vision demonstrates this concept. We aggregate two slightly different optical signals—one from each eye—adjust for the phase difference, and identify information in a busy visual field to which the brain should pay attention.

“Optical circuits might be able to do the same thing in computing, reducing the number of pixels you actually need to process in your electronics system,” Kielpinski says.

Optical circuits could also handle information routing, saving energy and simplifying the path data takes to reach its destination. This would allow organizations to re-architect their data centers in a way that gets rid of switches.

Beausoleil envisions computing systems that incorporate multiple types of problem-solving systems. Each task would be handled by the appropriate tool.

“You will end up classifying each problem in order to determine which of the physical properties of the underlying computational tools are interesting,” he says. “The powerful thing about this research is that it could lead to optical accelerators that work nicely within both the classical-computing paradigm of today, and the potential of quantum computing tomorrow.”