Quantum Computers Appear – and Are Put to Work

Universal quantum computers are likely a decade or more away, but today’s advances are bringing them steadily closer.

You can’t yet buy them online or at a big-box store, but the day of quantum computers has drawn closer, thanks to recent events. Many of these advances involve the production, storage and manipulation of particles of light. That makes sense, given that photons are often the carriers of choice for quantum information.

Progress in quantum computing and associated technology has been so substantial that the field is drawing significant commercial attention. In March, the founders of smartphone maker BlackBerry announced a $100 million private equity fund to commercialize quantum technology breakthroughs. Defense contractor Lockheed Martin of Bethesda, Md., is making quantum computing a key part of its business, using the technology to tackle tasks too tough to solve with standard computers.

Not every company is ready to bet on this unconventional approach, but “it’s a new way of doing computing that is completely different from today’s high performance architecture,” said Brad Pietras, vice president of technology at Lockheed Martin.

Lockheed Martin plans to use a recently upgraded 512-quantum-bit, or qubit, computer from D-Wave Systems of Burnaby, British Columbia, Canada, to solve a vexing problem: As software grows increasingly complex, it becomes ever harder to fully verify. For today’s traditional computers, the time required to step through all the possible software states goes up exponentially with increasing complexity.

In contrast, a quantum computer can crank through the problem with only a linear increase in time. A binary bit is either a 1 or a 0. A qubit, on the other hand, is a superposition of two states, allowing it to be a 1 and a 0 at the same time. As a result, a quantum computer can be thought of as testing all possible problem solutions simultaneously, cutting the solution time compared with that required by a conventional computer. This difference grows as the problem becomes more complex. Besides software testing, such time savings could also benefit engineering, drug discovery, machine learning, network analysis and other areas.
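The exponential blow-up in classical verification is easy to see in miniature. A hypothetical sketch in plain Python (an illustration, not Lockheed Martin’s actual tooling): exhaustively checking a system whose state is n binary flags means visiting 2^n states, so each added flag doubles the work.

```python
# Illustrative only: exhaustive state-space enumeration for a system
# of n binary flags. Each added flag doubles the number of states a
# classical checker must visit.
def classical_states_to_check(n_bits: int) -> int:
    return 2 ** n_bits

for n in (10, 20, 40, 80):
    print(n, classical_states_to_check(n))
```

At 80 flags the count already exceeds 10^24 states, which is why stepping through every possibility quickly becomes hopeless for a conventional machine.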

The probability distribution of balls moving randomly through a series of pegs is easy for a classical computer to compute (left). The same is not true for identical photons propagating through a network of waveguides (right). Boson sampling, a form of specialized quantum computing, solves this problem, proving the power of quantum calculation.
“[Quantum computing] really is an enabler for innovation, down the road,” Pietras said.

The D-Wave device doesn’t use any photonics. Instead, it employs superconducting integrated circuits, with loops of superconducting metal storing qubits as circulating current. The qubits are excited into a state representing the problem and then allowed to settle into a ground state, thereby revealing the solution. This is a special-purpose machine designed to solve particular optimization problems and is not a universal quantum computer.

Another example of a special-purpose, or intermediate, quantum computer was announced late in 2012 by two research groups in papers that appeared in Science (Broome et al., doi: 10.1126/science.1231440; Spring et al., doi: 10.1126/science.1231692). Both groups built boson-sampling machines in which photons were injected into crisscrossing waveguides. Due to quantum effects, the photons, which are bosons, clump up at the output in a way that is very difficult for a classical computer to calculate.
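The “clumping” has a compact mathematical description: the amplitude for a given output pattern is the permanent of a submatrix of the network’s transfer matrix, a quantity with no known efficient classical algorithm as the photon number grows. A minimal sketch of the simplest case, two photons meeting at a 50/50 beam splitter (the Hong-Ou-Mandel effect); the brute-force permanent helper is our own illustration, not code from either paper:

```python
import itertools
import math

def permanent(m):
    """Brute-force matrix permanent (like a determinant, but with no sign flips)."""
    n = len(m)
    return sum(
        math.prod(m[i][p[i]] for i in range(n))
        for p in itertools.permutations(range(n))
    )

# Transfer matrix of a 50/50 beam splitter.
s = 1 / math.sqrt(2)
u = [[s, s],
     [s, -s]]

# Two photons enter, one in each input mode; the amplitude for one
# photon exiting each output port is the permanent of the matrix.
amp = permanent(u)
print(abs(amp) ** 2)  # coincidence probability: 0; the photons always bunch
```

Even this two-photon toy case shows the quantum behavior a classical random-ball picture misses, and the permanent’s cost explodes combinatorially as more photons and modes are added.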

Laser light highlights a portion of the crisscrossing waveguide network used in boson sampling. During an experiment, single photons are sent through rather than a beam.
In experiments, researchers sent three or four photons into the waveguide network and detected where they appeared. The photon counts matched calculations done on a standard computer, thus proving the soundness of the concept. This is an important finding, because while it’s possible to calculate the outcome for a few photons, this check cannot be done for 20 or 30 (or so) photons going into a network with hundreds of nodes.

Conducting the 20- to 30-photon boson-sampling experiment will prove that a quantum computer really can outperform a standard computer. However, it won’t be easy to get there, said Ian Walmsley, a professor of physics at the University of Oxford, who headed one of the research groups comprising investigators from Oxford, the University of London, the University of Southampton and Shanghai Jiao Tong University in China.

Science paper lead author Matthew Broome of the University of Queensland at work on a boson-sampling device.
Boson sampling requires a supply of single photons in identical quantum states, a low-loss way to route them and very efficient detectors. For the detectors, the probability of registering an n-photon event scales as the detector efficiency raised to the nth power – a requirement that dictates just how good the detectors must be.

“Trying to get to 20 photons, the efficiency better be darn close to 1; otherwise, you’re going to wait an awfully long time to even get an event,” Walmsley said.
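Walmsley’s warning is straightforward arithmetic: with the n-photon detection probability scaling as the detector efficiency raised to the nth power, small inefficiencies compound quickly at n = 20. A quick illustration (the efficiency values are ours, chosen for the example):

```python
# The n-photon detection probability scales as efficiency ** n, so at
# 20 photons even small per-detector inefficiencies compound sharply.
def n_photon_detection_prob(efficiency: float, n_photons: int) -> float:
    return efficiency ** n_photons

for eta in (0.90, 0.99, 0.999):
    print(f"eta={eta}: {n_photon_detection_prob(eta, 20):.3f}")
```

A 90-percent-efficient detector registers the full 20-photon event only about 12 percent of the time, while 99.9 percent efficiency keeps the event rate near 98 percent, which is why the efficiency “better be darn close to 1.”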

The Oxford-led group generates its photons in pairs, using one to signal that another is ready to be used; this works, but there’s no control over exactly when the process happens. One solution would be the development of quantum memories. These devices would store and release photons on demand without altering their quantum states. It’ll be about a decade before the challenges are overcome and the 20- to 30-photon experiment is done, Walmsley predicted.

Andrew White, a professor of physics at the University of Queensland in Brisbane, Australia, led the first of the two groups listed above. In addition to researchers from Australia, the team included members from MIT in Cambridge, Mass.

One of the barriers to scaling up the boson-sampling experiment is the lack of a suitable photon source, White said. One possible solution would be to use quantum dots, which could be engineered for a high probability of photon production. But no matter the method, the required photon generation rate might be achieved by multiplexing sources, with each charged with creating only some fraction of the total.
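The multiplexing idea rests on simple probability: if each heralded source fires with probability p in a given clock cycle, then k sources feeding a switch deliver at least one photon with probability 1 − (1 − p)^k. A sketch with illustrative numbers (p and k are our assumptions, not figures from White):

```python
# Illustrative multiplexing arithmetic: each heralded source fires
# with probability p per clock cycle; with k sources behind a switch,
# the chance at least one delivers a photon is 1 - (1 - p)**k.
def multiplexed_success(p: float, k: int) -> float:
    return 1 - (1 - p) ** k

print(multiplexed_success(0.05, 1))   # single source: ~0.05
print(multiplexed_success(0.05, 60))  # 60 sources: ~0.95
```

No single source needs to be good; enough mediocre sources behind a low-loss switch can approach an on-demand supply.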

In the quantum optics lab, researchers investigate the creation, manipulation and detection of single and entangled photon sources, work that could lead to a quantum communicator and computer.
Achieving the 20- to 30-photon boson-sampling threshold will not only prove the power of quantum computing, but could also yield other benefits. There are various intermediate quantum computing models; some use photons, while others exploit atomic spin or another physical phenomenon. These models are intended to solve particular, and different, computational problems. Despite these differences, there’s a strong feeling among complexity theorists that the various models are somehow interrelated.

“That suggests they have a deep connection. So making progress in any one thing – like boson sampling – may throw light on all the others,” White said.

Getting photons where they are needed poses its own problems, but progress is being made. Bulk optics offers a low-loss way to manipulate photons, while integrated photonic circuits allow the same manipulation in a small volume. Advances in silicon photonics components promise structures small enough for losses to be acceptable.

Specialized optics equipment used to prepare, manipulate and send single photons for quantum key distribution, a specialized form of quantum information transmission.
It helps that photonic quantum computers may be able to tolerate relatively high internal losses. Studies have indicated that the loss of as much as a third of the photons entering a controlling gate would still allow a universal quantum computer constructed of such components to be scaled indefinitely.

One of the challenges facing researchers is the nature of photons. On one hand, they tend not to interact with each other or with matter. That property allows them to carry quantum information over long distances. It’s why they’re used for quantum communications, one version of which is quantum key distribution. Commercial products already offer this, with increased communications security as the selling point.

However, sometimes quantum schemes call for photons to do more than simply pass through a medium or by each other. In quantum memory, for instance, information has to be transferred from and to a photon, which means the photon cannot simply ignore the medium. In computing, qubits need to be able to talk to each other in different ways, as when one qubit takes a particular action or assumes a given configuration depending upon the state of another.

Progress is being made on getting photons to interact on command. For instance, scientists at NIST and the University of Maryland reported successfully storing optical images in a cloud of rubidium atoms (Clark et al., New Journal of Physics, doi: 10.1088/1367-2630/15/3/035005). Studying how the atoms respond to manipulation could help provide the information needed to store and retrieve quantum data, said team member Paul Lett, a physicist, in discussing the results.

A very simple version of quantum memory could be useful for both computing and communications, said Raymond Laflamme, director of the Institute for Quantum Computing at the University of Waterloo in Canada. One drawback of quantum communications is that it can span only tens of kilometers when optical fiber carries the photons. Quantum memory could overcome this limit by allowing the quantum information from an incoming encoded photon to be stored, retrieved and then sent on its way again. A string of such repeaters could span continental distances.
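The tens-of-kilometers limit follows from fiber attenuation. Assuming a typical telecom-fiber loss of about 0.2 dB/km (a standard figure, not one given in the article), the fraction of photons surviving a direct link falls off exponentially with distance:

```python
# Fraction of photons surviving a direct fiber link. The 0.2 dB/km
# loss figure is a typical telecom value assumed for illustration.
def fiber_transmission(length_km: float, loss_db_per_km: float = 0.2) -> float:
    return 10 ** (-loss_db_per_km * length_km / 10)

print(fiber_transmission(50))   # ~0.1: one photon in ten survives 50 km
print(fiber_transmission(500))  # ~1e-10: essentially none survive 500 km
```

Since single photons cannot be amplified like classical signals, the only way past the exponential falloff is to break the link into short hops joined by quantum memories acting as repeaters.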

In speaking of the emerging quantum technology, Laflamme draws a parallel with the evolution of transistors, which depend upon quantum effects to work. When semiconductor devices first appeared around the middle of the 20th century, they were not practical, not reliable and not all that useful. However, hard work overcame a host of issues, with life- and society-altering effects.

“It has been quite a ride for the last 50 years, if we think about where transistors have gone,” Laflamme said. “The same thing may happen for the field of quantum information.”

Quantum dots: Also known as QDs. Nanocrystals of semiconductor materials that fluoresce when excited by external light sources, primarily in narrow visible and near-infrared regions; they are commonly used as alternatives to organic dyes.