The Dawn of Scalable Quantum Computers

How close are we to the quantum computational revolution? Quantum computers promise drastic speedups for tackling the most complex mathematical problems. Nonetheless, current precursors of quantum computers cannot be scaled efficiently to reasonably sized systems. Now, researchers have realized a new setup that can be scaled more easily than ever before.

The heart of the quantum computer. This is where the ions are physically stored and processed, surrounded by lasers, electronics and vacuum systems. Tiny trap segments located at the end of this bar confine and control the ions. Quantum information processing and cooling are done by shining laser beams onto the ions. Credit: J. Jost/NIST.

Imagine an engineer having to work with materials that are constantly changing: iron morphing into wax, wood dissolving into water, or cement crumbling into ash. Luckily, such problems are rare in classical physics, making life relatively easy for classical engineers. In quantum physics, however, things are different: even the slightest uncontrolled interaction can turn quantum objects into classical ones. As a consequence, it is extremely difficult to build large setups that exploit the quantum nature of matter, such as scalable quantum computers. Now, researchers led by David Wineland at the National Institute of Standards and Technology (NIST) in Boulder (Colorado, USA) have realized a scalable setup for quantum computation.

Classical computers have increased their computational power enormously over the past decades, so why are scientists interested in quantum computers? On the one hand, advances in microelectronics have depended largely on continued miniaturization, and engineers are now approaching the fundamental quantum-physical limit beyond which classical technology cannot be miniaturized further. On the other hand, classical computers are intrinsically sequential: they execute a list of instructions one after the other. Their limits become evident with tasks that do not naturally lend themselves to a sequential solution, such as factoring large numbers into primes, sorting long lists, or simulating complex systems. Today's high-performance computers therefore use multiple processors that share the workload, like mechanics building different parts of an apparatus. Even this parallel approach, however, remains substantially sequential and can only cope with a few small subsystems at any given time.
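To make the cost of a sequential approach concrete, here is a small illustrative sketch (my example, not from the article): classical trial-division factoring must test candidate divisors one after another, and the number of candidates grows steeply with the size of the number. Factoring is exactly the kind of task for which quantum algorithms promise dramatic speedups.

```python
def trial_division(n: int) -> list[int]:
    """Return the prime factors of n by testing divisors sequentially."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:  # divide out each prime factor as it is found
            factors.append(d)
            n //= d
        d += 1
    if n > 1:              # whatever remains is itself prime
        factors.append(n)
    return factors

print(trial_division(15))   # [3, 5]
```

For an n-bit number, the loop may test on the order of 2^(n/2) candidates, which is why doubling the number's length makes classical factoring drastically harder.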

Quantum computation schematic. (1) Qubits are prepared or read out individually, in spatially separate zones. (2) Two qubits are brought together in order to perform a two-qubit operation. (3,4) Qubits can be transported to and from other trap segments, allowing for the experimental realization of complex quantum algorithms involving several interacting qubits. Credit: NIST.

Quantum computers would be able to perform intrinsically parallel, collective algorithms. All the parts of the quantum system can be directly connected and made to respond simultaneously, thereby potentially solving complex problems far faster. Being so deeply connected, however, makes the system extremely delicate: any uncontrolled interaction may break the quantum parallelism needed for quantum computation. When a quantum particle loses its quantumness, a process called decoherence, it does not usually disappear; it simply turns itself, and possibly part of the system it is connected to, into a classical object that can no longer be used for quantum computation.

Today, most setups for quantum computation use about 5-10 qubits (quantum bits). This is not yet enough to outperform classical computers. For ground-breaking results, far more qubits, between 50 and several thousand, are expected to be necessary. It is therefore important to find ways to scale a quantum computing device up to a large number of qubits. Scalability here refers to how easily the setup can be extended to a larger number of qubits. At present, miniaturization, that is, scaling down the physical size, is not a major concern, since current error rates are still too high to perform large quantum computations reliably anyway.

Quantum algorithms can be implemented with operations that act on at most two qubits at a time. Therefore, given enough qubits and sufficiently small error rates, addressing one or at most two qubits sequentially is enough to implement intrinsically parallel quantum algorithms; nature itself takes care of the rest.
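As an illustrative sketch (my example, not the NIST setup), a circuit built only from single- and two-qubit gates can still produce the deeply connected states described above. Here a Hadamard gate on one qubit followed by a controlled-NOT entangles two qubits into a Bell state:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # single-qubit Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],                 # two-qubit controlled-NOT:
                 [0, 1, 0, 0],                 # flips the second qubit
                 [0, 0, 0, 1],                 # when the first qubit is |1>
                 [0, 0, 1, 0]])

state = np.array([1.0, 0.0, 0.0, 0.0])         # both qubits start in |0>
state = np.kron(H, I) @ state                  # Hadamard on the first qubit
state = CNOT @ state                           # entangle the pair

print(np.round(state, 3))                      # [0.707 0.    0.    0.707]
```

The resulting amplitudes describe the entangled state (|00> + |11>)/sqrt(2): measuring either qubit instantly fixes the other, even though only one- and two-qubit operations were ever applied.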

The most complex physical, mathematical and engineering problems could thus be tackled in a completely new way, possibly revolutionizing science and technology. One consequence, for example, would be that we would no longer be able to trust current cryptographic protocols used for credit cards and internet security. At the same time, much safer technologies would be implemented exploiting related quantum information technologies, such as quantum key distribution.

There are currently many different approaches to quantum computation. Physically, qubits can be realized using, for example, neutral atoms, ions or even superconducting materials. The NIST setup, in particular, uses an array of radio-frequency traps, each able to hold a small number of ions. The setup is divided into regions for storing qubits, regions for performing quantum operations and regions for transporting the qubits. The big problem with this approach is the heating of the ions during transport: hot ions are likely to emit photons, thereby altering their internal state and thus corrupting the qubit.

NIST researchers have found a way to suppress the adverse effects of heating during transport, thereby securing the scalability of their experiment. "Our trick was to use two types of ions in our experiments," Jonathan Home explains, "one for carrying the quantum information and one as a cooling agent." Home performed the NIST experiment and points out how important the controlled interplay between the ions is. "Beryllium ions have a favorable atomic level structure for storing our quantum information, which renders them essentially insensitive to remaining external magnetic fields," Home continues, "and magnesium can be cooled efficiently without disturbing the beryllium ions." Beryllium-magnesium pairs were used for the experiments. During transport the ions heated up; once at the target location, however, they could be cooled back down, using dedicated lasers, to the low temperatures necessary for performing quantum operations. Achieving this with single qubits as well as with pairs constitutes the proof that all the operations necessary for a quantum computer can be performed within this same, scalable setup.

"Today, quantum computers are probably at the stage at which conventional computers were in the first half of the 20th century," says Renato Renner, head of the Quantum Information Theory group at the Swiss Federal Institute of Technology in Zurich (ETH). Ion-trap technology, according to Renner, is a very promising route to quantum computers, even though he expects further technological and scientific breakthroughs, similar to the transistor replacing vacuum tubes in classical computation, to be necessary before real quantum computers are built. "Nonetheless," Renner insists, "being able to perform all the relevant steps, including transport, in the same scalable experiment is a great achievement." The usefulness of current setups, both Home and Renner agree, lies in exploring quantum physics. "Even quantum simulators, the precursors of general-purpose quantum computers, might soon be useful for studying open problems in quantum physics, for example," Renner adds. "Quantum information," Home concludes, "has been a very fast-paced topic over the last couple of years, and it is simply extremely fascinating to see how we advance in our concepts in mathematics, our understanding of physics, and our possibilities in engineering. It is truly exciting to see how all of this evolves!"