NIST demonstrates method for reducing errors in quantum computing

By William Jackson

May 01, 2009

A team of researchers working at the National Institute of Standards and Technology in Boulder, Colo., has demonstrated the effectiveness of using microwave pulses to suppress errors in quantum bits, or qubits, the medium for carrying and manipulating data in the still-experimental field of quantum computing.

The dynamical decoupling technique using microwave pulses they tested is not new, said John Bollinger, lead scientist on the project.

“It’s something we borrowed from the [magnetic resonance imaging] community that was developed in the ’50s and ’60s,” Bollinger said. “Our work is a validation of an idea that has been out there.”

But the experiments also advanced the theories, said Michael J. Biercuk, a NIST researcher who took part in the work. By using new pulse sequences, researchers demonstrated that the number of errors introduced into quantum computing through environmental noise could be reduced by an order of magnitude. This means the expected error rate can be brought well below the threshold required for fault-tolerant quantum computing.

The ability to suppress errors before they accumulate is important because qubits are subject to errors introduced by stray electromagnetic “noise” in the environment. To date, there is no practical way to correct these qubit errors.

Quantum computing uses subatomic particles rather than binary bits to carry and manipulate information. While a traditional bit is either on or off, a 1 or a 0, a qubit can exist in both states simultaneously. Once harnessed, this superposition of states should let quantum computers extract patterns from possible outputs of huge computations without performing all of them, allowing them to crack complex problems not solvable by traditional binary computers.
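The idea of superposition can be made concrete with a small sketch. As an illustration only (not part of the NIST work), the snippet below models a single qubit as a pair of complex amplitudes and applies a Hadamard gate, the standard operation for putting a definite 0 state into an equal superposition:

```python
import math

# A qubit state is a pair of complex amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1; measuring it yields 0 with probability
# |alpha|^2 and 1 with probability |beta|^2.
def hadamard(state):
    """Apply a Hadamard gate, which maps |0> to an equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """Measurement probabilities for outcomes 0 and 1."""
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

zero = (1 + 0j, 0 + 0j)       # the classical-like |0> state
plus = hadamard(zero)          # equal superposition of |0> and |1>
p0, p1 = probabilities(plus)   # each outcome now has probability 0.5
```

Unlike a classical bit, the state holds both amplitudes at once until it is measured, which is what quantum algorithms exploit.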

The researchers used an array of about 1,000 ultracold beryllium ions held in a magnetic field as the qubits. Sequences of microwave pulses were used to reverse changes introduced into the quantum states. The pulses in effect decouple the qubits from electromagnetic noise in the environment.
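The reversal mechanism can be seen in a toy model. In this simplified sketch (my illustration, not the researchers' code), a constant frequency offset stands in for slow environmental noise; a single refocusing pulse at the midpoint flips the sign of subsequent phase accumulation, so the second half of the evolution cancels the first:

```python
# Toy model of refocusing: phase accrues from a static frequency
# offset, and each pi pulse negates the sign of further accrual.
def accrued_phase(offset, tau, pulse_times):
    """Total phase from a constant offset over [0, tau], with each
    pulse in pulse_times flipping the sign of subsequent accrual."""
    phase, sign, last = 0.0, 1.0, 0.0
    for t in sorted(pulse_times):
        phase += sign * offset * (t - last)
        sign, last = -sign, t
    phase += sign * offset * (tau - last)
    return phase

no_pulse = accrued_phase(2.0, 1.0, [])   # offset dephases the qubit
echo = accrued_phase(2.0, 1.0, [0.5])    # midpoint pulse cancels it
```

Real noise fluctuates rather than staying constant, which is why sequences of many pulses are needed, but the cancellation principle is the same.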

Work on using the technique for suppressing quantum errors began a decade ago, Biercuk said. “Our work validated essentially all of the work” that had been done up to this point. It also introduced new ideas by moving the pulses relative to each other in the patterns, rather than increasing the number of pulses. The results showed an unexpectedly high rate of error suppression.
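"Moving the pulses relative to each other" refers to changing the timing pattern within a fixed interval. As a hedged sketch of this idea, the snippet below compares evenly spaced pulses (the classic CPMG pattern from magnetic resonance) with the Uhrig timing rule, one of the optimized spacings studied in this line of research; whether these exact sequences match the NIST experiments is an assumption here:

```python
import math

def cpmg_times(n, total):
    """CPMG: n pulses evenly spaced over the interval (0, total)."""
    return [total * (j - 0.5) / n for j in range(1, n + 1)]

def udd_times(n, total):
    """Uhrig spacing: the same n pulses, repositioned according to
    t_j = total * sin^2(j * pi / (2n + 2)), which crowds them
    toward the ends of the interval."""
    return [total * math.sin(j * math.pi / (2 * n + 2)) ** 2
            for j in range(1, n + 1)]

c = cpmg_times(4, 1.0)  # evenly spaced: 0.125, 0.375, 0.625, 0.875
u = udd_times(4, 1.0)   # same pulse count, shifted timing pattern
```

The point the researchers make is that the gain comes from the timing pattern, not from adding more pulses: both sequences above use the same number of pulses over the same interval.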

The novel pulse sequences are tailored to the specific noise environment. The effective sequences can be found quickly through an experimental feedback technique and were shown to significantly outperform other sequences. The researchers tested these sequences under realistic noise conditions for different qubit technologies, making their results broadly applicable.

Announcement of the work comes a little more than a month after other NIST researchers showed that a promising technique for correcting quantum errors would not work. The technique, called transversal encoded quantum gates, seemed simple at first. “But after substantial effort, no one was able to find a quantum code to do that,” said information theorist Bryan Eastin. “We were able to show that a way doesn’t exist.”

The transversal operations used by Eastin were a “specific case” of error correction, Biercuk said, and the work does not mean that error correction cannot be done in quantum computers. Effective techniques for suppressing errors would mean that any error correction method would also be more effective, since there would be fewer errors to deal with.

But practical quantum computing is still some years away. Biercuk said that quantum computing has already been demonstrated with arrays of several coupled qubits. “That is wonderful from an experimental point of view, but it is not useful,” he said.

A quantum computer useful for doing complex simulations would require an array of about 100 qubits, he said. “That’s at least a decade away.” A computer capable of doing cryptographic factoring on a scale that cannot be done effectively by traditional computers still is 20 to 30 years off, he said.