Scientists raise quantum error threshold

New design allows up to a quarter of qubits to be lost.

Researchers have devised a theoretical quantum computer that could function even if one in four quantum bits (qubits) were missing.

With scientists struggling to build devices as large as three qubits, the new method could bring future applications closer by lowering the engineering requirements of a functional machine.

University of Queensland physicist Thomas Stace worked with Sean Barrett of Imperial College London to address two quantum information issues: decoherence and loss.

The former concerned errors creeping into the information carried by qubits; the latter, the loss of qubits themselves.

Stace explained that quantum computers that used photons - particles of light - as qubits risked losing some of these particles as they were scattered or absorbed.

Some researchers had devised methods that could tolerate the loss of one in two qubits; other schemes allowed for decoherence in roughly one in a hundred qubits.

But until now, none tolerated both loss and decoherence to any great degree. Stace said the next most tolerant method, by Queensland physicists Michael Nielsen, Christopher Dawson and Henry Hasselgrove, handled 0.1 percent loss and 0.01 percent decoherence.

Stace and Barrett's method, detailed in this week's Physical Review Letters, was based on the work of the University of British Columbia's Robert Raussendorf.

While traditional machines manipulated bits sequentially, using a series of logic gates, Stace and colleagues suggested that quantum computations be performed by measuring qubits initially laid out in a complex pattern.
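The measurement-based idea can be illustrated with a toy simulation (a minimal sketch of the general technique, not Stace and Barrett's actual scheme): entangling an input qubit with a fresh qubit prepared in the |+> state, then measuring the input, pushes a Hadamard-rotated copy of its state onto the fresh qubit, with the measurement outcome telling us which known correction applies. Chains of such measurements, rather than sequential logic gates, drive the computation.

```python
import numpy as np

# Single-qubit gates and basis states
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard
X = np.array([[0, 1], [1, 0]])                 # Pauli-X correction
CZ = np.diag([1.0, 1, 1, -1])                  # controlled-Z entangler
plus = np.array([1, 1]) / np.sqrt(2)
minus = np.array([1, -1]) / np.sqrt(2)

def mbqc_step(psi, outcome):
    """One elementary measurement-based step:
    entangle `psi` with a |+> ancilla via CZ, then measure the
    input qubit in the X basis with the given outcome (0 or 1).
    Returns the (normalised) state left on the ancilla qubit."""
    state = CZ @ np.kron(psi, plus)            # entangle input with ancilla
    basis = plus if outcome == 0 else minus    # X-basis measurement result
    out = basis @ state.reshape(2, 2)          # project out the input qubit
    return out / np.linalg.norm(out)

psi = np.array([0.6, 0.8])                     # arbitrary input qubit
for m in (0, 1):
    out = mbqc_step(psi, m)
    # theory predicts the ancilla holds X^m H |psi>
    expected = np.linalg.matrix_power(X, m) @ H @ psi
    print(m, abs(np.vdot(expected, out)))      # overlap is 1 for both outcomes
```

For either measurement outcome, the surviving qubit ends up in a Hadamard-rotated version of the input, up to an X correction determined by the outcome; repeating the step with adjusted measurement angles on a larger entangled lattice is what makes the pattern-of-measurements approach universal.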
