Posted
by
samzenpus
on Wednesday August 31, 2011 @03:45PM
from the top-of-the-class dept.

An anonymous reader writes "Thanks to advances in experimental design, physicists at the National Institute of Standards and Technology have achieved a record-low probability of error in quantum information processing with a single quantum bit (qubit) — the first published error rate small enough to meet theoretical requirements for building viable quantum computers. 'One error per 10,000 logic operations is a commonly agreed upon target for a low enough error rate to use error correction protocols in a quantum computer,' said Kenton Brown, who led the project as a NIST postdoctoral researcher. 'It is generally accepted that if error rates are above that, you will introduce more errors in your correction operations than you are able to correct. We've been able to show that we have good enough control over our single-qubit operations that our probability of error is 1 per 50,000 logic operations.'"

Considering it was a joke, I don't think it was that big of a deal that there was a "mistake." In fact, people might not have gotten the joke if the author had written it like you did... since it would have been kind of ruined. =/

Most early computing errors were caused by memory (not RAM, as early technologies weren't random access). The shift from mercury delay lines to magnetic cores brought a several-orders-of-magnitude drop in error rates, and with it a corresponding increase in the viability of general-purpose computing.

I think a lot of us have forgotten how bad old computers were with hardware errors, and how much the environment could affect them. My old Amstrad CPC1512 used to crash programs or put odd input on the screen whenever I turned on the fan.

You don't need error rates quite that low, though. The key is that if the error rate is low enough, you can then use clever error-correcting mechanisms. But if the error rate from stray particles and other issues that repeatedly destroy quantum entanglement is too high, then you can't use clever algorithms to deal with these problems. If you have an error rate below about 1 in every 10,000 operations, though, you can use the good stuff. Note that by the entire nature of quantum computers even if

Er, what? 1 error in 9 billion bits is around ten errors per second. A computer with such an error rate is worthless unless you build your algorithms specifically to handle it. Only if you can prove the errors are completely random can you put three chips next to each other and vote.
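The three-chips-and-vote idea is just triple modular redundancy, and a quick sketch (my own illustration, nothing from TFA) shows why it only pays off when errors are independent and already reasonably rare; the vote fails only when two or more copies err at once:

```python
import random

random.seed(1)

def majority_vote_error_rate(p, trials=200_000):
    """Empirical probability that a majority vote over three
    independent copies is wrong, when each copy errs with probability p."""
    wrong = 0
    for _ in range(trials):
        errors = sum(random.random() < p for _ in range(3))
        if errors >= 2:          # two or more copies wrong: the vote fails
            wrong += 1
    return wrong / trials

p = 0.01
p_logical = majority_vote_error_rate(p)
print(p_logical)   # close to the analytic 3p^2(1-p) + p^3, about 3e-4
```

Since the logical error rate scales like p^2, redundancy helps enormously at p = 0.01 but would make things worse if p were near 0.5, which is the classical analogue of the threshold argument above.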

It isn't at all clear that D-Wave's system is using any sort of quantum entanglement at all. D-Wave has a long history of massive hype. See e.g. http://www.scottaaronson.com/blog/?p=431 [scottaaronson.com]. It isn't at all clear that D-Wave's commercial system can do any of the things we expect a quantum computer to do, like, say, factor integers using Shor's algorithm http://en.wikipedia.org/wiki/Shor's_algorithm [wikipedia.org]. It seems that D-Wave has made a fast computer, but there's very little evidence that it is actually using quantum entanglement at all.

An important thing to recognize is that most of this experiment was done with a single qubit. Practical quantum computing will need this sort of error rate across thousands of qubits. The good news is that the methodology looks very promising: they used microwave beams rather than lasers to manipulate the ions. I think this has been suggested before, but it may be the first successful use of the technique. As TFA discusses, it drastically reduces the error rate as well as the rate of stray ions.

We are starting to move toward the point where quantum computers may be practical, but we're still a long way off. In the first few years of the last decade, a few different groups successfully factored 15 as 3*5 using a quantum computer. (15 is the smallest number that is non-trivial to factor on a quantum computer, since the fast factoring algorithm for quantum computers, Shor's algorithm, requires an odd composite number that is not a perfect power. It is easy to factor a perfect kth power by looking instead at its kth root, and factoring an even number is easily reduced to factoring an odd number. So 15 is the smallest interesting case where the quantum aspects of the process matter.) Those systems used a classical NMR setup http://en.wikipedia.org/wiki/Nuclear_magnetic_resonance_(NMR)_quantum_computing [wikipedia.org], which has since come to be seen as too limited. There are now a lot of ideas for other approaches that should scale better, but so far they haven't been that successful.
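The reduction Shor's algorithm relies on can be sketched classically; in this toy Python version (my own, with the quantum order-finding step replaced by brute force) knowing the multiplicative order of a mod N hands you the factors via two gcds, which is exactly what those factor-15 experiments demonstrated:

```python
from math import gcd

def factor_via_order(N, a):
    """Classical sketch of the reduction in Shor's algorithm: find
    the order r of a mod N (the step a quantum computer does fast),
    then extract factors of N from a^(r/2) via gcds."""
    # brute-force the order: smallest r > 0 with a^r = 1 (mod N)
    r = 1
    while pow(a, r, N) != 1:
        r += 1
    if r % 2 == 1:
        return None                  # need an even order; retry with new a
    x = pow(a, r // 2, N)
    if x == N - 1:
        return None                  # trivial square root; retry with new a
    return gcd(x - 1, N), gcd(x + 1, N)

print(factor_via_order(15, 7))   # (3, 5): the order of 7 mod 15 is 4
```

The brute-force loop is exponential in the size of N, which is the whole point: the quantum Fourier transform is what makes the order-finding step fast.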

One important thing to realize is that quantum computers will not magically solve everything. They can do a few things quite quickly, such as factor large numbers. But they can't, for example, solve NP-complete problems, to the best of our knowledge, and it is widely believed that NP-complete problems cannot be solved in polynomial time with a quantum computer. That is, it is believed that BQP is a proper subset of NP. Unfortunately, right now we can't even show that BQP is a subset of NP, let alone a proper subset. Factoring big integers is useful mainly to a small number of number theorists and a large number of other people who want to break cryptography. There are a few other cryptographic systems that can also be broken more easily by a quantum computer, but there's not that much else. However, that is changing, and people are getting a better and better understanding of what can be done with quantum computers. A lot of the work has involved clever uses of quantum computers to quickly calculate things related to Fourier transforms. Moreover, once we get even the most marginally useful quantum computers, there will be a lot more incentive to figure out what sorts of practical things can be done with them.
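To make the Fourier-transform remark concrete, here is a small numpy sketch (my own illustration) of the quantum Fourier transform written out as a unitary matrix; applied to a state with period r, it concentrates all the probability on multiples of N/r, which is the pattern Shor's algorithm reads off:

```python
import numpy as np

def qft_matrix(n):
    """The quantum Fourier transform on n qubits as an explicit
    2^n x 2^n unitary matrix: F[j,k] = exp(2*pi*i*j*k/N) / sqrt(N)."""
    N = 2 ** n
    j, k = np.meshgrid(np.arange(N), np.arange(N))
    return np.exp(2j * np.pi * j * k / N) / np.sqrt(N)

F = qft_matrix(3)                                  # 8x8
assert np.allclose(F @ F.conj().T, np.eye(8))      # check unitarity

# a state with period r = 4 over N = 8 amplitudes
v = np.zeros(8)
v[0] = v[4] = 1 / np.sqrt(2)
probs = np.abs(F @ v) ** 2
print(np.round(probs, 3))   # nonzero only at multiples of N/r = 2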

So the upshot is that these are still a long way off, but they are coming. Given how things looked in the late 1990s and early 2000s, it was reasonable to think the technical difficulties would keep them from ever being practical. They are still a long way from practical, but right now it doesn't look like there are any fundamental physical barriers, and in the long run the problems that do exist look solvable.

Assuming it's even possible to place a QC die in the same CPU package. If the hardware needed to manipulate a quantum chip is complex enough, it may come as a PCIe card at best. Otherwise, it could take the form of a completely separate break-out box on a Thunderbolt interface.

Chances are pretty good that your quantum computer will be running at liquid-helium temperatures, maybe 4 kelvin or so. Your general-purpose CPU won't. There have been projects to run CPUs at liquid-nitrogen temperatures, and even that tends to run into mechanical difficulties; you're probably not going to be running your overclocked Xeon down at 4 K.

Also, the quantum computer isn't likely to be something you're pumping a lot of data through - you're more likely to set it up, have it magically give you

An important thing to recognize is that most of this experiment was done with a single qubit. Practical quantum computing will need to have this sort of error rate for thousands of qubits.

I'd take any quantum computer with 50 qubits and get a Nobel prize for beating the shit out of all current supercomputers at simulating quantum systems like high-temperature superconductors, quark bound states such as protons and neutrons, or quantum magnets. Also, keep in mind that Rainer Blatt's group recently succeeded in demonstrating entanglement between 14 qubits [arxiv.org] in a similar setup. And for quantum simulations, the error rates probably don't have to be crazily low anyway; it turns out that such errors typ

You are correct; ion-trap systems are in principle scalable, but the task is very challenging (probably much harder than scaling, say, a superconductor-based system). But it is much more than this. The operations needed to manipulate a single qubit are significantly different from the operations needed to make two qubits interact. (There is no need to directly interact three or more qubits, since such gates can be built up out of two-qubit operations, much as the two-bit AND and one-bit NOT are universal for classical circuits.)
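That composition of two-qubit gates into bigger ones can be checked directly in numpy. As a standard textbook illustration (nothing ion-trap-specific), three alternating CNOTs compose into a SWAP gate:

```python
import numpy as np

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]])    # Pauli-X (quantum NOT)
P0 = np.diag([1, 0])              # projector |0><0|
P1 = np.diag([0, 1])              # projector |1><1|

# CNOTs on two qubits in the |q0 q1> basis:
CNOT01 = np.kron(P0, I2) + np.kron(P1, X)   # control q0, target q1
CNOT10 = np.kron(I2, P0) + np.kron(X, P1)   # control q1, target q0

SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]])

# three alternating CNOTs exchange the two qubits
assert np.allclose(CNOT01 @ CNOT10 @ CNOT01, SWAP)
print("SWAP = CNOT.CNOT.CNOT verified")
```

The same kron-and-multiply bookkeeping is how any multi-qubit gate gets decomposed into the two-qubit operations the hardware actually provides.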

Am I the only one who has difficulty thinking of quantum computers as things that actually exist and do calculations? It's like my brain has placed "quantum computer" firmly into the category "things that are theoretically possible but unable to be built with current technology", and refuses to change it, even to "things that exist in the lab but won't be commercially viable for decades outside classified government work".

The problem is that people are not generally aware of what a quantum computer would be useful for. Why should I care if there is a quantum computer sitting under my desk? How do I benefit from quantum algorithms?

There are indeed tangible benefits to quantum computing, beyond just attacking public key cryptosystems. As an example, quantum computers can speed up certain search algorithms, which is one of the promised commercial applications of a quantum computer.
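The search speedup referred to is Grover's algorithm. A tiny statevector toy of my own (for N = 4, the special case where a single Grover iteration succeeds with certainty) shows the oracle-then-diffusion pattern:

```python
import numpy as np

def grover_2qubit(marked):
    """One Grover iteration over N = 4 items: a classical search
    needs 2-3 guesses on average; here one oracle query suffices."""
    N = 4
    state = np.full(N, 1 / np.sqrt(N))   # uniform superposition
    # oracle: flip the sign of the marked item's amplitude
    state[marked] *= -1
    # diffusion: reflect every amplitude about the mean
    state = 2 * state.mean() - state
    return np.abs(state) ** 2             # measurement probabilities

print(grover_2qubit(2))   # all probability lands on index 2
```

For general N the speedup is quadratic, roughly sqrt(N) oracle queries instead of N/2, which is useful but far short of the exponential speedup factoring enjoys.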

That's because as of now it still is in the "theoretically possible but unable to be built" category. Keep in mind this new technique is for one (1) qubit. You need more (at least 8? Or do quantum computers work that differently from normal ones?) to do anything practical. And it only meets the theoretical requirement for using error correction. Previously, operations weren't accurate enough that you could count on the error correction itself being performed right. Making a quantum computer, even in the lab, is a few years off yet.

"Existing but not able to do anything practical" is still a pretty big difference from "Could exist but nobody's built one yet". It's like the early airplanes - they existed, but they weren't exactly useful for anything besides proving that heavier-than-air aircraft could work.

What we really need is a "standardized" open-source quantum computing language so that we can develop and exchange quantum algorithms to prepare for the day when quantum computers are real.

Right now we have the QCL [tuwien.ac.at] language, QCF [sourceforge.net] for Matlab/Octave, and the Cove framework [youtube.com] that could be used with any language, but it looks like there is really only a C# implementation right now.

None of these have really taken hold as a "standard" though, and probably elements of all of them could be brought together in something
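For flavor, here is a hypothetical sketch of my own (not taken from QCL, QCF, or Cove) of the kind of minimal, language-neutral gate-list representation such a common standard might boil down to:

```python
from dataclasses import dataclass, field

@dataclass
class Circuit:
    """A hypothetical minimal circuit format: just an ordered list of
    (gate name, qubit indices) pairs that any backend could consume."""
    n_qubits: int
    gates: list = field(default_factory=list)

    def h(self, q):
        self.gates.append(("H", (q,)))
        return self

    def cnot(self, control, target):
        self.gates.append(("CNOT", (control, target)))
        return self

# a Bell-state circuit, the "hello world" of quantum programming
bell = Circuit(2).h(0).cnot(0, 1)
print(bell.gates)   # [('H', (0,)), ('CNOT', (0, 1))]
```

Anything this simple obviously punts on measurement, classical control, and error correction, which is exactly the sort of thing a real standardization effort would have to hash out.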