If you think testing a chip with a gazillion transistors is a challenge, try testing a handful of qubits in the quantum computing world. Confirming all the possible states of just eight qubits takes four billion or so measurements.
The problem of characterization, as it is known, is the target of a technique developed by a team …
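The "four billion" figure can be sanity-checked with a back-of-envelope sketch: full state tomography of an N-qubit register has to pin down on the order of 4^N − 1 real parameters, and each measurement setting must be repeated many times to beat shot noise. The 60,000-shots-per-setting figure below is an assumption chosen for illustration, not a number from the article:

```python
def tomography_cost(n_qubits, shots_per_setting):
    """Rough measurement count for full state tomography:
    an n-qubit density matrix has 4**n - 1 independent real
    parameters, and each measurement setting needs many
    repeated shots to average out statistical noise."""
    settings = 4 ** n_qubits - 1
    return settings * shots_per_setting

# 8 qubits at an assumed ~60,000 shots per setting lands in
# the low billions, matching the scale quoted in the article.
total = tomography_cost(8, 60_000)
print(f"{total:,} measurements")  # 3,932,100,000 measurements
```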

as the good doctor Fedrizzi and his hench-boffins are apparently taking questions...

I think you're asking a question I've been wondering about ever since this quantum crap started, namely that they were promising too much to be credible. Quantum computation apparently grows (from what I've read) non-linearly as qbits are added [*], yet accuracy is always the problem. I've long wondered if there's a fundamental link between computation and accuracy in that there's an upper limit to one which, if exceeded, starts to eat into the second. But no-one's ever raised this point that I've seen. Anyone here know any better?

[*] for a given unit of computing power, you can expand this by a cube power in 3 dimensions, i.e. a box 1 foot per side holds 1 unit of computation; double the box in each direction & you get a box 2 feet per side holding 8 units of computation. You can't do better in 3D space, but quantum promises much higher scaling, so there's a conflict.
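The footnote's conflict can be made concrete: classical capacity packed into space grows with volume (cubically in the side length), while an ideal quantum register's state space grows as 2^N in the number of qubits, so the exponential overtakes the cube at quite modest N. A toy comparison:

```python
def classical_units(side):
    """Computing units in a cubic box of the given side
    length: capacity scales with volume, i.e. cubically."""
    return side ** 3

def quantum_dimensions(n_qubits):
    """Dimension of the state space of an n-qubit register:
    it doubles with every added qubit."""
    return 2 ** n_qubits

# The exponential wins quickly: by n = 20 the quantum state
# space dwarfs the cubic scaling by two orders of magnitude.
for n in (10, 20, 30):
    print(n, classical_units(n), quantum_dimensions(n))
```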

Pandoran Boxes

"I think you're asking a question I've been wondering about ever since this quantum crap started, namely that they were promising too much to be credible. Quantum computation apparently grows (from what I've read) non-linearly as qbits are added [*], yet accuracy is always the problem. I've long wondered if there's a fundamental link between computation and accuracy in that there's an upper limit to one which, if exceeded, starts to eat into the second. But no-one's ever raised this point that I've seen. Anyone here know any better?" ...... Anonymous Coward Posted Thursday 5th May 2011 01:16 GMT

AC, Hi,

As has been explained, is quantum computation, by the very flexible and liquid nature of its elementary and component parts [its qubits and qubytes], on an altogether different plane of Command and Control, and you may like to consider that it is not at all concerned with accuracy but rather more developed and driven by truth.

A quantum computer is a virtual machine [is it not], and something ethereal which just delivers advanced intelligence and core information on which to build future programs ....... in SMARTer Cloud Networks and Semantic Webs?

It is certainly something which is being trialled and live betatested in earnest for virtual reality production, by others........ http://amanfrommars.blogspot.com/p/ai-and-its-virtual-os.html

Thermodynamics

There's definitely a link between computation and energy consumption - it takes a certain amount of energy to "know" a bit of information at a given temperature. Can quantum machines bypass this limit?

Or, as with the universe's speed limit, does Mother Nature always have a gotcha no matter how clever you try to be?
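The limit being alluded to is Landauer's bound: erasing one bit of information dissipates at least kT·ln 2 of energy at temperature T. At room temperature that works out to a few zeptojoules per bit (the 300 K figure below is just an illustrative choice):

```python
import math

BOLTZMANN = 1.380649e-23  # J/K (exact value in the 2019 SI)

def landauer_limit(temperature_kelvin):
    """Minimum energy in joules to erase one bit at the given
    temperature, per Landauer's principle: k * T * ln(2)."""
    return BOLTZMANN * temperature_kelvin * math.log(2)

# At room temperature (~300 K) this is roughly 2.9e-21 J per
# bit -- far below what any practical computer dissipates.
print(landauer_limit(300.0))
```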

Well...

Here's my take:

Consider the quantum computer as an analog computer. You initialize its N-qbit array [giving you a vector of length 1 in a 2^N-dimensional space of complex values, each dimension corresponding to one of the 2^N N-bit patterns and the squared magnitude of the vector's component along that dimension giving the probability of getting that bit pattern on readout], then you let it run for a while without touching it [the Hamiltonian implied by the hardwiring of the computer will evolve the vector to another one, keeping its length constant], then you read it out [the vector is projected into a subspace, as the probability of getting certain bit patterns obviously drops to 0 if you don't get them on readout].

This procedure is inherently probabilistic! You may not get a solution to your problem at the end. In that case, one has to run the computation again. This should be manageable.
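The vector picture above can be sketched in a few lines of numpy: initialize a 2^N-dimensional state, evolve it with a unitary (a Hadamard on one qubit here, chosen purely as a stand-in for "the Hamiltonian implied by the hardwiring"), square the amplitudes to get readout probabilities, and sample an outcome:

```python
import numpy as np

n = 2                                   # two "qbits"
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                          # initialize to |00>

# Stand-in unitary: Hadamard on the first qubit, identity on
# the second. Real hardware's evolution is fixed by its wiring.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
U = np.kron(H, np.eye(2))
state = U @ state                       # length stays 1

probs = np.abs(state) ** 2              # Born rule: |amplitude|^2
print(probs)   # 0.5, 0, 0.5, 0 for |00>, |01>, |10>, |11>

# Readout projects onto one bit pattern, drawn by probability --
# the "inherently probabilistic" step: rerun if you're unlucky.
outcome = np.random.choice(2 ** n, p=probs)
```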

The "projection" or "readout" may happen early. If interaction with the surroundings occurs, your computation will be messed up. This also seems to be manageable through "quantum error correction", for which there are, I hear, some nice encouraging theorems. They say something about how the computer has to be built to reduce the impact of early readouts.

The above is related to the question of "how many qbits can you actually harness before your computer becomes seriously classical". "Becoming classical" apparently happens exponentially fast as you scale up (the probabilities of being in a not-yet-projected state go to ~0 really fast, which is why Schroedinger's cat is alive or dead but never both [a superselection rule for large systems, see also: http://en.wikipedia.org/wiki/Superselection]). So at some point, your "quantum error correction" will no longer manage to keep the system in superposition... but I don't know when that happens. Probably well beyond 1000 qbits.

Love it

Maths, tsk, tsk...

"factoring very large prime numbers" is really easy. Any prime has 2 factors, 1 and itself, regardless of its size. I think you mean either testing very large numbers for prime-ness or factoring a large number into primes - both of which quantum computers should be good at.
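The distinction matters because the two tasks have wildly different costs: primality testing is classically fast, while factoring a large composite into primes is the hard problem Shor's quantum algorithm attacks. A naive trial-division sketch of both (real implementations would use Miller-Rabin and far better factoring methods):

```python
def is_prime(n):
    """Trial-division primality test -- fine for small n;
    production code uses probabilistic tests like Miller-Rabin."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def prime_factors(n):
    """Factor n into primes by trial division -- exponential in
    the bit-length of n, which is why RSA can rely on factoring
    being hard for classical machines."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

print(is_prime(2 ** 13 - 1))   # True: 8191 is a Mersenne prime
print(prime_factors(15))       # [3, 5]
```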

Citation

Here's a link to the relevant abstract, Anonymous Coward :

http://prl.aps.org/abstract/PRL/v106/i10/e100401. Let us hope that the next time the Reg decides to report on matters of this sort, an author qualified to understand the subject matter will be chosen!...

arxiv version

backslash?

Hate to be a dweeb (hmm, no I don't), but shouldn't that be a forward slash ('/') in the formula and not a backward slash ('\')? Methinks we have a Windows user who thinks that '\' is the normal slash. It's not - think of the idea of "A or B", sometimes written as "A/B" - that's a regular, forward slash.

Re: backslash?

Thank you for the dweeby well-spotted typo in the formula. In fact, there wasn't meant to be a slash at all, but in copying the formula from an e-mail, some of the symbols got scrambled. The \ was nothing more than an artefact of this, and I've removed it.

You'll have to be happy with 'almost'

Our universe is probabilistic, not deterministic, so our computers perforce are the former, not the latter. They too get right answers 'most of the time'. There is no 'always' about it, nor can there ever be.

Go Stanisław!

So let me get this right...

With your quantum computer you can throw in your fiendishly complex problem, and it will calculate it in a flash. But then it takes forever to actually read the answer... Wouldn't it have been just as good to skip the quantum computer in the first place?

Okay, so these guys have managed to speed the whole thing up, by taking a guess effectively. Just guess the answer in the first place and save yourself the hassle. That leaves you with a load of time on your hands, and a grant for a q-computer to spend. Pub o'clock?