Posted
by
samzenpus
on Thursday June 19, 2014 @07:09PM
from the you're-doing-it-wrong dept.

sciencehabit writes The D-Wave computer, marketed as a groundbreaking quantum machine that runs circles around conventional computers, solves problems no faster than an ordinary rival, a new test shows. Some researchers call the test of the controversial device, described in Science, the fairest comparison yet. "...to test D-Wave’s machine, Matthias Troyer, a physicist at the Swiss Federal Institute of Technology, Zurich, and colleagues didn't just race it against an ordinary computer. Instead, they measured how the time needed to solve a problem increases with the problem's size. That's key because the whole idea behind quantum computing is that the time will grow much more slowly for a quantum computer than for an ordinary one. In particular, a full-fledged 'universal' quantum computer should be able to factor huge numbers ever faster than an ordinary computer as the size of the numbers grows." D-Wave argues that the computations used in the study were too easy to show what its novel chips can do.

No. They are trying to measure how solution time grows with problem size. So it's not important which one is absolutely faster, but which one takes relatively more time as the problem becomes more difficult.

The conventional computer should take exponentially longer as the problem becomes more difficult. The quantum one should not.
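To make that concrete, here is a toy Python sketch of how a growth rate, rather than raw speed, can be read off from measurements. The timings are entirely synthetic (nothing from the actual study): log(time) grows linearly in n for exponential scaling, but linearly in log n for polynomial scaling.

```python
import math

def classify_growth(sizes, times):
    """Crude empirical scaling test: if log(time) is linear in n the
    growth is exponential; if it is linear in log(n) it is polynomial.
    We compare the correlation of log(t) against n versus against log(n)."""
    def corr(xs, ys):
        mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
        num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        den = math.sqrt(sum((x - mx) ** 2 for x in xs) *
                        sum((y - my) ** 2 for y in ys))
        return num / den
    logt = [math.log(t) for t in times]
    c_exp = corr(sizes, logt)                          # fits t ~ b**n
    c_poly = corr([math.log(n) for n in sizes], logt)  # fits t ~ n**k
    return "exponential" if c_exp > c_poly else "polynomial"

# Entirely synthetic timings for two hypothetical solvers:
sizes = [8, 16, 32, 64, 128]
print(classify_growth(sizes, [2 ** n * 1e-6 for n in sizes]))  # exponential
print(classify_growth(sizes, [n ** 2 * 1e-3 for n in sizes]))  # polynomial
```

Note that the exponential solver is absolutely faster at small sizes here, which is exactly why the study looks at growth rather than raw timings.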

In this test, both took exponentially longer. So either the D-Wave doesn't work, or, as the manufacturer has claimed, the problems were not set up to demonstrate the class of problems where the D-Wave will show better performance relative to problem complexity growth than a conventional computer.

Seems odd to me though that they can't provide easily verified sample problem spaces where their device works better than a conventional PC as the problem gets 'bigger'.

In this test, both took exponentially longer. So either the D-Wave doesn't work, or, as the manufacturer has claimed, the problems were not set up to demonstrate the class of problems where the D-Wave will show better performance relative to problem complexity growth than a conventional computer.

Or the maximum problem size handled by the machine is too small to see the sub-exponential growth. Complexity theory deals with asymptotic performance on very large inputs, and, if I remember correctly, the D-Wave machine is limited to a small problem size (a hundred or so qubits max).

I'm not sure I get this argument. The guys selling this stuff have said for a while that their device is fast enough at quantum annealing to be useful for learning to program quantum computers, and that when their manufacturing ramps up they'll have many more qubits, and I think the implication is that the speed doesn't scale linearly. They were telling the Googles and the Lockheeds, 'look you need to invest in our product and services so you can be ready in the quantum computing space when the better hardware emerges'.

That it's not absolutely faster than a conventional computer at this point is interesting, academically, but not terribly relevant to their sales pitch, unless we can show that the problem at hand fits inside their limited qubit space and the types of algorithms it's supposed to be able to handle at this point, and still does not do what's expected of it.

Also: did a tiny Canadian computer company produce a computer that's as fast (within the problem space) as a modern Xeon on their slim budget? That would almost seem revolutionary - AMD can't even do that with GlobalFoundries' fab on their side.

Maybe it is a scam, but this kind of analysis seems somewhat orthogonal to their claims. By all means, pop one open and find the i7 inside, and there won't be any question, but that's not really where we are today.

I'm not sure I get this argument. The guys selling this stuff have said for a while that their device is fast enough at quantum annealing to be useful for learning to program quantum computers, and that when their manufacturing ramps up they'll have many more qubits, and I think the implication is that the speed doesn't scale linearly. They were telling the Googles and the Lockheeds, 'look you need to invest in our product and services so you can be ready in the quantum computing space when the better hardware emerges'.

Where do you get this interpretation from? I do not see anything like that on their website. I have, however, seen them publish spurious performance comparisons, which seems like an odd choice if their intent is solely to provide a test system. If they are fully aware that their system cannot outperform even a cheap traditional computer, perhaps they should stop making performance claims. Of course then the only thing they have to sell is the unverified claim that their machine is actually doing quantum annealing.

Well, they still haven't provided an actual use for a computer they are selling for actual use.

If you think they're not selling it for actual use as better than traditional computing, just go to their fucking website and see.

And AFAIK, it's not disclosed what the auxiliary computers running the quantum annealing chip do. Point being, we're not allowed to open it up and look at everything that happens, AND it has plenty of old-school computing power attached to it.

"The D-Wave machine is not a universal quantum computer, however, but a more limited 'quantum annealer,'" which according to Wikipedia seems to mean some sort of global minimum finder (given a way to find all the local minima, find the lowest one).

With a mere 512 qubits available on the D-Wave device, I'm more than willing to believe they may still be in the regime of small inputs where an O(n) algorithm can still beat an O(log n) algorithm (e.g. http://cse.csusb.edu/dick/cs20... [csusb.edu] )
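To illustrate that point with made-up constants: a method with worse asymptotic complexity but cheap steps can still win at sizes around 512. The cost models below are purely hypothetical.

```python
import math

# Illustrative cost models with made-up constants: a cheap O(n) scan
# versus an O(log n) method with heavy per-step overhead.
def linear_cost(n):
    return 1.0 * n               # tiny constant per element

def log_cost(n):
    return 500.0 * math.log2(n)  # big constant per step

# At n = 512 (the D-Wave's qubit count) the "worse" linear method still
# wins; the better asymptotics only pay off at much larger sizes.
for n in (512, 10 ** 6):
    winner = "O(n)" if linear_cost(n) < log_cost(n) else "O(log n)"
    print(n, winner)   # O(n) wins at 512; O(log n) wins at 10**6
```

The same crossover logic applies to any comparison between a 512-qubit annealer and a heavily optimized classical code.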

That's been the big question with D-Wave all along. What does it really do, how does it really work, what's it good for, is it real?

Everybody knows what a universal quantum computer is good for - running Shor's algorithm to do factoring and totally wrecking public-key cryptography, plus whatever other problems people care about in the real world. But general-purpose quantum computers so far can't keep enough qubits entangled together to factor numbers bigger than 21 = 3x7, and if anybody's figured out how to factor anything significantly bigger than that, they're keeping it Really Well Hidden (either because they're a government, or because a government will want them to do stuff, or because a government will want them killed.)
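For the curious, the classical post-processing half of Shor's algorithm is easy to sketch. Only the period-finding step (done here by brute force) is what a quantum computer would accelerate; this is a generic Python illustration, not something a quantum annealer runs.

```python
from math import gcd

def order(a, n):
    """Brute-force the multiplicative order r of a mod n; this is the
    step a quantum computer speeds up exponentially via period finding."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_part(n, a):
    """Classical post-processing of Shor's algorithm for odd composite n.
    Returns a non-trivial factor of n, or None if this choice of a fails."""
    g = gcd(a, n)
    if g != 1:
        return g             # lucky: a already shares a factor with n
    r = order(a, n)
    if r % 2 == 1:
        return None          # odd period: try another a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None          # trivial square root: try another a
    return gcd(y - 1, n)

print(shor_classical_part(21, 2))   # prints 7, since 21 = 7 x 3
```

With a = 2 the order mod 21 is 6, and gcd(2**3 - 1, 21) = 7 pops out the factor, which matches the record-setting quantum factorization of 21 mentioned above.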

Meanwhile, D-Wave has 512 qubits that they claim they'll be able to do something with, and maybe it'll have a chance of being cool or useful. And maybe if you kick in enough megabucks to get a non-disclosure agreement, you'll be able to get some information beyond vague quantumy handwaving. They are the only game in town, after all.

Except there have been quantum annealing algorithms for classical computers for some time now, and if the chip can't show any signs of being better at doing so, it isn't justifying the costs even for research purposes. If they could demonstrate some improved scaling, that might be good for getting investment in further development, PR for the company, and for pure research purposes. But it wouldn't be of much use, even as something to try out for programming purposes when the algorithms, even quantum in n
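The classical baseline being referred to fits in a few lines. Below is a minimal sketch of simulated annealing on a toy Ising model; the spins, couplings, and cooling schedule are all illustrative choices, not anything from the study.

```python
import math
import random

def simulated_annealing(h, J, steps=20000, seed=0):
    """Minimal classical simulated annealing for an Ising energy
    E(s) = sum_i h[i]*s[i] + sum_{(i,j)} J[i,j]*s[i]*s[j], s[i] in {-1,+1}.
    This is the sort of classical baseline the D-Wave gets raced against."""
    rng = random.Random(seed)
    n = len(h)

    def energy(s):
        e = sum(h[i] * s[i] for i in range(n))
        return e + sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())

    s = [rng.choice((-1, 1)) for _ in range(n)]
    cur_e = energy(s)
    best, best_e = s[:], cur_e
    for t in range(steps):
        temp = max(2.0 * (1 - t / steps), 1e-6)  # linear cooling schedule
        i = rng.randrange(n)
        s[i] = -s[i]                             # propose one spin flip
        new_e = energy(s)
        # Metropolis rule: always accept downhill, sometimes accept uphill
        if new_e <= cur_e or rng.random() < math.exp((cur_e - new_e) / temp):
            cur_e = new_e
            if cur_e < best_e:
                best, best_e = s[:], cur_e
        else:
            s[i] = -s[i]                         # reject: undo the flip
    return best, best_e

# Toy instance: a 3-spin ferromagnetic chain; ground-state energy is -2.
state, e = simulated_annealing([0, 0, 0], {(0, 1): -1, (1, 2): -1})
print(e)   # -2, with all three spins aligned
```

Any hardware annealer has to beat code like this (heavily optimized) before it justifies its price tag, which is the point being made above.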

I've been pretty sure it's a scam for some time now. They have yet to demonstrate anything quantum about it at all, nor can they produce problems that it solves faster than a conventional computer. This is not the first time.

No, you said that they haven't yet demonstrated anything quantum about it, and I provided you with a link proving otherwise.

It may not be more useful, compact, or flexible than an existing well-known and well-optimized method of completing the same task, but neither were early electromechanical machines or some of the earliest digital computers.

In order to qualify it as a failed project or a scam you'd need to clearly demonstrate that it doesn't do what it claims to do, not that it doesn't do it as well as

You link to shill rubbish. The D-Wave machine is just another example of doing simulated annealing poorly, which is why classical machines can solve the fabricated, invented D-Wave benchmark problem by means that scale without limit.

I do think D-Wave honestly believes their ideas will work, so you would not be able to prove fraud. Also, it is very hard for a new technology to catch up with existing tech when that tech is still doubling in performance about every two years.

Using quantum annealing to solve non-linear multivariate optimization problems is theoretically faster than using traditional Turing computation. It definitely needs more development to overcome a normal workstation or supercomputer, but it will most likely happen eventually.

In which case you'd think somebody could come up with a non-linear multivariate optimization problem to prove its superiority. (Being able to solve those things efficiently would be cool.) I'm a bit of an empiricist: if somebody can come up with good evidence that the D-Wave can solve one of those with lower asymptotic complexity than a properly programmed supercomputer, I can be convinced.
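The sort of problem being asked for can at least be stated concretely. QUBO (quadratic unconstrained binary optimization) is the standard formulation quantum annealers target; here is a brute-force Python sketch that serves as exact ground truth at small sizes, with a made-up 3-variable instance.

```python
from itertools import product

def qubo_bruteforce(Q):
    """Exhaustively minimize x^T Q x over binary vectors x, i.e. the kind
    of non-linear multivariate optimization a quantum annealer targets.
    Exact, but the cost doubles with every added variable."""
    n = len(Q)
    best_x, best_v = None, float("inf")
    for x in product((0, 1), repeat=n):
        v = sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))
        if v < best_v:
            best_x, best_v = x, v
    return best_x, best_v

# Hypothetical instance: rewards x0 and x2, penalizes the pair (x0, x1).
Q = [[-1, 2, 0],
     [0, 0, 0],
     [0, 0, -1]]
print(qubo_bruteforce(Q))   # ((1, 0, 1), -2)
```

An empirical demonstration would show a device's time-to-solution on instances like these growing more slowly with n than the best classical solver's, which is exactly the evidence the parent asks for.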

The drawing board. I always thought the quickest computer was one that contained the most cores specialized for specific types of calculations and specific operations. We seem to have slowed down in that regard, getting more cores but not operation-specific cores, though we are also getting more functions inside the CPU itself rather than scattered all over the motherboard, with data having to travel huge distances.

I always thought the next big speed jump would be stabilising the OS and GUI and getting them inside the CPU, for much faster boots and better multi-tasking, i.e. all OS functions running parallel to and separate from application functions (the advantage of a FOSS OS being more general acceptance). Endless upgrades driven by M$ wanting to sell the same software over and over again seem such a waste.

Vacuum tubes weren't particularly quantum, as long as you don't count "electron acting like a charged particle" as quantum and are dealing with large enough currents that you don't care about counting the precise timing of individual ones. Basic electrical forces do the job fine.

Transistors may be doing quantum stuff, and tunnel diodes are the classic quantum thing.

The electron emission part is a quantum effect. It can't be explained in purely classical terms. Fortunately the end result can be reduced down to a couple of handy equations simple enough for non-quantum-physicists to make use of the tubes.

Heisenberg's Uncertainty Principle: If they knew for a fact their claims were fraudulent, they would not know what their claims were. Since they in fact know that they are claiming to have a quantum computer, they cannot know whether the claim is fraudulent or not.

This could mean that D-Wave isn't quantum. Or it could mean that quantum computing in general isn't faster than normal computing. I seem to recall some physicist making a bet that quantum computing would be proved equivalent to classical computing.

Actually, the only real reason to do research in that direction is to verify some aspects of quantum theory. Quantum computers (if possible) face extreme scalability challenges that are basically sure to prevent scaling to anything really useful. Remember that QCs cannot do parallel computing with several machines, cannot do problem division, and cannot do larger numbers of sequential steps. In short, they cannot do any of the things that allow conventional computers to scale.

I've been surprised time and again that D-Wave has kept afloat as long as it has. It *will* fail in the end, the only question is how much of that investment money Geordie Rose got safely stashed before the collapse. Their approach is fundamentally not quantum computing.

Because the machine costs ten million dollars and the people selling it are obviously not going to publish information that portrays their machine in a bad light. Very few people have access to these, and those who do often have a vested interest in convincing people the machine is worthwhile.

Because nobody actually competent in CS wants anything to do with this scam. This was done by a physicist, and physicists do not face the risk of being regarded as incompetent computer scientists when associating with this thing.

I think it's a pretty nonsensical claim. It is not a quantum computer but only 'a computer that uses quantum effects.' Well, transistors use a quantum effect, so we could really make the same claim about any computer.

I have done work on the edges of quantum computation, and have to say that I think that real quantum computers are still miles off. The model I am working on is totally fringe - but it emerged from looking at the human brain. Now there is a machine that is a real 'quantum computer' and works at r

A lot of hype about these quantum computers, with D-Wave in particular making claims. There's a limited class of problems that that type of computing is applicable to, and these things are not the equivalent of a general-purpose computer.

Way too easy? Well, let them supply a spectrum of appropriate quantum problems (and not just the subset their hardware's quantum-like effect can handle) so that it can be properly demonstrated.

What we need to know about is the existence or non-existence of unfair comparisons, i.e., problems that favor the putatively "quantum" computer.

Since I don't expect a quantum computer to be faster at everything, then finding a bunch of solutions to problems that aren't any faster on the "quantum computer" doesn't prove anything, even if the problems look like the kind of problems you'd hope would be quantum-computery. There's not much more you can do than point to the absence of evidence when the burden of proof isn't on you.

The burden of proof is on the vendor here, and the standard of "proof" is conceptually simple, at least: demonstrate that for some task this device offers any practical advantage whatsoever over the best available conventional technology. That could be in absolute performance against the best available tech (e.g. ASICs and supercomputers), in relative performance over similarly priced systems, or in some practical measure other than performance, such as power consumption. Any clearly identifiable and verifiable advantage counts as positive proof the vendor has something worth paying attention to.

Of course even comparable performance by a novel architecture on some class of problems is interesting, because of the huge advantages a mature technology enjoys. Performance of a new design even in the same ballpark as a mature design suggests future improvements might be in the works. But it's only a suggestion.

The burden of proof is on the vendor here, and the standard of "proof" is conceptually simple, at least: demonstrate that for some task this device offers any practical advantage whatsoever over the best available conventional technology. That could be in absolute performance against the best available tech (e.g. ASICs and supercomputers), in relative performance over similarly priced systems, or in some practical measure other than performance, such as power consumption. Any clearly identifiable and verifiable advantage counts as positive proof the vendor has something worth paying attention to.

That's why this whole thing smells to me.

They're the vendor. They have access to the hardware. They're the experts on how it works.

So, release some benchmarks. Release some source code to a problem the machine does well at, demonstrating that it behaves like a quantum computer, even if it is only for certain problem sets. Release your methods so that anybody else can run them on the machine they buy from you.

If I claimed that I had a machine that could turn lead into gold, and for $20M you can buy one a

But the makers of the computer can't find a single problem it solves well. Why is that?

Aside from profit, why is that question even relevant? It took a century for the heliocentric model to give more accurate results than the old geocentric model. Here we appear to have quite a few independent observers who know quantum annealing when they see it; I am not one of them. Sure, it could be a scam, but so far I have seen zero evidence supporting that hypothesis.

It took a century for the heliocentric model to give more accurate results than the old geocentric model.

But the "benefits" were immediate. The geocentric model was complicated and incomplete. The heliocentric was "better" day-1 for some things. But the accuracy was more a limit of the measurements at the time. It also took time for others to come on board. People were afraid that if they admitted the heliocentric model was better, they'd be sent to Hell.