greg65535 (1209048) writes "Following the trend of on-line coding playgrounds like JSFiddle or CodePen, Google researchers unveiled the first browser-based, GPU-powered Quantum Computing Playground. With a typical GPU card you can simulate up to 22 qubits, write, debug, and share your programs, visualize the quantum state in 2D and 3D, see quantum factorization and quantum search in action, and even... execute your code backwards."
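For a rough sense of where that 22-qubit figure comes from: a state-vector simulator stores one complex amplitude per basis state, so memory grows as 2^n. A back-of-the-envelope Python sketch (illustrative numbers only; the playground's actual precision and memory layout are my assumptions, not anything from TFA):

```python
# Memory needed to hold a full state vector of n qubits, assuming
# single-precision complex amplitudes (2 x float32 = 8 bytes each).
def state_vector_bytes(n_qubits, bytes_per_amplitude=8):
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (22, 30, 40):
    print(f"{n} qubits: {state_vector_bytes(n) / 2**20:,.0f} MiB")
# 22 qubits ~ 32 MiB (fits easily on a typical GPU); 30 qubits ~ 8 GiB;
# 40 qubits ~ 8 TiB -- the exponential wall arrives fast.
```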

In chaos theory, things appear random because they are deterministic but you don't have perfect information to calculate the result; your lack of information introduces the apparent randomness. Dice, for example, fall based on their mass, their momentum, air density, the shape and material properties of the surface, etc. Those quantities are in turn determined by how the dice are thrown, by the temperature, humidity, and make-up of the air, and so on. If you could know all of the
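A minimal illustration of the point, with the logistic map standing in for the dice (my own toy example, not anything from TFA):

```python
# Deterministic chaos: two runs of the logistic map x -> r*x*(1-x) that
# start a hair apart diverge quickly. Imperfect knowledge of the initial
# state is all it takes for a deterministic system to look random.
r = 3.99
x, y = 0.500000, 0.500001  # the "same" throw, measured to six decimals
for step in range(1, 51):
    x, y = r * x * (1 - x), r * y * (1 - y)
    if step % 10 == 0:
        print(f"step {step:2d}: x={x:.6f}  y={y:.6f}  diff={abs(x - y):.6f}")
```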

Would a simple botnet be able to easily crack all encryption crackable by quantum computing, or are there better ways to go at it given a botnet?

Yes, it is crackable using a botnet simulating a quantum computer, in the same sense that you would be able to simulate a quantum computer solving the traveling salesman problem by using a botnet. Or by using a massively parallel supercomputer.

That is to say, the quantum computer simulation is Turing computable. This really doesn't help for anything more than trivial problems, much like pointing out the Halting Problem is decidable if you "simply" observe the Turing machine for the appropriate Busy Beaver [wikipedia.org] function's number of execution steps.

More succinctly, the simulation would gain you nothing over a direct parallel processing attack on the key space, and in fact the quantum computer simulation would add execution overhead that would reduce efficiency compared to straightforward brute force attacks.
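To put numbers on that: Grover's search needs about (pi/4)*sqrt(N) iterations, but a classical simulation pays O(N) work per iteration just to update the state vector, so it ends up roughly sqrt(N) times slower than naive brute force. A quick sketch (my own illustration, not from the playground):

```python
import math

# Classical brute force vs. a *simulated* Grover search over an n-bit key.
def brute_force_ops(n_bits):
    return 2 ** n_bits  # try every key once

def simulated_grover_ops(n_bits):
    N = 2 ** n_bits
    iterations = int((math.pi / 4) * math.sqrt(N))  # quantum iteration count
    return iterations * N  # each iteration touches all N amplitudes classically

for n in (16, 24, 32):
    print(f"{n}-bit key: brute force {brute_force_ops(n):.1e} ops, "
          f"simulated Grover {simulated_grover_ops(n):.1e} ops")
# The simulation is ~sqrt(N) times *worse* than direct brute force.
```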

It might be interesting if they introduced some user-selectable amount of simulated decoherence, though -- perhaps to allow for simulation of quantum error correction, etc. Looking at it locally, that could be non-unitary (though I'm not sure how much of the environment one would model for such a simulator). Fun stuff, in any event.
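For what "user-selectable decoherence" might look like: the usual way to model it is to work with density matrices and mix in noise via a channel such as depolarizing. A minimal sketch, assuming a single qubit and a simple depolarizing model (my choice of model, not anything the playground actually implements):

```python
import numpy as np

# Depolarizing channel on a single-qubit density matrix:
# rho -> (1 - p) * rho + p * I/2. Non-unitary for any p > 0.
def depolarize(rho, p):
    return (1 - p) * rho + p * np.eye(2) / 2

plus = np.array([[0.5, 0.5],
                 [0.5, 0.5]])  # |+><+|, a fully coherent state
for p in (0.0, 0.5, 1.0):
    print(f"p={p}:", depolarize(plus, p).round(3).tolist())
# The off-diagonal (coherence) terms shrink as p grows; at p=1 the state
# is maximally mixed -- exactly the kind of non-unitary behavior meant above.
```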

no, it's not ironic. the simulator is just a fucking project they're hosting because it's 'cool'. they are not investing anything in it beyond a smidgen of bandwidth and disk space, and they are not endorsing it.

they're also not "betting heavily" on D-Wave. it was a stab-in-the-dark, just-in-case thing which they could afford with the coins under Sergey's couch cushions, and despite that i wouldn't be surprised if they're still regretting how hopeless their investment turned out to be. D-Wave is bullshit.

Well, let's compare. Geordie Rose spent years and millions of dollars trying (and succeeding) to build a computational device that works on radically different principles than existing computer tech, is actually useful for a lot of real-world tasks, and consumes virtually zero power -- a huge feat in itself, even if it's not really a "quantum computer" in the traditional sense of the word. Whereas the people disagreeing with him are all ivory-tower academics who have not built and do not plan to build any hardware, the most egregious of whom is Scott Aaronson, who is known for his delusional rants on everything from neuroscience to fundamental physics. I wonder which one has their head grounded more firmly in reality.

But seriously, the fundamental principles of gate-based and adiabatic quantum computing aren't that different; it's more of a continuum where on one end you have highly decoherent classical behavior, on the other you have pure quantum behavior, and in the middle you have quantum-plus-noise behavior where tiny entanglements are generated and decohered on a timescale too short for gate-based quantum computing but long enough for adiabatic quantum computing. It's possible that as AQC technology matures, it will give better and better entanglement and eventually approach a pure quantum computer in capability.

I'm sorry if my post sounded like a commercial, it's just that I've done a lot of research on D-wave's hardware and it's really impressive what such a small team managed to pull off. At least they're doing something.

What's empty is your straw-man argument. Of course most academics do excellent work.

What the original poster claimed was that academics in the QC hardware business dismiss D-Wave. The most outspoken critic is a theorist. Is it too much to ask for a link to a more hardware-oriented academic going on the record with regard to D-Wave?

For a regular computer to be reversible it needs reversible logic gates. For example, a standard XOR gate loses one bit of information, so given the output you cannot construct the input perfectly (as there are two possible inputs for each output).

But the output from the opcode isn't stored back to both input memory locations at once; ergo, XOR itself is reversible at the chip level. Even if it writes back to one of the inputs, just XOR the output with the other input. You're conflating the theory of computation with the actual computation. In THEORY you can delete bits, but in practice you actually can't -- well, using the arrow of time created by sub-atomic entropy (quantum foam) you might be able to... but that will remain beyond your grasp for so
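Both posts are right as far as they go, and it's easy to demonstrate (toy Python, my own illustration):

```python
# XOR is its own inverse. If the chip overwrites one input with the result
# (as in "xor eax, ebx"), the surviving input lets you run it backwards.
a, b = 0b1011, 0b0110
out = a ^ b           # a's register now holds `out`; b is untouched
assert out ^ b == a   # reversible: recover a from (out, b)

# What loses the bit is discarding b as well: given only `out`, every
# candidate b' is consistent with some a' = out ^ b', so the original
# inputs are unrecoverable -- that's the one bit the XOR *gate* erases.
```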

It's because of dudes like you that I am cross with Scott A. He has every right to be critical, but his rhetoric is so over the top that he has created a kind of parallel universe that doesn't even allow for this kind of adiabatic quantum computation to be tried and tested.

Ignoring the typical Slashdot cynicism (and the lack of understanding often disguised as such), this is actually pretty damn neat! Quantum mechanics and gate-model quantum computing aren't intuitive, especially for people without a physics background, so this could really help with learning the fundamentals of quantum computing. Being able to visualize the state of the qubits at each step of the process as something other than a big formula is a pretty big deal.

Parent post is so full of (intentional?) disinformation that it hurts.

Why haven't we been doing this for decades? We have. The only novel part here is "in a web browser." Simulation is not a new concept. Any nondeterministic computing problem can be simulated by a deterministic machine, and vice versa.

Second, instruction runtime on the simulated machine does not correlate with the runtime on the physical machine -- at all. A deterministic machine can simulate a nondeterministic one in O(2^n) by trying every
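A concrete toy version of that deterministic simulation, where the "nondeterministic guess" becomes an exhaustive loop over all 2^n choices (my own sketch, not from the playground):

```python
from itertools import product

# Deterministic simulation of a nondeterministic machine: enumerate every
# possible "guess". Here the guess is an assignment to n boolean variables,
# so the deterministic cost is O(2**n) times the cost of checking one guess.
def satisfiable(n, clauses):
    """clauses: list of clauses; each clause is a list of (var, wanted) literals."""
    for assignment in product([False, True], repeat=n):
        if all(any(assignment[v] == wanted for v, wanted in clause)
               for clause in clauses):
            return assignment
    return None

# (x0 or not x1) and (x1 or x2)
print(satisfiable(3, [[(0, True), (1, False)], [(1, True), (2, True)]]))
```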

You wrote "512kiB". This is incorrect. It should be "512KiB". Although "k" is the prefix for "kilo-", there is no such thing as an "iB", so the use of "k" is inappropriate here. Note that the prefix "Ki" is for "kibi-" and it applies here to "B" for "bytes."
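For the record, the two prefixes really do denote different quantities:

```python
# SI "k" (kilo) is 10**3; IEC "Ki" (kibi) is 2**10. "kiB" mixes the two
# and isn't defined by either standard.
print("512 kB  =", 512 * 1000, "bytes")  # 512000
print("512 KiB =", 512 * 1024, "bytes")  # 524288
```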

Hold on a minute. If it's possible to simulate qubits using, at bottom, bits, and if qubits and quantum computing allow for performing NP calcs in polynomial time (and hence breaking crypto), then haven't we already been able to do all of these things for decades?