Quantum Computing: it just sounds cooler. Instead of using simple binary values of 0 or 1, quantum computers use quantum bits (qubits), values that can be both 0 AND 1. Stay with us here.

Scientists are using the outer electron of phosphorus (really) because an electron's spin gives it a magnetic moment, meaning (like a compass needle) it'll align with a magnetic field and hang out there in its base state. But with a little energy, you can flip that electron 180 degrees into the opposite direction, its highest energy state, coiled like a watch spring, ready to TWANG back to its base state.

Base state, highest state. How is that different from the binary 0 and 1?

Here's the rub. Quantum particles can hang out at BOTH states, at least until you try to measure that state. So which is it, base state or highest state? As Hamlet might have said, "To be zero, or to be one. That is the uncertainty." (Thanks a lot, Heisenberg.)

Instead of pairing up normal computer binaries into bits ("01" is an absolute, constant value), we resort to the ultimate in scientific precision: *best guess*. We know that a pair of qubits can have spins that are:

- both at base state,
- the first at base state and the second at highest state,
- the first at highest state and the second at base state,
- or both at their highest state.

What are the chances that it will be one or the other of these possibilities? These percentages become our coefficients of likelihood. Most immediately obvious is the fact that two qubits can be in four different positions at once, whereas two standard bits are always in exactly one of them (always 00, always 10...). Add a THIRD qubit and the set can be in eight different positions (as opposed to always 000, always 100...). And this is the really big deal: qubits can hold exponentially more data than binary bits!
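To make those "coefficients of likelihood" concrete, here's a minimal sketch (the amplitude values are made-up, equal-weight assumptions, not anything a real phosphorus qubit would necessarily give you). Each of the four spin combinations above gets one complex amplitude; the chance of seeing each combination is the squared magnitude of its amplitude, and those chances must add up to 1:

```python
import math

# Hypothetical two-qubit state: one complex amplitude per basis state.
# The four keys match the four spin combinations listed above.
amplitudes = {
    "00": 0.5 + 0j,  # both at base state
    "01": 0.5 + 0j,  # first at base, second at highest
    "10": 0.5 + 0j,  # first at highest, second at base
    "11": 0.5 + 0j,  # both at highest state
}

# The "coefficients of likelihood": squared magnitudes, summing to 1.
probabilities = {state: abs(a) ** 2 for state, a in amplitudes.items()}
assert math.isclose(sum(probabilities.values()), 1.0)

# n qubits need 2**n amplitudes -- the exponential growth described above.
for n in (1, 2, 3, 10):
    print(n, "qubits ->", 2 ** n, "basis states")
```

Note the scaling: describing 10 qubits already takes 1,024 numbers, which is exactly why "exponentially more data" is the headline claim.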

But lest we forget, once you measure the electrons, they settle into a fixed state (either base or highest) and the exponential advantages disappear. Nonetheless, we can turn uncertainty to our advantage.
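That collapse can be sketched in a few lines (again, an illustrative toy with made-up equal probabilities, not a physics simulation): "measuring" just picks one basis state at random, weighted by the coefficients of likelihood, and afterward you're left holding a single definite answer:

```python
import random

# Hypothetical equal-weight probabilities for the four two-qubit states.
probabilities = {"00": 0.25, "01": 0.25, "10": 0.25, "11": 0.25}

def measure(probs):
    """Sample one basis state with its given probability.

    After this, the superposition (and its exponential advantage) is gone:
    you have one fixed state, just like a pair of classical bits.
    """
    r = random.random()
    cumulative = 0.0
    for state, p in probs.items():
        cumulative += p
        if r < cumulative:
            return state
    return state  # guard against floating-point round-off

outcome = measure(probabilities)
print("Measured:", outcome)  # one definite state, e.g. "10"
```

The trick of quantum algorithms is to shove the probability weight toward the *right* answer before you measure, so that single collapsed result is the one you wanted.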

What this does: exponentially reduces the number of steps between a computational question and an answer.

What this doesn’t do: help your Netflix downloads finish faster.

Many aspects of our computing environments are classically absolute (i.e., this screen pixel is absolutely this color), and classical binary processors and processes are perfectly suited to such computing tasks.

Fine, we’ll have to accept our hedonistic greed for instant gratification, while we wait for technology to catch up and make Quantum Computing a reality. How much longer will we have to wait?

D-Wave Systems is the first to claim success, but its D-Wave One Quantum Computer isn’t universally accepted as the real deal. Nonetheless, D-Wave recently announced a collaboration between Google, NASA, and the non-profit Universities Space Research Association to build and install the first D-Wave Two at the Quantum Artificial Intelligence Lab at NASA's Ames Research Center in Moffett Field, CA.

In the quantum computing purist's corner, we have Serguei Kouzmine, a Russian physicist who has started a fund with the goal of investing in promising ideas that could lead to genuine quantum computers. The fund has already distributed $7 million to a handful of competing startups with different methodologies, but it could be some time before we see the results, perhaps even a few decades.

So don't count your Quanta, or qubits, before they've stopped spinning. Or more practically speaking, you might have better luck downloading the entire World Wide Web than waiting for a real quantum computer to download your movies.