THIS summer, physicists celebrated a triumph that many consider fundamental to our understanding of the physical world: the discovery, after a multibillion-dollar effort, of the Higgs boson.

Given its importance, many of us in the physics community expected the event to earn this year’s Nobel Prize in Physics. Instead, the award went to achievements in a field far less well known and vastly less expensive: quantum information.

Quote:

Classical computers use “bits” of information that can be either 0 or 1. But quantum-information technologies let scientists consider “qubits,” quantum bits of information that are both 0 and 1 at the same time. Logic circuits, made of qubits directly harnessing the weirdness of superpositions, allow a quantum computer to calculate vastly faster than anything existing today. A quantum machine using no more than 300 qubits would be a million, trillion, trillion, trillion times faster than the most modern supercomputer.
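To make that a little more concrete, here is my own toy sketch (not from the quoted article) of what a superposition looks like on paper: a qubit's state is just a pair of amplitudes, and an equal superposition gives a 50/50 chance of reading 0 or 1. The 300-qubit claim comes from the fact that describing n qubits takes 2**n amplitudes.

```python
import math

# A qubit's state is a pair of complex amplitudes (a, b) with |a|^2 + |b|^2 = 1.
# |0> is (1, 0); the Hadamard gate turns it into an equal superposition.
a, b = 1 / math.sqrt(2), 1 / math.sqrt(2)

# On measurement, the probability of each outcome is the squared amplitude.
p0, p1 = abs(a) ** 2, abs(b) ** 2
print(round(p0, 10), round(p1, 10))  # 0.5 0.5 -- "both 0 and 1" until measured

# An n-qubit register needs 2**n amplitudes to describe classically, which is
# why 300 qubits already outstrip any conceivable classical simulation:
print(2 ** 300)
```

Of course, this only shows the bookkeeping; the hard part the laureates were honored for is keeping real quantum systems in such states long enough to use them.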

I cannot say that I have the least understanding of qubits -- both my quantum mechanics and philosophy studies ended in the 1970s.

However, there's an interesting article on quantum cryptography in Ars Technica which I found fascinating -- this is a real-world application of quantum information technology in the early stages of deployment, utilizing a "pseudo single photon source".
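For a flavor of how quantum cryptography works, here is my own toy simulation of the sifting step of the BB84 protocol, the canonical quantum key distribution scheme (this is an illustration of the general idea, not code from the article; real systems send polarized photons, and an eavesdropper's measurements would disturb them detectably):

```python
import random

random.seed(0)  # fixed seed so the toy run is repeatable

# Alice sends random bits encoded in randomly chosen bases
# (0 = rectilinear, 1 = diagonal); Bob measures in his own random bases.
n = 1000
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.randint(0, 1) for _ in range(n)]
bob_bases   = [random.randint(0, 1) for _ in range(n)]

key = []
for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
    if a_basis == b_basis:
        key.append(bit)  # matching basis: Bob reads Alice's bit correctly
    # mismatched basis: the outcome is random, so the position is discarded

# The bases agree about half the time, so the sifted key is roughly n/2 bits.
print(len(key))
```

The quantum part, which this classical sketch cannot capture, is that any eavesdropper measuring the photons in transit unavoidably introduces errors that Alice and Bob can detect by comparing a sample of the key.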

The article is long, but hopefully those interested in either quantum computing or cryptography will find it of value: