Quantum goodness without either quantum memory or entanglement.

Whatever the message or the medium, the essence of communication is the transfer of data between two points separated in space. The distance can be small (between components inside a computer chip, or people sitting across a table) or very large (from Voyager 1 back to Earth), but the principles are the same. Quantum communication changes the nature of the data—it's the state of a quantum system—but that information still needs to be carried across space.

In most modern research, that process involves entanglement: linking the quantum state of the transmitter and receiver. Unfortunately, these quantum states are fragile. While a bit stored in computer memory is relatively stable, a quantum state can be altered by random interactions with its environment. But a new model for quantum communication has been proposed that would require neither memory nor entangling the quantum states of the transmitter and receiver.

W. J. Munro, A. M. Stephens, S. J. Devitt, K. A. Harrison, and Kae Nemoto have designed a system where quantum bits (qubits) are transferred by individual photons and interpreted using a special algorithm designed to provide plenty of redundancy and avoid data loss. Since the states of the transmitter and receiver are not entangled (or copied), they don't need to remain coherent, obviating the need for quantum memory. The actual data transfer could take place over fiber optic cables, and the receiver could itself be used as a transmitter, forming a repeater for larger networks.

While this is a conceptual model for the moment, if implemented, it could help resolve some of the problems in quantum communications, or at least provide a workable stopgap solution until entanglement-based networks come of age.

The data in a quantum system is encoded as its quantum state—all the relevant physical parameters, including photon polarization. (An atom's quantum state is typically very complicated, involving the nuclear properties and configuration of every electron. As a result, quantum computing and communication based on atoms often relies on isolating a handful of those properties, such as the spin of one electron.) Quantum communication involves sending this state information between points.

A great deal of the research on quantum communication and computing involves entanglement: the linking of the quantum states of two systems separated in space. Measuring the state of one of those systems reveals the state of the second system instantaneously, no matter how widely they are separated, which is why entanglement is so useful for quantum communication. The problem then becomes that quantum systems can "decohere" due to environmental influences, changing their state and losing the information.

(Since no data is exchanged through entanglement, it's still necessary to use additional communications channels to tell the receiver what to do with their measurement. Additionally, the act of preparing entangled systems usually involves exchanging a photon, which while very fast is far from instantaneous. So, quantum networks are still limited by the speed of light.)

Thus, quantum memory is essential for entanglement-based communication: it stores the quantum state for a sufficient period. These networks are limited by the speed of establishing the entangled transmitter-receiver pair, and by how long the states remain coherent. The proposed model bypasses that problem entirely by not requiring the transmitter's quantum state to maintain coherence any longer than necessary to interact with a messenger photon.

In the researchers' proposal, both the transmitter and receiver contain a simple matter-based quantum system, such as a single electron spin. The state of the transmitter system is prepared in a particular way (the data), which is then encoded in the state of a messenger photon, in the form of a redundant code. The photon can then be sent down fiber optic cable to the receiver, where interaction with a similar system extracts the state of the original. At no point are the two matter-based systems required to be in the same state (as in quantum teleportation) or entangled, so their relative configuration is irrelevant to communication. Similarly, even though the transmitting system is prepared in a specific state, it doesn't need to maintain that state once it has interacted with the photon.

A lot of the paper was devoted to a discussion of the redundancy algorithm, which I will happily abstain from explaining. Its purpose, however, is to make communication possible even with photon losses, which are inevitable in fiber optic communication.
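The actual code in the paper is more elaborate, but the basic idea, redundancy letting a message survive lost photons, can be sketched with a classical repetition code over an erasure channel. Everything here (the three-fold repetition, the 30 percent loss rate) is an illustrative assumption, not the paper's scheme:

```python
import random

def encode(bits, copies=3):
    # Repeat each bit several times so the message can survive losses.
    return [b for b in bits for _ in range(copies)]

def erasure_channel(symbols, loss_prob=0.3):
    # Each symbol (photon) is independently lost with probability loss_prob.
    return [s if random.random() > loss_prob else None for s in symbols]

def decode(received, copies=3):
    # A bit is recoverable as long as at least one of its copies arrived.
    out = []
    for i in range(0, len(received), copies):
        survivors = [s for s in received[i:i + copies] if s is not None]
        out.append(survivors[0] if survivors else None)
    return out

random.seed(1)
message = [1, 0, 1, 1, 0]
received = decode(erasure_channel(encode(message)))
print(received)  # most (often all) bits recovered despite ~30% photon loss
```

With three copies per bit and 30 percent loss, the chance that all three copies of a given bit vanish is only about 2.7 percent; real quantum error-correcting codes are far more sophisticated, but the trade-off (more redundancy, better loss tolerance) is the same.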

Without the need for entanglement or quantum memory, and with the redundancy algorithm, this model could potentially provide a much simpler method for quantum communication. The reduced number of steps would also mean greater speed, not for the communication itself, but for the preparation and analysis of the quantum systems involved. All of this would have big benefits for quantum computers, which will need to manage communications between different components (like the memory and processor), and have data shuffled into them, possibly from remote locations.

A practical implementation of this project hopefully will be forthcoming soon.

This sounds far more doable with current technology than the "spooky" kind of quantum information, although I'm trying to figure out if the article's subtitle is some sort of pun.

My understanding of the technique is something like making a painting of your data, and, while the paint is still wet, rolling a messenger all over it in a specific pattern. Send the messenger to a remote post and have him roll over a blank canvas in the opposite order to recreate the original painting. Obviously this skips several nontrivial steps, but it seems analogous.

My concern is that this type of setup would be vulnerable to man-in-the-middle attacks, the avoidance of which was the whole point of quantum communications.

Still, the data is stored by the quantum spin. But yeah, you only need to encode and decode, while you just send the information via "light", if you will. No need to store it if it's just sent away. It's a no-brainer...

I don't know if this is too much of a simplification, but I think if both types of communication are implemented, the unentangled method would be a fast and unreliable method, like UDP messages. The second method would be more complicated to use, but much more reliable, like TCP-IP messages. Maybe we can use both methods for different purposes.


That doesn't really work by analogy. The single biggest reason entanglement-based communication was being pursued wasn't reliability or performance, but rather the fact that it's theoretically much more secure, since it's impossible to eavesdrop without destroying the entanglement, and thus MITM attacks are easily detected/defeated. The new method still allows for communication of quantum states as is needed for general purpose quantum communication, but doesn't have the security benefit that a working form of entanglement would.


For there to be a security benefit, there would have to be working long-distance quantum communication; there's no room for a MITM in the lab.

Also, there is no need for this kind of security in intercomponent communications.

The reduced number of steps also would result in faster speed, not for the communication itself, but for preparation and analysis of the quantum systems involved. All of this would have big benefits for quantum computers, which will need to manage communications between different components (like the memory and processor), and have data shuffled into them, possibly from remote locations.

A practical implementation of this project hopefully will be forthcoming soon.

My guess is that a practical implementation will be forthcoming in the next 5 years.

I'm sorry, I read through all that, basically just to summarize what's already known? You didn't say one qubit about what they're doing, other than "we had this idea: don't use entanglement, just encode the state in photons and reconstruct at the other end." I feel like I got ripped off.

Correct me if I am wrong. Perhaps I have misunderstood what a photon is. From all the previous reading I have done on Ars and the wiki so far, I understood that when a photon collides with another photon inside the fiber optic, or under vacuum out in space and whatnot, it spins off into two or maybe multiple photons, and the original two photons that collided with each other die on collision. If that is the case, the memory bit(s) the photon was carrying died with it. And since there's no need for quantum memory, or whatever that means, the data would be lost in lalaland? Is that what the scientists are trying to prevent, the loss of data bits?

Unless I misread the article, this concept to me is no different from the scientists attempting to store a data bit in thin air. Why not a data bit in an oxygen atom?



Definitively wrong in parts. Photons do annihilate each other if they have the same waveform with the right phase shift and polarization. This is generally true for any kind of wave, although other waves, like sound, are simpler. Photon splitting is only possible when hitting, for example, an electron (Compton effect), resulting in energy state transfer and changes and emission, etc. Of course you lose information on the way. And if it's sound or video communication, losing 50 percent, every other bit, is not that tragic. If it's digital data, you can still have protocols like everything we have now, with CRC checks sent for each packet. Unless the data generation itself is not reproducible, like a live feed, in which case you can't get around to recording it anyway.
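For what it's worth, the CRC approach mentioned here is trivial to demonstrate with Python's standard `zlib.crc32`; the four-byte packet framing below is made up purely for illustration:

```python
import zlib

def make_packet(payload: bytes) -> bytes:
    # Append a 4-byte CRC-32 checksum so the receiver can detect corruption.
    return payload + zlib.crc32(payload).to_bytes(4, "big")

def check_packet(packet: bytes):
    # Return the payload if the checksum matches, or None if it was corrupted.
    payload, crc = packet[:-4], packet[-4:]
    if zlib.crc32(payload).to_bytes(4, "big") == crc:
        return payload
    return None

pkt = make_packet(b"hello")
print(check_packet(pkt))  # b'hello'

corrupted = bytes([pkt[0] ^ 0xFF]) + pkt[1:]
print(check_packet(corrupted))  # None: the flipped byte is detected
```

A CRC only detects corruption, of course; recovering from it means retransmitting, which is exactly what you can't do with an unreproducible live source.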


Beam me up Scotty. *beams* Oops, just lost the beam alignment.

Photons don't ever annihilate each other. Annihilation is the destruction of the incoming particles and the production of other, distinguishable outgoing particles (edit: or some other distinguishable form of energy, like vacuum energy, in the case of virtual pair annihilation). Photons are bosons, so that doesn't ever happen to them; they are completely transparent to each other (which is why light beams can't "collide").

What *does* happen to photons is interference, but that is not a conversion of energy from one form to another. Interference is simply the combined probability of a particle (photon) being measured at a particular location. Destructive interference can result in a zero probability of a photon being measured at a particular spot, but the energy of the photons is not "converted" or modified in any way; they continue to pass through each other, with a continuing non-zero probability of each photon being measured somewhere else farther along its path, which shows (by conservation of energy) that there is no "conversion" going on.
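That picture, probabilities coming from combined amplitudes with no energy converted, is easy to check numerically. Here is a toy superposition of two equal-amplitude waves with a pi phase shift (plain Python with complex amplitudes, not a real field calculation; the wavenumber and sample grid are arbitrary):

```python
import cmath

# Sample a stretch of two crossed, equal-amplitude waves; beam2 is
# phase-shifted by pi relative to beam1 (the fully destructive case).
k = 2 * cmath.pi * 5
xs = [i / 1000 for i in range(1000)]
beam1 = [cmath.exp(1j * k * x) for x in xs]
beam2 = [cmath.exp(1j * (k * x + cmath.pi)) for x in xs]

# Probability (intensity) of finding a photon: |A1 + A2|^2 at each point.
combined = [abs(a + b) ** 2 for a, b in zip(beam1, beam2)]
print(max(combined))  # ~0 everywhere: complete destructive interference

# Each beam on its own still carries its full intensity (|A|^2 = 1),
# so nothing was destroyed; the beams simply pass through each other.
print(sum(abs(a) ** 2 for a in beam1) / len(beam1))  # ~1.0
```

The combined intensity vanishes at every sampled point, yet each beam's own intensity is untouched, which is the distinction being made above between interference and annihilation.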

For example, you can cross two light beams in such a way that their phases cancel out (destructive interference) at the point of crossing, but the two beams still pass through each other and continue unchanged, which would not happen if annihilation were going on. You cannot take a beam of light and destroy it (edit: or change it in any way) with another beam of light crossing it.

edit: to be exactly correct, I suppose that at very high energies pairs of photons at the same location could undergo some kinds of conversion due to quantum tunneling, but I suspect those processes would be practically nonexistent at the kinds of energies that exist outside of accelerators on Earth.