Demolishing Heisenberg with clever math and experiments

Good, general measurement choices eliminate uncertainty.

Heisenberg's uncertainty principle turns up everywhere in quantum mechanics. The idea is that certain properties, like position and momentum, come in conjugate pairs: by measuring one of the pair, we generate uncertainty in the other. This is also referred to as quantum back-action: the thing you are measuring pushes back on the measuring system, which generates uncertainty in some other property. This fundamental idea has serious consequences when it comes to measuring very small things, like gravitational waves.

There is, however, an end run around Heisenberg's uncertainty principle. If you choose to measure properties that aren't paired, the precision of the measurement is limited only by how good your measuring stick is. It is important to realize that these measurements are not back-action free, but that the back-action is self-canceling. In the past, researchers have dismissed these measurements, called quantum non-demolition (QND) measurements, as either trivial or useless. Now, a paper in Physical Review X uses a more general description of QND, along with concrete examples, to show that it is useful and that it is already being applied.

Authors Mankei Tsang and Carlton Caves start by defining the idea of a quantum-mechanics-free subsystem. The idea is that, although you have a quantum system, you can describe a certain part of it with classical dynamics. Normally this would mean that there was enough noise to wash out the quantum behavior, but that is not what they mean. What they are referring to is that the way a quantum system changes is often continuous and precisely described by some mathematical representation; it is only the measurement of some property that carries uncertainty. So, if you can construct a measurement that has no intrinsic uncertainty, then the mathematical description looks like a classical one, where everything evolves predictably according to a set of equations.

That description appears to violate the very principles of quantum mechanics, but it cannot be dismissed out of hand, because Tsang and Caves are very precise in their wording. An entire quantum system cannot be described this way, only a part of it. The back-action that would create uncertainty does not vanish but instead turns up in some property that is not of interest. Everything that can be measured without uncertainty goes in the box labeled "quantum-mechanics-free," and everything else doesn't. This cannot be done arbitrarily, though. For instance, one cannot take a position and momentum pair and measure both without uncertainty. Instead, one has to measure some combination of positions and momenta that jointly have no uncertainty.

It is perhaps easiest to describe this in terms of one of their concrete examples: a harmonic oscillator. A harmonic oscillator can be described by the position and momentum of the particle undergoing oscillations. Any measurement of the particle's position destroys knowledge of its momentum, and vice versa. This is a quantum mechanical system.

The same system can also be described by two harmonic oscillators, one of which has a negative mass. Now, instead of measuring the position of one particle, we have four different possible measurements that form two pairs. One pair is given by the summed position of the two particles and their average momentum. The second pair is the separation between the particles and the difference in their momenta. When we examine these, we find that we can measure the difference in position and momentum of the two oscillators as accurately as we like, because the back-action affects the sum of the particle positions and their average momentum. This is the definition of a QND measurement, and some think it is useless because you only get the difference in position of the two oscillators, not the absolute position of either one.
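
A quick commutator check shows why this combination escapes back-action. This is only a sketch, and it assumes the convention in which the negative-mass oscillator's quadratures satisfy a sign-flipped commutation relation (as for a negative-frequency mode):

```latex
% Ordinary oscillator:       [\hat{x}_1,\hat{p}_1] = +i\hbar
% Negative-mass oscillator:  [\hat{x}_2,\hat{p}_2] = -i\hbar
\begin{aligned}
[\hat{x}_1 - \hat{x}_2,\ \hat{p}_1 - \hat{p}_2]
  &= [\hat{x}_1,\hat{p}_1] + [\hat{x}_2,\hat{p}_2]
   = i\hbar - i\hbar = 0,\\[2pt]
[\hat{x}_1 - \hat{x}_2,\ \hat{p}_1 + \hat{p}_2]
  &= [\hat{x}_1,\hat{p}_1] - [\hat{x}_2,\hat{p}_2]
   = i\hbar + i\hbar = 2i\hbar .
\end{aligned}
```

The difference variables then commute with each other, so both can be known at once, while each still has a conjugate partner among the sum variables, which is exactly where the back-action ends up.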

You might be thinking that a harmonic oscillator with a negative mass is entirely unrealistic, but nothing could be further from the truth. A concrete example is the mechanical vibration of a mirror that forms part of an optical cavity. The light in the optical cavity consists of a series of discrete colors: the light must travel an integer number of wavelengths as it makes the round trip from one mirror through the cavity and back to the starting mirror. These different colors, or frequencies, are referred to as the modes of the cavity. If the cavity mode spacing is chosen to match the natural frequency of the mirror's mechanical vibrations, then something very cool can happen. The movement of the mirror Doppler shifts the frequency of the light, and light starts to build up in the cavity at two colors: one slightly redder than the original and one slightly bluer.

Let's ignore the mirror and look instead at the light. In fact, let's jump right into the picture and ride along with the light at the center frequency. In this picture, that light vanishes because we can't see it oscillating anymore. Instead, we see two oscillators: the red- and blue-shifted light fields associated with Doppler shifts from the mirror. Riding along with the center mode, the red-shifted field has a negative frequency, and increasing its amplitude involves adding a unit of negative energy. In other words, it behaves exactly like an oscillator with a negative mass. The blue-shifted light field still requires positive energy to increase its amplitude, so it is a positive-mass oscillator.

As a result, measurements of the amplitude and phase of either one of these light fields—which would tell you the position and momentum of the mirror—are limited by the uncertainty principle. However, the amplitude difference and phase difference can be measured with no uncertainty whatsoever. From this, the force on the mirror can be derived without uncertainty (but not the absolute position or momentum). If we go back to gravitational wave sensing, this is exactly what you want to achieve.

As mentioned earlier, the featured example is gravitational wave sensing, where tiny fluctuations in the very fabric of space are to be sensed via the shifting of a mirror. Currently, these shifts are within the noise of the measurement system, and the systems are limited by the uncertainty principle. The idea of using squeezed light to improve these instruments is not new, and it fits right into this description of QND.

Those of you who know something about this will be thinking "there is nothing new here." In some respects, you are absolutely right; this paper doesn't present new results as such, and in fact the examples they use are all published. So what's special? Tsang and Caves have created a more general framework. They show how disparate examples, from electron spins in a magnetic field, to vibrating mirrors, to qubit implementations can all be described by the same mathematical ideas. Knowing this, it becomes possible to construct new experimental implementations that take advantage of these QND measurements.

51 Reader Comments

So what's special? Tsang and Caves have created a more general framework. They show how disparate examples, from electron spins in a magnetic field, to vibrating mirrors, to qubit implementations can all be described by the same mathematical ideas. Knowing this, it becomes possible to construct new experimental implementations that take advantage of these QND measurements.

The real test is when this framework is actually used to invent new things that might not have been invented otherwise. But still, this is nice: so many things in QM become easier when you remove the Q.

Yes, it is valuable to recognize this principle, and have a general framework for it.

But if it actually proved Heisenberg wrong, then "there is nothing new here" would not have been true even in the limited sense in which it is true. This shows the limits to Heisenberg, not that Heisenberg is wrong.

Entirely parenthetical question: So we can create negative mass and energy? The supposed inability to create negative energy fields is the reason the Alcubierre "Warp Drive" is said to be "impossible" to build. What am I missing here? My education is in engineering, not quantum physics, so a lot of this is over my head.

The measurements are relative to... what exactly? Our preconceived notions are based on assumptions that the universe behaves the same way from one galaxy to the next, one supercluster to the next, etc. Until we can actually prove that the universe is static and follows the same pattern everywhere, this does nothing to quash Heisenberg's principle.

Quote:

Historically, the uncertainty principle has been confused [4][5] with a somewhat similar effect in physics, called the observer effect, which notes that measurements of certain systems cannot be made without affecting the systems. Heisenberg offered such an observer effect at the quantum level (see below) as a physical "explanation" of quantum uncertainty.[6] It has since become clear, however, that the uncertainty principle is inherent in the properties of all wave-like systems, and that it arises in quantum mechanics simply due to the matter wave nature of all quantum objects. Thus, the uncertainty principle actually states a fundamental property of quantum systems, and is not a statement about the observational success of current technology.[7] It must be emphasized that measurement does not mean only a process in which a physicist-observer takes part, but rather any interaction between classical and quantum objects regardless of any observer.[8]

Since the uncertainty principle is such a basic result in quantum mechanics, typical experiments in quantum mechanics routinely observe aspects of it. Certain experiments, however, may deliberately test a particular form of the uncertainty principle as part of their main research program. These include, for example, tests of number-phase uncertainty relations in superconducting [9] or quantum optics [10] systems. Applications are for developing extremely low noise technology such as that required in gravitational-wave interferometers.[11]

Entirely parenthetical question: So we can create negative mass and energy?

No more than we can create antiparticles: some of them are the negative solution to Dirac's equation, which has "negative energy". What the article is talking about is a system described by an equation analogous to that of an oscillator with negative mass.

I think of the Heisenberg Uncertainty Principle as more the idea that trying to measure something means interacting with it, which causes issues. When dealing with things so small and "delicate", in his time and in the past, with "thick fingers", we could only measure one thing without affecting the other properties. More "delicate fingers" that don't interfere with the other properties can measure or ascertain more information.

It is something to understand, not a law... but a principle, whose limits in quantum-level dynamics might be pushed down to another level - whatever that might be - where we don't have such delicate ways to measure.

I think of the Heisenberg Uncertainty Principle as more the idea that trying to measure something means interacting with it, which causes issues. When dealing with things so small and "delicate", in his time and in the past, with "thick fingers", we could only measure one thing without affecting the other properties. More "delicate fingers" that don't interfere with the other properties can measure or ascertain more information.

"It is something to understand, not a law." No, if anything deserves to be called a physical "law," it would be Heisenberg's uncertainty principle. What you describe is often wrongly used as a way to explain the U.P., but that isn't it at all - the uncertainty principle is really just pointing out an effect of our model of physics called quantum mechanics. There is an equation that describes observables like momentum and position, or energy and time, and it is just like the equation describing waves - this is really the key idea in quantum mechanics. The UP is one way of saying, based on these equations, that if, for example, an event in space-time is constrained to a very small time period, then the possible energy can vary a huge amount - the model predicts an ever larger range of possible energies the smaller the time. Similarly for other paired observables. It has nothing to do with measurement in essence, and it is really the basis for a lot of quantum field theory - for example, the virtual particles that affect the charge or magnetic moment of an electron. Why can virtual particles exist? Because they last for a very short period of time, and thus over that very short period there is the possibility of a lot of energy. This comes directly out of the field equations, where it is actually fairly straightforward, but when one tries to translate that effect into some verbal model based on classical ideas, things sound funny.
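
In symbols, the trade-off described above is the standard uncertainty relation between conjugate pairs, here for energy and time, and for position and momentum:

```latex
% A smaller \Delta t forces a larger spread \Delta E, and vice versa
\Delta E \,\Delta t \;\gtrsim\; \frac{\hbar}{2},
\qquad\qquad
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}.
```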

So anyway, the uncertainty principle isn't really related to how well we observe; it is a limitation that is there in the first place. Really, this article itself is misnamed. It isn't demolishing the uncertainty principle; it is using the uncertainty principle, or rather the model on which it is based, to find better ways to observe certain things.

"This is also referred to as quantum back-action: the thing you are measuring pushes back on the measuring system, which generates uncertainty in some other property."

Heisenberg's microscope thought-experiment bites again. Or maybe it's Bohr's later (unsatisfactory) explanations in the Como lectures. It doesn't matter. Even the oldest coherent interpretation of quantum mechanics, the Copenhagen interpretation, rejected this explanation of the physics of uncertainty decades ago. Unfortunately, that hasn't seemed to stop its being trotted out in nearly every article in the recent glut of articles about Heisenberg's uncertainty principle.

I think of the Heisenberg Uncertainty Principle as more the idea that trying to measure something means interacting with it, which causes issues. When dealing with things so small and "delicate", in his time and in the past, with "thick fingers", we could only measure one thing without affecting the other properties. More "delicate fingers" that don't interfere with the other properties can measure or ascertain more information.

It is something to understand, not a law... but a principle, whose limits in quantum-level dynamics might be pushed down to another level - whatever that might be - where we don't have such delicate ways to measure.

I could be wrong as I'm not a physicist or mathematician in any way, but my understanding of the uncertainty principle is that it goes way beyond a simple matter of our instruments not being accurate enough.

From a poster above you -

"Thus, the uncertainty principle actually states a fundamental property of quantum systems, and is not a statement about the observational success of current technology.[7]"

At a basic level, what are you measuring? Photons are said to have uncertainty attached to either their position or their momentum; the position is somewhere along a wave. It goes against possibility that it is not there... so it is there. I understand the idea that it is the wave that makes the momentum; my thought is it is more like the nature of the photons.

Moving through transparent or not-so-transparent materials - if it were a wave, how would that have an effect? To me this says the momentum has placement along the path of the wave, that the momentum moves along the wave precisely, the momentum itself fixed along the wave path. So it is only our instruments that cannot measure both.

As the measuring abilities go up with new craft, our abilities to sense both momentum and position will narrow both down, and that itself says enough for me.

We noted that uncertainty is instead a fundamental property of quantum physics just 2 weeks ago. [ http://arstechnica.com/science/2012/09/ ... -inherent/ ] If you wish it is a Fourier-Fourier map constraint, perhaps best understood by such mappings in distribution theory where the support describes the probability densities involved.

The first correct uncertainty principle, Robertson uncertainty, which "demolished" Heisenberg uncertainty, was published as early as 1929, two years after Heisenberg's failed proposal. (See the first link.)
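
For reference, the Robertson relation mentioned here bounds the product of the standard deviations of any two observables by the expectation value of their commutator:

```latex
% Robertson (1929): holds for any pair of observables A, B
\sigma_A \,\sigma_B \;\ge\; \frac{1}{2}\,\bigl|\langle [\hat{A},\hat{B}] \rangle\bigr|
```

With $[\hat{x},\hat{p}] = i\hbar$, this reduces to the familiar $\sigma_x \sigma_p \ge \hbar/2$.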

Based on the physics all sorts of precision measurements can be done, as noted this part is not new. Besides measuring non-conjugate variables to desired precision, weak measurements can be used to establish precision statistically, or squeezed states can be used to establish one of the paired observables to desired precision. There is no "back-reaction" here, just handling observational constraints.

This work increases the toolbox, but hardly the basic understanding of uncertainty.

I understand likelihood and such. But I see likelihood has little to do with where it is or its properties. I reread the article link and I still don't see an intrinsic uncertainty in the properties of the photon.

In a state, it is so much momentum/energy at a place, with such a form. Usually we observe things through their interactions with other things. I see it as: as we move down to quantum sizes and energies, interactions change them. Maybe I'm not seeing something, but I still don't see how this principle is intrinsic to the properties of what is being measured, rather than only to the measurements.

@Mydrrin. You seem to be tripping over the common realization that the way things work at such a small scale is so far from what we normally observe that it is not intuitive. You have to look at the math. If you want to nail down the exact position of a particle, you are going to have to integrate the wave equations for all of its possible positions. As you sum over more and more possibilities to narrow down the position, you are simultaneously doing the opposite for the momentum wave equations. They are in effect reciprocal (I know that isn't mathematically correct; I'm trying to simplify). It does not matter how precise our measuring equipment gets. We could use a microscope from 10,000 years in the future. Unless we find something in the math that no one has anticipated, we will never be able to pin down both at the same time. It has nothing to do with what you measure with.
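
The "reciprocal" behavior described above is essentially the Fourier bandwidth theorem: the position-space and momentum-space wavefunctions are a Fourier-transform pair, so squeezing one spreads the other:

```latex
% Position and momentum wavefunctions form a Fourier pair,
% so their widths obey the bandwidth theorem
\psi(x) \;=\; \frac{1}{\sqrt{2\pi\hbar}} \int_{-\infty}^{\infty}
  \tilde{\psi}(p)\, e^{\,i p x/\hbar}\, \mathrm{d}p
\quad\Longrightarrow\quad
\sigma_x \,\sigma_p \;\ge\; \frac{\hbar}{2}.
```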

So basically they're measuring properties that have no uncertainty attached to them? That doesn't prove the U.P. wrong. It's just a workaround to get the data the observer needs to prove/disprove the hypothesis they're working on.

Great article by the way, definitely like these when they show up here. Thank you

I understand likelihood and such. But I see likelihood has little to do with where it is or its properties. I reread the article link and I still don't see an intrinsic uncertainty in the properties of the photon.

In a state, it is so much momentum/energy at a place, with such a form. Usually we observe things through their interactions with other things. I see it as: as we move down to quantum sizes and energies, interactions change them. Maybe I'm not seeing something, but I still don't see how this principle is intrinsic to the properties of what is being measured, rather than only to the measurements.

It's not an intrinsic property of the actual, physical photon. It's an intrinsic property of the QM model. The mathematics. At least according to the Copenhagen interpretation. An electron, for example, actually does have an exact momentum and position at any given time. We just can't know what they are based on the math.

In 1978, Carlton Caves, one of the authors of this paper, was my TA in Quantum Mechanics. Kip Thorne, who was interested in gravitational wave detection, taught the course. He said he had to understand QM better to figure out how to detect the waves, and to understand it he decided to teach the course. A lot of the course was devoted to measurement and uncertainty, and how you could get around the uncertainty principle for certain types of measurements. The ideas in this recent paper date back at least to Thorne and Caves in the 70's.

It was a really good course, somewhat too hard for me; I was the only non-physics major in the class. /offtopic

At least according to the Copenhagen interpretation. An electron, for example, actually does have an exact momentum and position at any given time. We just can't know what they are based on the math.

Hidden variables are not very popular these days.

I didn't mean anything about hidden variables. Just that the Copenhagen interpretation states QM isn't a description of objective reality. It's just a mathematical model. Thus, in objective reality an electron has a definite position and momentum. But in the model, it can't; QM is by necessity probabilistic.

Hidden variables theory states QM just isn't good enough. There must exist a deterministic model that does claim to describe objective reality.

Unless I misunderstood the reply? I haven't had my coffee yet.

Edit: That is, I was in no way advocating hidden variables, simply describing the difference between objective reality and a mathematical model.

At least according to the Copenhagen interpretation. An electron, for example, actually does have an exact momentum and position at any given time. We just can't know what they are based on the math.

Hidden variables are not very popular these days.

I didn't mean anything about hidden variables. Just that the Copenhagen interpretation states QM isn't a description of objective reality. It's just a mathematical model. Thus, in objective reality an electron has a definite position and momentum. But in the model, it can't; QM is by necessity probabilistic.

What you have just stated here turns out to be the essence of hidden variables -- that particles have an objective, deterministic reality, but our model doesn't capture it. What Copenhagen actually says is that the electron does not have a definite *anything* until a property is observed. At that point, the electron's wavefunction collapses and we can begin talking about extant properties of the electron. I think you're conflating Copenhagen's treatment of ensemble measurements with what it says about a single particle's wavefunction.

Incidentally, what happened to all the Bohmians? Five years ago they would have been all over the comments section of an article like this.

At least according to the Copenhagen interpretation. An electron, for example, actually does have an exact momentum and position at any given time. We just can't know what they are based on the math.

Hidden variables are not very popular these days.

I didn't mean anything about hidden variables. Just that the Copenhagen interpretation states QM isn't a description of objective reality. It's just a mathematical model. Thus, in objective reality an electron has a definite position and momentum. But in the model, it can't; QM is by necessity probabilistic.

What you have just stated here turns out to be the essence of hidden variables -- that particles have an objective, deterministic reality, but our model doesn't capture it. What Copenhagen actually says is that the electron does not have a definite *anything* until a property is observed. At that point, the electron's wavefunction collapses and we can begin talking about extant properties of the electron. I think you're conflating Copenhagen's treatment of ensemble measurements with what it says about a single particle's wavefunction.

Incidentally, what happened to all the Bohmians? Five years ago they would have been all over the comments section of an article like this.

I'm not doing a very good job of explaining what I mean, I guess.

The Copenhagen interpretation, as least in its origins, could be seen as a statement about the limits of our knowledge. It didn't pretend QM described an objective reality, what was really, physically happening at a quantum level. It was simply a formal mathematical model for making scientific predictions. It was probabilistic not because we simply didn't have a good enough model, but because the nature of the things being studied required it. A deterministic model simply wasn't possible.

Hidden variables posits that not only doesn't our model capture it, but that it is in fact possible to do so. It states that it's possible to describe the objective reality. And, in essence, QM is just incomplete.

That's where people run into so many problems. All the "weird" things about QM people can't get their heads around. Quantum entanglement, uncertainty, whatever. The average person thinks that's what's actually happening on a physical level. But QM really says no such thing. It's a model.

I guess modern interpretations of the Copenhagen interpretation have gone farther than originally, and posit the electron, for example, really doesn't exist prior to wavefunction collapse. I suppose I'm more in line with the original intent: it might actually exist before observation, but it doesn't make sense to talk about it because it's impossible to describe it.

It's not an intrinsic property of the actual, physical photon. It's an intrinsic property of the QM model. The mathematics. At least according to the Copenhagen interpretation. An electron, for example, actually does have an exact momentum and position at any given time. We just can't know what they are based on the math.

I'm not aware of an interpretation of the Copenhagen Interpretation that supports this. As I understand the Copenhagen Interpretation (but I might be wrong... I don't think about it all that much, as I find it easier to just consider the wavefunction physical), before a measurement is made, the (let's say) electron is not "hiding in the wavefunction somewhere"; in fact, its location (or any other observable) is simply undefined - it's not even "nowhere"... it just does not make any sense to even ask the question. I believe that Bohr himself considered any attempt to visualize a particle as nonsensical and preferred to call particles something like "quantum entities". If you are suggesting that the CI does not consider the wavefunction physical, then I do agree with that, but I do not believe it is correct to say that the particle (or any observable) is actually somewhere at all times and we just don't know where for sure.

I also agree with Penn.Taylor above that what you are describing is actually a hidden variable interpretation.

I think of the Heisenberg Uncertainty Principle as more the idea that trying to measure something means interacting with it, which causes issues. When dealing with things so small and "delicate", in his time and in the past, with "thick fingers", we could only measure one thing without affecting the other properties. More "delicate fingers" that don't interfere with the other properties can measure or ascertain more information.

It is something to understand, not a law... but a principle, whose limits in quantum-level dynamics might be pushed down to another level - whatever that might be - where we don't have such delicate ways to measure.

It's really much simpler than even that. If we consider the basic pair of measurements as position & velocity (simplified from momentum), then even without QM, a bit of common sense shows up the uncertainty: the smaller the chunk of space we try to measure, the less accurate our velocity measurement becomes (velocity being position change over time); and the more accurate our velocity at a point in time becomes, the smaller the distance must become as well, and thus the less accurate. Hey presto!

This was even the case in classical mechanics initially, given the limitations of measuring apparatus, though it becomes a major headache at the quantum level, where we hit limits trying to measure elementary waves/particles with similarly sized/energetic particles. One book I read long ago described it best: "you can bounce cannonballs off a chair in the hopes of learning something about the chair, but it's not likely to be very accurate".

So anyway, the uncertainty principle isn't really related to how well we observe; it is a limitation that is there in the first place. Really, this article itself is misnamed. It isn't demolishing the uncertainty principle; it is using the uncertainty principle, or rather the model on which it is based, to find better ways to observe certain things.

Well put, mrjk! Even if we could measure down to any level as small as we wanted, without affecting the observed medium/particles/waves, I wonder what we would see once we went below the level of quarks & vibrating strings, or even past/through M-branes to the inside or outside of the fabric of our Universe & into... nilspace? Does anyone know if space-time itself has been effectively shown to be quantized at this point - with some branch of string or M-theory, I would imagine...? My brain hurts. :-/

"This is also referred to as quantum back-action: the thing you are measuring pushes back on the measuring system, which generates uncertainty in some other property."

Heisenberg's microscope thought-experiment bites again. Or maybe it's Bohr's later (unsatisfactory) explanations in the Como lectures. It doesn't matter. Even the oldest coherent interpretation of quantum mechanics, the Copenhagen interpretation, rejected this explanation of the physics of uncertainty decades ago. Unfortunately, that hasn't seemed to stop its being trotted out in nearly every article in the recent glut of articles about Heisenberg's uncertainty principle.

This isn't a bad read for the layman (taking any Wiki article with a grain of salt):

I wouldn't say that the more recent supersets of uncertainty invalidate Heisenberg, any more than Einstein invalidated Newton, though to a lesser degree! Entanglement is still so much pie-in-the-sky, but recent experiments do more to clarify the HUP than disprove it in any way. Just my 5c worth & take on it.

I understand likelihood and such. But I see likelihood has little to do with where it is or its properties. I reread the article link and I still don't see an intrinsic uncertainty in the properties of the photon.

In a state, it is so much momentum/energy at a place, with such a form. Usually we observe things through their interactions with other things. I see it as: as we move down to quantum sizes and energies, interactions change them. Maybe I'm not seeing something, but I still don't see how this principle is intrinsic to the properties of what is being measured, rather than only to the measurements.

It's not an intrinsic property of the actual, physical photon. It's an intrinsic property of the QM model. The mathematics. At least according to the Copenhagen interpretation. An electron, for example, actually does have an exact momentum and position at any given time. We just can't know what they are based on the math.

This should come as no surprise, considering we've known for 100 years that particles have wave/particle duality & are in fact probability clouds rather than anything discrete. How do you measure *anything* definite about something like that? Goes back to that old postulate about there being only one electron in the entire universe, just winking in & out of existence all over the place.

Several people said this doesn't prove Heisenberg's ideas wrong. It was never meant to. Maybe the title of the article was misleading, but if you read it, there's not a word in there saying quantum uncertainty is false or misrepresented in the standard model.

Chris Lee / Chris writes for Ars Technica's science section. A physicist by day and science writer by night, he specializes in quantum physics and optics. He lives and works in Eindhoven, the Netherlands.