This probably would've slipped by me if not for this comment, but it's a really nice paper, and I'm happy to give it a shot. There's also a commentary by Luis Orozco that you won't be able to read without a subscription because Nature are bastards that way.

The basic idea of this paper is that they prepare a quantum system in a superposition of several different states, and then use an ingenious measurement technique to make repeated measurements of the state of the system. This lets them follow the state as it moves from a quantum superposition of several different states to a more classical state where it has one and only one value. As they say in the abstract, the experiment "illustrates all the postulates of quantum measurement (state collapse, statistical results, and repeatability)," making it a really impressive piece of work.

So, how do they do this?

The basic scheme is very similar to a paper I blogged back in March, where the same group looks at the spontaneous appearance and disappearance of thermal photons. Their quantum system is a superconducting cavity-- basically, two extremely good mirrors facing one another-- in which small numbers of microwave photons can be trapped for long periods of time-- nearly a second. In the previous experiment, they set the system up so there were no photons in the cavity, while here, they load it with an average of 3.8 photons, and watch what happens to the number of photons in the cavity.

Now, that may sound like an obvious thing to do, but if you think about it a bit, you'll see that it's actually very difficult. Photons aren't like billiard balls that you can look at from a distance, and count without perturbing them. If you directly detect a photon, you destroy it-- it gives up its energy to your detector, causing the canonical "click." So while you can count the number of photons in the cavity directly, doing so immediately destroys the state, and you need to start over.

This is the ingenious part of their experiment: they have a way to detect the presence of photons in the cavity without destroying the photons. They do this by passing the atoms from an atomic clock through the center of the cavity. The photons in the cavity aren't at the right frequency to be absorbed by the atoms, but they do shift the energy levels of the atoms by a tiny amount, causing the clock to run a tiny bit faster.

If you think of the atoms like little analog clocks, imagine that an atom passing through an empty cavity emerges with its hands showing exactly 12:00 noon. If there's one photon in the cavity, though, the clock "ticks" a little faster, and the atom comes out showing 12:08. Two photons gets 12:15, three photons 12:23, and so on. In principle, this lets you distinguish any number of photons in the cavity, by looking at the time on the emerging clocks. In practice, it's a little more complicated-- atoms don't have hour hands, so there's no way to distinguish between 12:00 noon and 1 pm, but the important thing is that they have a way to distinguish between states of up to seven photons.
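To put toy numbers on the clock analogy (these are illustrative, not the paper's actual calibration): with a phase shift of one-eighth of a turn per photon, photon numbers 0 through 7 give eight distinct "clock readings," while 8 photons wraps around to look exactly like 0.

```python
import math

SHIFT_PER_PHOTON = 2 * math.pi / 8  # illustrative: lets you distinguish 0-7 photons

def clock_phase(n_photons):
    """Phase accumulated by an atom crossing a cavity with n photons,
    wrapped to [0, 2*pi) -- the 'time showing on the clock'."""
    return (n_photons * SHIFT_PER_PHOTON) % (2 * math.pi)

# 0 through 7 photons give eight distinct clock readings...
phases = [clock_phase(n) for n in range(8)]
assert len(set(round(p, 9) for p in phases)) == 8

# ...but 8 photons wraps all the way around and looks just like 0.
assert math.isclose(clock_phase(8), clock_phase(0))
```

The wrap-around is the "no hour hand" problem from the analogy: the measurement only determines the photon number modulo 8.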

What they do, then, is to load the cavity with a burst of light that leaves an average of 3.8 photons in the cavity. Of course, photons are discrete objects, so you can't really have four-fifths of a photon; what you've really got is a distribution over a bunch of different numbers, ranging from 0 to 8 or so. There's about a 16% chance of finding three photons, a 10% chance of finding five, a 5% chance of finding seven, and so on.
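Those numbers are characteristic of a coherent state, whose photon number follows Poisson statistics: P(n) = e^(-n̄) n̄^n / n!. Here's a quick sketch of the ideal Poisson case with n̄ = 3.8 (the percentages quoted above are the paper's values, which an ideal Poisson distribution only roughly reproduces):

```python
import math

def poisson(n, mean=3.8):
    """Probability of finding exactly n photons in a coherent state
    with the given mean photon number."""
    return math.exp(-mean) * mean**n / math.factorial(n)

probs = [poisson(n) for n in range(20)]
assert abs(sum(probs) - 1.0) < 1e-6   # the probabilities sum to (nearly) 1
assert probs.index(max(probs)) == 3   # three photons is the most likely count
assert sum(probs[8:]) < 0.05          # eight or more photons is rare
```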

They load the cavity up, and then they start sending atoms in. But this is a quantum system, so the initial state of the system isn't exactly three, or five, or seven photons, but a superposition of all the photon numbers from 0 to 7 (and beyond, though the probability of 8 or more is very small) at the same time. There is no definite photon number in the cavity at the start of the experiment, so when they send the first atom in, they get an indeterminate answer-- the most probable number is three, say, but there's a pretty good chance of it being two or four, or even six. This produces the large, spread-out initial distribution seen in the figure at left (cropped from Figure 2b of the Nature paper).

When they send in a second atom, though, they get a bit more information about the state, and the distribution gets a little narrower. A third atom gets still more information, and a fourth, a fifth, and so on. What they see is that, as time goes on, the system evolves from a superposition of lots of different photon numbers into a single definite number-- by the time they've sent 50 atoms through, the state has pretty much converged to a single number, say five photons, as seen in the figure. And they can track the evolution of the state by looking at each of the individual atoms as it comes out.
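One way to picture this narrowing is as a Bayesian update: each atom gives a noisy, binary answer, and the probability distribution over photon numbers gets re-weighted to favor the numbers consistent with that answer. Here's a toy simulation of those statistics. The detection model is invented, not the paper's actual Ramsey scheme, and it cheats by fixing a "true" photon number up front -- which is exactly the "definite number all along" alternative discussed further down -- so it illustrates the narrowing of the distribution, not the underlying quantum mechanics.

```python
import math
import random

random.seed(1)
SHIFT = 2 * math.pi / 8  # assumed phase shift per photon (illustrative)

def p_excited(n, probe_phase):
    """Chance a probe atom comes out 'excited' given n photons and the
    chosen probe phase -- a stand-in for the real Ramsey detection."""
    return 0.5 * (1.0 + math.cos(n * SHIFT - probe_phase))

# Start from the coherent-state (Poisson) prior over 0-7 photons.
mean = 3.8
prior = [math.exp(-mean) * mean**n / math.factorial(n) for n in range(8)]
posterior = [p / sum(prior) for p in prior]

true_n = 5  # the number this particular run "collapses" to
for _ in range(200):  # more atoms than the real experiment needs,
                      # just to make the toy model converge reliably
    phase = random.uniform(0, 2 * math.pi)
    clicked = random.random() < p_excited(true_n, phase)
    # Bayes' rule: re-weight each candidate photon number by how well
    # it predicts this atom's outcome, then renormalize.
    likes = [p_excited(n, phase) if clicked else 1.0 - p_excited(n, phase)
             for n in range(8)]
    posterior = [p * l for p, l in zip(posterior, likes)]
    total = sum(posterior)
    posterior = [p / total for p in posterior]

# The broad initial distribution has piled up on a single photon number.
assert posterior.index(max(posterior)) == true_n
```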

This is exactly the sort of thing that people talking about quantum measurement are always looking at: somehow, the process of measurement takes a quantum superposition of several different states, and causes it to "collapse" into a single value that we measure with our classical apparatus. If you repeat the experiment many times, you get a random result every time, but when you put all your results together, you find that they follow a predictable probability distribution-- in this case, a comb of integer values (0, 1, 2, 3, 4, 5, 6, 7), with a very small background level of runs where an incomplete "collapse" ends up being read as 3.5 photons, as seen in the figure at right (Figure 3 from the Nature paper).

("Collapse" gets scare quotes not just because it's jargon, but because it's a loaded term, implying that the quantum wavefunction actually changes in a discontinuous and irreversible way. In the Many-Worlds Interpretation, the wavefunction continues to evolve in a smooth and mathematically satisfying manner, and for some reason, we only perceive one of the possible outcomes. These views are operationally indistinguishable-- whatever interpretation you favor, at the end of the experiment, you only see one number-- but the terminology may upset some people.)

They can also follow the state after the "collapse" to a single value, and what they see there is pretty cool, as well: the number of photons in the cavity decreases through discrete and random jumps, as individual photons slowly leak out of the cavity, over a few tenths of a second. The transition from, say, five photons to four happens very quickly, in a hundredth of a second or so, and then the cavity will sit at that state for a short time before dropping to three photons, and so on. They tracked 2000 states, and from that can put together a nice description of the photon lifetime, including some odd states where photons hang around for anomalously long times, or where thermal fluctuations cause the number to actually increase for a short time.
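That staircase decay falls out of a very simple model: an n-photon state loses its next photon at a rate of n divided by the cavity damping time, so each step down takes a random exponential wait that gets longer as the photon number shrinks. A quick Monte Carlo sketch, using the 0.13 s damping time quoted in the comments below and ignoring the thermal re-excitation events mentioned above:

```python
import random

random.seed(0)
T_CAVITY = 0.13  # field damping time in seconds, as quoted below

def decay_trajectory(n_start):
    """Simulate the staircase decay of the photon number: an n-photon
    state loses a photon at rate n / T_CAVITY, giving a random
    exponential wait before each downward jump."""
    t, n = 0.0, n_start
    steps = [(t, n)]
    while n > 0:
        t += random.expovariate(n / T_CAVITY)
        n -= 1
        steps.append((t, n))
    return steps

traj = decay_trajectory(5)
assert traj[0] == (0.0, 5) and traj[-1][1] == 0

# Mean time to empty a 5-photon state: T_c * (1/5 + 1/4 + ... + 1/1),
# which comes out to about 0.3 s -- "a few tenths of a second."
mean_empty = T_CAVITY * sum(1.0 / k for k in range(1, 6))
assert 0.25 < mean_empty < 0.35
```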

It's really an outstanding piece of work. The one thing that it would be nice to see that isn't there is a demonstration that this is a real "collapse," and not just a refinement of the measurement-- some experiment to show that the initial state is really a superposition of many different numbers, and not a state with a definite number that just isn't measured very accurately. I'm not quite sure how one would go about that, though.

Even without that, though, this is an extremely cool paper in quantum optics, offering a remarkable way of watching the process of quantum measurement in action.


This does sound very ingenious and I'm sure was quite the technical accomplishment, but what use is it? I don't have any experience with this kind of research, so I'm not sure what can be done with the results.

I have no experience with this sort of research either, but it seems to me that it could be used to better identify HOW a wavefunction collapses. That area seems a little fuzzy research-wise. Having a mathematical model for how the function goes from superposition to a single state would be a real step forward.

Hi Chad, thank you for this clear explanation.
I'm a little confused, however, by your statement in the penultimate paragraph: "some experiment to show that the initial state is really a superposition of many different numbers, and not a state with a definite number that just isn't measured very accurately."

Do you mean that the results of this experiment don't rule out a "deterministic" alternative to quantum mechanics?

This does sound very ingenious and I'm sure was quite the technical accomplishment, but what use is it?

It's not really any practical use, but it's great basic science, providing a new way to look at the process of quantum measurement, which remains incredibly poorly understood. It's not likely to lead to antigravity shoes any time soon, but it could improve our understanding of the most fundamental processes in the universe.

I'm a little confused, however, by your statement in the penultimate paragraph: "some experiment to show that the initial state is really a superposition of many different numbers, and not a state with a definite number that just isn't measured very accurately."

Do you mean that the results of this experiment don't rule out a "deterministic" alternative to quantum mechanics?

Sort of. That sentence-- which was not, I should note, the product of any prolonged period of deep thought on the subject-- was meant to suggest that you could probably construct a model of this system in which the cavity was always in a Fock state of definite number (with the particular number distributed randomly according to some probability distribution), but it takes time to refine the measurement to the point where you can correctly determine which number state it's in. The measurement they do is somewhat contingent on the results of previous measurements, so I think you could try to claim that what they're seeing is really just an artifact of the measurement, and not a real evolution of the state of the system.

I haven't read it carefully enough to see if they've actually covered that loophole (which is in the "the Universe is perverse" category of quantum theories). It may well be that something in the measurement process or the particular evolution that they see rules that possibility out. If not, it'd be a damnably difficult thing to demonstrate, I think.

So please excuse the length of this post, but I've got some questions about all this.

Something that always baffles the heck out of me when I see experiments like this described is the question of what "triggers" the collapse, real or apparent, of the wavefunction. You get these descriptions that imply the collapse happens when the state of the system is "measured", but it seems to be hard to get a definition of what does or doesn't qualify as a "measurement" beyond some hand-waving about the detector being "macroscopic" or "classical" (as if anything in the universe were classical). The clearest answers to this question seem to be the ones that imply that it isn't so much that the detector has some magical "classical" property that triggers collapses as that the detector has a state which is entangled with that of the rest of the universe including the person operating the experiment; and the "collapse" is really just the wavefunction becoming entangled with the detector. But even looking at things this way, since it still seems like, from the observer's perspective, the system behaves differently before and after that entangling happens, it seems like we ought to be able to measure some solid limits on exactly how much interference can be done before that special entangling moment happens, and exactly what it is about that interference that triggers the change.

(I'm not sure any of the above makes sense, I just wanted to establish exactly how confused I am before I ask my question. Here's the question:)

What's confusing me here is, when they "gradually" collapse the waveform, what is it they're doing exactly that's provoking this gradual collapse? Is there some kind of interaction (for example, between electric fields?) happening between the atoms and the photons that triggers the gradual collapse? If so, does this interaction mean that the atoms and the photons are entangled afterward? And if so, then why doesn't this entangle the photons "all the way" with the rest of the universe, so to speak, once the atoms hit the detector?

And what about during this gradual "collapse", what state is the wavefunction left in? What meaning, if any, does it have for the wavefunction to be "partly" collapsed? My understanding is that the wavefunction is basically a probability distribution-- is the idea that with each bit of information we get we're transforming this probability distribution so that the range of possibilities is more constrained, or something? Is this gradual "collapse" in some way fundamentally different from the all-at-once "measurement"s that collapse things to a single value in other experiments, or is the change in the wavefunction just happening quicker? And how would we interpret this "partial collapse" in the many-worlds interpretation? The set of universes we as observers could potentially be in being gradually constrained? That sounds kinda weird.

If I look at the article abstract, it seems to be saying that the "state" being measured in the experiment is actually composed of a whole bunch of "elementary states", and the thing that's happening gradually when the waveform gradually "collapses" is that some elementary states are being measured and others are being left ambiguous. That could make a lot of sense, I think, but what "elementary states" could possibly together make up photon number? That sounds pretty elementary to me.

--- --- --- ---

Sort of. That sentence-- which was not, I should note, the product of any prolonged period of deep thought on the subject-- was meant to suggest that you could probably construct a model of this system in which the cavity was always in a Fock state of definite number (with the particular number distributed randomly according to some probability distribution), but it takes time to refine the measurement to the point where you can correctly determine which number state it's in.

Just to be clear, is what you're describing here the interpretation of the experiment that, for example, the Bohmian interpretation of quantum mechanics would require us to accept?

It's not really any practical use, but it's great basic science, providing a new way to look at the process of quantum measurement, which remains incredibly poorly understood. It's not likely to lead to antigravity shoes any time soon, but it could improve our understanding of the most fundamental processes in the universe.

So let's say that we're fine with the shoes we've got, we aren't too concerned about "applied" science, and we consider "practical" to include anything that furthers our understanding of fundamental processes. If we look at it that way, is this experiment practical? Is it likely to lead to further experiments or research that lets us measure things we couldn't before or in ways we couldn't before? What's the next step after this experiment, basically?

But even looking at things this way, since it still seems like, from the observer's perspective, the system behaves differently before and after that entangling happens, it seems like we ought to be able to measure some solid limits on exactly how much interference can be done before that special entangling moment happens, and exactly what it is about that interference that triggers the change.

The problem is distinguishing practical limits (due to experimental noise) from fundamental limits. As the systems become more complex, it gets harder to limit the coupling to "environmental" degrees of freedom enough to see interference phenomena. Even if it is theoretically possible to diffract a baseball off a picket fence, the practical difficulties in actually observing a diffraction pattern are insurmountable.

Thank you very much, Chad! Excellent. I thought you would be a good person to ask, as you had commented on their previous paper.

I am not a physicist (at least I don't consider myself one), even though I work in a lab that belongs to a physics department. I had a little bit of physics education, but my knowledge of quantum mechanics is limited and rusty. But I have an amateurish interest in these things.

I think it's a very cool paper, because they are observing gradual "collapse" rather than one measurement that destroys the state. But the question is, as you also asked, whether this should be considered a real "collapse" or a refinement of the measurement.

If the initial state was in a coherent state, can we at least assume that the initial state was indeed a superposition of different numbers of photons? (But I don't know how to ensure that.)

I thought that the tricky part (if I understood it correctly) is that you cannot actually read that the atomic clock is showing 12:08, 12:15, etc. You can detect the atom in one of two possible states and you can only calculate the probability of which time it was pointing. And as you measure more and more atoms, you become more and more confident about the time on the atomic clock, which in turn indicates the number of photons in the cavity.

If you try to interpret this experiment by the Many-Worlds interpretation, did the branching out of the universe happen when the first atom passed the cavity? In other words, was the number of photons fixed when the first atom passed but hidden from the experimenter until more information was provided by measuring more atoms? Or, did the universe gradually "decohere" into many worlds as more and more atoms passed?


What I would like to see next is the same experiment repeated with a varied delay before the first measurement, and a varied rate of measurement. Presumably you could separate out the rate of decoherence due to measurement from the rate of "spontaneous" decoherence due to environmental interaction. Also by comparing more or less leaky traps you could test the ability to control or engineer the rate of environmental interaction. This could also be tested by running the experiment with the apparatus at different temperatures. I am sure these things will all eventually be tested.

Looking at their data, they say they try to measure about 14,000 times per second. They actually succeed in measuring about 4500 times per second. Their two graphs seem to show the seven-number photon cat decays to a two-number photon cat in about 3 to 5 measurements and then to a single number photon non-cat in about 20 to 40 measurements. The typical lifetime of a four-photon Fock state before a single photon escapes or is absorbed (from a four photon state) is stated to be of order .03 second or of order 100 measurements. However the graphs are chosen to show atypically longer lifetimes which seem to cluster around .1 second. The damping time is stated to be .13 second and my interpretation of their formula also gives the escape time for a single photon to be of order .1 second or of order 1,000 measurement times. They also claim to detect the jump from one Fock state to another in .01 second or less, which is around 100 measurement times.
My conclusion: their data seem to show an initial cat lifetime of order ten measurements or around one millisecond. But they do not claim this or talk about it in this cat language. Instead they say (my paraphrase) that they have observed the progressive collapse to a number state under the influence of measurement. Is this a fair paraphrase?

Actually I read it again. They do not call it a cat, but "an ambiguity between two competing Fock states". It lives for twenty or thirty measurements (.005-.007 seconds), but is dead by fifty measurements (.012 seconds). Another way to say it is the "collapse" appears to take about .006 seconds. Of course, the key question is whether this duration is the duration of the actual collapse or the uncertainty in the time measurement. I think they intend to claim it is the duration of the collapse. They talk of "progressive field-state collapse" and "observing the field-state collapse". Am I wrong?

They also claim to show that the measurement of a quantum jump in this system can be completed in less than ten milliseconds or less than 140 measurements. Their one blown-up example seems to be one of the longer ones, i.e. slower jumps. It seems to show that the jump itself takes almost that long. (The other, not-blown-up ones seem faster and shorter.) So my question is, what is the duration of the quantum jump, and what is the uncertainty in the measurement? They do not claim to have shown a finite duration for the quantum jump, but their graph appears to show that. In fact what it appears to show is a two-Fock-number state cat with a lifetime of about .01 second. So maybe the escaping photon was half in and half out of the cavity for .01 second?

Torbjorn Larsson: I guess my question is, how fast can we make the progressiveness? Instantaneous (ideally, that is)?

It all depends on the strength of the interaction between the atoms and the photons. The stronger the interaction, the faster the collapse. They actually work pretty hard to make it as slow as it is.

Coin: What's confusing me here is, when they "gradually" collapse the waveform, what is it they're doing exactly that's provoking this gradual collapse? Is there some kind of interaction (for example, between electric fields?) happening between the atoms and the photons that triggers the gradual collapse? If so, does this interaction mean that the atoms and the photons are entangled afterward? And if so, then why doesn't this entangle the photons "all the way" with the rest of the universe, so to speak, once the atoms hit the detector?

There is a fairly weak interaction between the atoms and the photons in the cavity. It's not a resonant interaction, so the atoms don't absorb any of the photons, but the interaction does lead to a shift in the energy levels of the atoms, and it's that interaction that ultimately causes the collapse that they see.

People do sometimes talk about quantum measurement using the language of entanglement. The idea is that your apparatus becomes entangled with the state of the system, but that you don't actually measure all the entangled degrees of freedom (that is, you don't keep track of the exact state of every atom making up your apparatus, which would preserve the entanglement, but instead measure some aggregate property of the whole apparatus). This process causes a rapid decay of the quantum coherence terms that characterize a superposition state, leaving you with a classical mixture (that is, an atom that is definitely in either A or B with some probability of each, rather than an atom that is partly in A and partly in B at the same time). I think you could probably use a similar approach to understanding the current experiment, but I'd have to think about it a bit.
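The "decay of the quantum coherence terms" can be made concrete with a two-state density matrix: decoherence eats the off-diagonal terms (the "superposition-ness") while leaving the diagonal probabilities alone. This is a schematic sketch of that idea, with a made-up coherence time, not the actual cavity-QED master equation:

```python
import math

def decohere(rho, t, t_coherence=1.0):
    """Toy decoherence of a two-state density matrix [[p_a, c], [c, p_b]]:
    the populations (diagonal) stay put, while the coherence (off-diagonal)
    decays exponentially -- superposition fading into a classical mixture."""
    damp = math.exp(-t / t_coherence)
    return [[rho[0][0], rho[0][1] * damp],
            [rho[1][0] * damp, rho[1][1]]]

# An equal superposition (|A> + |B>)/sqrt(2): populations 0.5, coherence 0.5.
rho0 = [[0.5, 0.5], [0.5, 0.5]]
rho = decohere(rho0, t=10.0)
assert rho[0][0] == 0.5 and rho[1][1] == 0.5  # the 50/50 odds survive
assert abs(rho[0][1]) < 1e-4                  # the "A-and-B-at-once" part is gone
```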

HI: I thought that the tricky part (if I understood it correctly) is that you cannot actually read that the atomic clock is showing 12:08, 12:15, etc. You can detect the atom in one of two possible states and you can only calculate the probability of which time it was pointing. And as you measure more and more atoms, you become more and more confident about the time on the atomic clock, which in turn indicates the number of photons in the cavity.

That's why I think you might be able to write this off as a refinement of the measurement, rather than a real state collapse. You need to have some prior information in order to back the photon number out of the Ramsey measurement that they make, and successive measurements make that work better.

I have to admit that I don't entirely understand how they do this-- the details are in one of the references that I didn't track down and read. From some of the things they say, I think they can address this problem through the preparation of the initial state, which involves using randomly generated initial phases for the probe atoms, but I don't understand it in detail.

If you try to interpret this experiment by the Many-Worlds interpretation, did the branching out of the universe happen when the first atom passed the cavity? In other words, was the number of photons fixed when the first atom passed but hidden from the experimenter until more information was provided by measuring more atoms? Or, did the universe gradually "decohere" into many worlds as more and more atoms passed?

Well, I think a strict Many-Worlds view would involve multiple branchings with each detection of an atomic state. At the end of the experiment, you would have a large number of worlds, corresponding to each of the many paths you could have taken to each of the possible photon number states.

So, um, "yes."

Jim Graber: My conclusion: their data seem to show an initial cat lifetime of order ten measurements or around one millisecond. But they do not claim this or talk about it in this cat language. Instead they say (my paraphrase) that they have observed the progressive collapse to a number state under the influence of measurement. Is this a fair paraphrase?

I think that's fairly reasonable.
They don't talk about it as a "cat" state, because that's usually reserved for relatively simple superpositions of only two states. What they really have is a coherent state of the photon field, which is a superposition of many possible states of definite number. In a certain sense, I suppose you could call it a "cat state," albeit a big and messy cat.

The collapse that they measure is the result of those successive interactions between the atoms and the photons. They call this a "non-demolition" measurement, but in some sense, it's really just an extremely slow demolition measurement-- they don't interact strongly enough to cause the absorption of a photon, but they do eventually destroy the initial coherent state.

They also claim to show that the measurement of a quantum jump in this system can be completed in less than ten milliseconds or less than 140 measurements. Their one blown-up example seems to be one of the longer ones, i.e. slower jumps. It seems to show that the jump itself takes almost that long. (The other, not-blown-up ones seem faster and shorter.) So my question is, what is the duration of the quantum jump, and what is the uncertainty in the measurement? They do not claim to have shown a finite duration for the quantum jump, but their graph appears to show that.

I suspect that what you see there is essentially all due to the time resolution of their measurement technique. The quantum jump itself is instantaneous, but it takes time for their measurements to settle in to the new state.

Of course, I say this mostly because they don't make any claim to see a finite time for the quantum jump. If they really could resolve that, and it was real, that would be a very big deal, and they would certainly mention it.

Chad,
I think you are right.
I have studied this paper a bit more and am trying to restate the results in a very basic way, perhaps just one or two steps up from a dog. As I understand it now, the disturbance rate or attempted (weak, almost non-demolition) measurement rate is about 14,000 times per second. The successful weak measurement rate is about 4,500 times per second. They use nature itself to calibrate their experiment and show that they can detect a quantum jump in less than .01 seconds. So this means they have a reliable measurement in .01 seconds, or about 45 weak measurements. Since the measured coexistence of the two different Fock states lasts at least .05 seconds, long enough for five reliable measurements, they are confident that it has been securely demonstrated.

On the other hand, it seems that the transition from an almost totally spread out waveform to one that covers only two Fock states typically also happens in less than .01 second, so even though that is what is expected, perhaps that part is not yet experimentally established. The tricky part to understanding this whole experiment is the transition from the weak measurements to a reliable conclusion.

Photons versus fields. The authors do speak of "repeatedly counting photons in a cavity as marbles in a box", and I definitely had in mind a naïve picture where three or four photons in a cavity are like three or four marbles in a box. If instead you think of the field picture, it is not so simple. But for Schrödinger cats, I always thought that the bigger and more complex, the better.

Thanks for your thoughtful replies, which helped me get a better grasp on the results of this experiment.

Of course, I say this mostly because they don't make any claim to see a finite time for the quantum jump. If they really could resolve that, and it was real, that would be a very big deal, and they would certainly mention it.

I thank Chad for the answers and Jim for another illuminating analysis. AFAIU the collapse is progressive, but the partial jumps are still considered instantaneous (or at least below the resolution of the experiment).


