Temporal order and identity both get confused in the quantum world.

One of the stranger features of the quantum world is that light—even individual photons—can behave as a wave or a particle, depending on how you measure it. But, according to papers released today in Science, the quantum weirdness doesn't end there. Researchers have now found a way to put a photon in a quantum superposition where it is both a wave and a particle at the same time. Stranger still, one setup allows them to determine the photon's nature as a wave or particle after it has gone through an apparatus where it must act as one or the other.

Got that? Didn't think so, so let's go through it in more detail.

The two experiments use a similar design. Polarized photons are sent one at a time into a device, where they first encounter a beamsplitter, which has a 50/50 chance of sending them down one of two paths. On one of the paths, they will encounter a device that rotates the polarization a bit. Mirrors then send the photons toward an intersection flanked by two detectors.

If the intersection is empty, the photons will act as particles, meaning that they'll travel down one of the two paths, and the two detectors will click with equal frequencies. If, however, a second beamsplitter is present, the two photons can recombine and interfere with each other. In this case, the wave-like nature of the single photon takes over. It travels down both paths, interferes with itself, and, because of this interference, can only strike one of the two detectors. The presence of the second beamsplitter determines whether a photon will act as a wave or a particle.
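
The setup described above is essentially a Mach-Zehnder interferometer, and both behaviors fall out of a few lines of amplitude arithmetic. This is a minimal sketch, not the actual apparatus from the papers: it models each 50/50 beamsplitter as the standard unitary that puts a factor of i on the reflected amplitude, and applies the Born rule at the detectors.

```python
from cmath import exp
from math import sqrt

def beamsplitter(a0, a1):
    """50/50 beamsplitter: standard unitary, factor of 1j on reflection."""
    s = 1 / sqrt(2)
    return s * (a0 + 1j * a1), s * (1j * a0 + a1)

def detector_probs(second_bs_present, phase=0.0):
    """Single-photon click probabilities at the two detectors."""
    a0, a1 = 1 + 0j, 0 + 0j            # photon enters along path 0
    a0, a1 = beamsplitter(a0, a1)      # first beamsplitter: superposition of paths
    a1 *= exp(1j * phase)              # relative phase picked up along path 1
    if second_bs_present:
        a0, a1 = beamsplitter(a0, a1)  # paths recombine and interfere
    return abs(a0) ** 2, abs(a1) ** 2  # Born rule

print(detector_probs(False))  # ≈ (0.5, 0.5): particle-like, both detectors click
print(detector_probs(True))   # ≈ (0.0, 1.0): interference steers every photon to one detector
```

Removing the second beamsplitter turns the photon's which-path superposition into a plain 50/50 coin flip at the detectors; putting it back makes the two path amplitudes cancel at one output and add at the other.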

One objection to this is that there could be some mechanism by which a photon could detect the sort of device it's about to enter and behave accordingly (a possibility that could occur through some hidden variables). The obvious way to deal with these objections is to decide whether or not to put the second beamsplitter in place only after the photon has passed through the first one, and must already be acting as either a wave or particle. But photons move very quickly, so switching a device in time would be exceedingly challenging.

This being quantum mechanics, it turned out to be easier to violate causality instead.

Using two entangled photons, it's possible for measurements of one photon to tell us about the fate of the second. If we send one photon through the device and then measure the second, it's possible to determine whether the first one should encounter the beamsplitter or not. Through the use of a delayed choice experiment, it's even possible to perform this measurement after the first photon has been through the entire apparatus and hit the detectors. In other words, the results of the measurement of photon two dictate what photon one must have already done.

So that's precisely what the authors of one paper did. By entangling the polarization of two photons and sending one through the device, then measuring the second, they could determine whether the beamsplitter was present after the first photon encountered it. Through a test of Bell's inequalities, which account for the presence of hidden variables, they were able to determine that the system was behaving in a truly quantum manner: the photons were both a wave and a particle at the same time.
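
The Bell-test logic can be illustrated numerically. The sketch below uses the textbook quantum prediction E(a,b) = −cos(a−b) for a spin-singlet pair (the polarization-entanglement version differs by a factor of two in the angles but yields the same numbers) and shows the CHSH combination reaching 2√2, above the bound of 2 that any local hidden-variable model must obey.

```python
from math import cos, pi, sqrt

def correlation(a, b):
    """Quantum prediction for the correlation of a spin-singlet pair
    measured at analyzer angles a and b."""
    return -cos(a - b)

# Standard CHSH angle choices
a1, a2 = 0.0, pi / 2
b1, b2 = pi / 4, 3 * pi / 4

S = abs(correlation(a1, b1) - correlation(a1, b2)
        + correlation(a2, b1) + correlation(a2, b2))

print(S)      # ≈ 2.828, i.e. 2*sqrt(2)
print(S > 2)  # True: beyond what any local hidden-variable model can produce
```

Measuring S significantly above 2, as the experimenters did, is what rules out the "photon secretly decided in advance" loophole.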

The second paper did something substantially similar, but also did something a bit more complex with the second photon. Through a careful manipulation of its polarization, researchers were able to set the first photon (the one that went through the apparatus) into a superposition of wave and particle states, meaning it was an indeterminate mixture of the two. By changing how they handled the second photon, they could also control the probabilities of this superposition, making it more wave or particle-like on demand. As with the other experiment, they did this all after the first photon had been through the device and been measured by the two detectors.
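
One way to picture the tunable wave/particle superposition is through fringe visibility at the detectors. The toy model below is deliberately oversimplified (it assigns weight cos²α to a fringing "wave" term and the remainder to a flat "particle" term, and ignores cross terms); it is meant only to show how turning the knob α washes out the interference on demand.

```python
from math import cos, pi

def detector_prob(phi, alpha):
    """Toy model: weight cos(alpha)^2 on a fringing 'wave' term and the
    remainder on a flat 'particle' term; cross terms are ignored."""
    wave = cos(phi / 2) ** 2      # interference fringe vs. relative phase phi
    particle = 0.5                # no interference: flat 50/50
    w = cos(alpha) ** 2
    return w * wave + (1 - w) * particle

# Fringe visibility (max - min)/(max + min) fades as alpha tunes the photon
# from fully wave-like (alpha = 0) to fully particle-like (alpha = pi/2):
for alpha in (0.0, pi / 6, pi / 3, pi / 2):
    pmax = detector_prob(0.0, alpha)   # bright fringe
    pmin = detector_prob(pi, alpha)    # dark fringe
    print(round((pmax - pmin) / (pmax + pmin), 3))  # 1.0, 0.75, 0.25, 0.0
```

In the real experiment the knob is how the second photon's polarization is measured; here it's just a parameter, but the qualitative picture — visibility sliding continuously between 1 and 0 — is the same.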

Each of these facts—being a wave and a particle at the same time, a measurement influencing the behavior of events that have already taken place, and a physical beamsplitter being in a quantum superposition of present and not present—is pretty mind-bending on its own. Putting them all together in a single experiment is especially so.

But an accompanying perspective suggests that the quantum weirdness isn't done yet. If we can get a quantum memory that is stable for extended periods of time, it should be possible to hold off measuring the polarization state of photon two for seconds, perhaps even minutes. In that case, the delay portion of these delayed choice experiments would actually be perceptible to the people doing the experiment.

Promoted Comments

I wish I had more IQ to comprehend that. I love physics but at this level, I can't seem to wrap my head around it.

Sometimes I think we have got it all wrong. They keep making up more and more complex theories, just to fit the observations. What if some key misunderstandings with basic physics and mathematics have sent us down a path of complex yet wrong theories? Maybe we should set up an isolated control group and check if they come up with the same theories.

It's extremely unlikely that a bunch of completely wrong physics and math would end up so precisely describing empirical reality. It's extremely unlikely that empirical reality would match all the unintuitive and uncomfortable implications of theory if that theory was completely up the wrong alley.

And they aren't coming up with more complex theories to fit the observations.

They're coming up with more complex observations to show that the theory's uncomfortable, unintuitive implications are in fact borne out by empirical reality. That all the bizarre things the theory predicts can actually be experimentally verified. That attempts to make the bizarreness go away with things like Hidden Variables don't work.

So it's fine to speculate that QM the theory is completely wrong. The problem is, the bizarre behavior that QM predicts is verified empirical reality, and that reality isn't going to go away. So what theory are you going to propose that is 1) friendly to our aesthetic sensibilities and 2) in line with experiment?

I wish I had more IQ to comprehend that. I love physics but at this level, I can't seem to wrap my head around it.

Sometimes I think we have got it all wrong. They keep making up more and more complex theories, just to fit the observations. What if some key misunderstandings with basic physics and mathematics have sent us down a path of complex yet wrong theories? Maybe we should set up an isolated control group and check if they come up with the same theories.

Physicists don't consider these kinds of explanations 'the physics'. The arguments are not framed as discussions like the ones we're having here in the comments section at ars.

The arguments are equations. Lots of them. And data. And how closely the equations, when given good inputs, match the data.

What we are doing here in the forums is attempting to explain those equations in ways that humans can understand "intuitively". We are destined to fail, because we didn't evolve senses that utilize such deep layers of our reality.

If I throw a baseball to you, you can catch it without much thought. Thus, I can describe the physics of Newtonian motion to you in a way that you can understand intuitively - by invoking the motion of the baseball.

Neither you, nor I, nor anyone else has ever been forced to action because a photon was or was not entangled. We do not have inborn electromagnetic field generators with which to manipulate our surroundings at the microscopic scale. True, visceral understanding of WHAT is actually happening at the quantum level is literally impossible for a naturally born human.

But we have the math to explain a lot of it. And we know the math is right because it models reality. All this work going on is to add more equations to the arsenal, or to provide more data points with which to verify the existing math. Not to generate more accurate mental models. As awesome as that would be.

Sometimes I think they just make this quantum physics stuff up as they go along. I can barely grasp the distilled and dummified description of the papers in this article. How many people on the planet really understand the papers themselves?

Sometimes I think they just make this quantum physics stuff up as they go along. I can barely grasp the distilled and dummified description of the papers in this article. How many people on the planet really understand the papers themselves?

The beauty of it is, no one has to. These scientists have been entangled with other scientists that are in cryogenic sleep. When they awake in 100 years, all of this stuff is explained to them, making the present-day scientists mysteriously perform all the right tests to move our understanding forward.

The really neat part is, the science 100 years from now is all based on the work being done today.

The wave-particle duality is not a real duality in nature; it's a duality of knowledge. Photons do not "know" what kind of thing they are going to enter, nor do they "react" to measurements the way most armchair physicists believe they do.

Sir James Jeans put it best in his book "Physics and Philosophy" 70 years ago: "Whatever a particle may be in itself, we can never experience it as a point .... The waves in this region depict our knowledge and its imperfections exactly and precisely."

The book is worth the read, especially for anyone taking the mental *constructs* of waves, particles, and so on more literally than they are intended to be taken. They are mental constructs imperfectly representing reality, not an absolute description of reality itself.

It's disheartening that so many people these days, including notable tenured physicists, have forgotten that the mathematical and mental models of Dirac, Einstein, Heisenberg, and all the others were just that -- mental models, not complete (or even partial) descriptions of the true nature of reality. They seem to have stopped trying to come up with better models and are, instead, trying to find new ways to fit observed behavior into the existing models.

I'm off to grab a couple beers and try reading this again. To quote the great Cliff Clavin:

Cliff wrote:

“Well, you see, Norm, it’s like this. A herd of buffalo can only move as fast as the slowest buffalo. And when the herd is hunted, it’s the slowest and weakest ones at the back that are killed first. This natural selection is good for the herd as a whole, because the general speed and health of the whole group keeps improving by the regular killing of the weakest members.

In much the same way, the human brain can only operate as fast as the slowest brain cells. Now, as we know, excessive intake of alcohol kills brain cells. But naturally, it attacks the slowest and weakest brain cells first. In this way, regular consumption of beer eliminates the weaker brain cells, making the brain a faster and more efficient machine.”

This from a guy who doesn't understand an iota of QM: the quantum physics might be made up, but the methodical measurements of initial conditions and results of the real-life experiments are not. If any other, simpler theory were able to model these real-life measurements as well as QM does, I would think QM would have gone the way of the Ether...

The causality problems go away if we assume a many-worlds interpretation. If photon spins are correlated in all "universes", then our measurement of the spin of one of them does not instantly convey information to the other photon faster than the speed of light. Rather, measurements tell the measurer which subset of possible universes she lives in. We deduce limitations on what happened before or elsewhere, but the act of measurement did not cause any change (beyond any physical disturbance caused by the measurement).

I haven't read Sir James Jeans' book, but bsdasym's summary makes it sound like he subscribed to some form of hidden-variable theory (the object has additional properties we can't measure), and Bell's Inequality and the EPR paradox blew those ideas up.

The only explanation I can think of is that all quantum information occupies all of space-time. Would also explain the instantaneous changes over miles between entangled photons. Dark energy, anyone? Maybe Plato was on to something with his Forms.

The causality problems go away if we assume a many-worlds interpretation.

...or if we assume that the detectors are, themselves, entangled with the photon. Perhaps I'm missing something that makes that impossible, but while we're used to assuming a detector is simply in one state or another, it seems quite reasonable to assume the detector is also entangled... so the "causality" violation isn't one: the detectors don't decide whether they've seen the first photon or not until after you observe the second photon. The same issue is present in, I believe, all delayed choice experiments, and it's also why I suspect extending the process beyond a few billionths of a second is probably impossible (the entanglement would collapse).

As I always say whenever talking about QM, though, I could be wrong. That's the fun thing: we really have no damned idea what the hell is actually happening, we just have some educated guesses and observations of the end results, which is woefully inadequate (so far) for QM.

I'm not a quantum physicist (wish I at least had more opportunity to study it, as it's monumentally interesting to me). However, what this suggests to me is that perhaps the particle/wave duality that we use to describe photons is starting to become insufficient.

I'm wondering if we don't have to assume a photon can somehow know and decide whether to be a particle or a wave and behave accordingly, and also don't have to assume that we have to break causality.

Perhaps we have yet to understand the photon sufficiently to make sense of this. Perhaps the particle/wave duality was sufficient so far, but only as an approximation. And perhaps it's the approximation that leads to questions such as whether we are breaking causality.

Of course, I could be wholly misunderstanding something. A surface reading doesn't do these kinds of experiments justice.

Sometimes I think they just make this quantum physics stuff up as they go along. I can barely grasp the distilled and dummified description of the papers in this article. How many people on the planet really understand the papers themselves?

There's a saying in physics:

'If you think you understand Quantum Mechanics - you don't'.

It is that counter-intuitive. Wonderful. Amazing. A pinnacle of human achievement, yet utterly opaque.

I cannot pretend to understand anything except very basic Quantum Mechanics and it blows my mind each and every day that I try to read more into the subject. I believe it is the highest achievement of humankind.

The causality problems go away if we assume a many-worlds interpretation. If photon spins are correlated in all "universes", then our measurement of the spin of one of them does not instantly convey information to the other photon faster than the speed of light. Rather, measurements tell the measurer which subset of possible universes she lives in. We deduce limitations on what happened before or elsewhere, but the act of measurement did not cause any change (beyond any physical disturbance caused by the measurement).

I haven't read Sir James Jeans' book, but bsdasym's summary makes it sound like he subscribed to some form of hidden-variable theory (the object has additional properties we can't measure), and Bell's Inequality and the EPR paradox blew those ideas up.

Oh no, not this crap again. Let's swap the non-existent "causality problems" in QM for a thermodynamics-screwing infinity of infinities. Great! There aren't any causality problems in QM. Experiment and theory keep showing that you can't use these effects for FTL communication. You just have to accept that electrons and photons, etc. are not little balls floating around in space; they are some kind of standing-wave-like object that, while it may have high intensity here or there, actually just tapers off into the distance. Once you accept that, non-locality, entanglement, wave-particle duality, etc. really are not that strange. BTW, the act of measurement makes your detector (whatever that may be) part of the system: it's not the measurement that changes the outcome but the mere interaction with the system needed to measure it in the first place. QM is a description of a system.


I wish I had more IQ to comprehend that. I love physics but at this level, I can't seem to wrap my head around it.

Sometimes I think we have got it all wrong. They keep making up more and more complex theories, just to fit the observations. What if some key misunderstandings with basic physics and mathematics have sent us down a path of complex yet wrong theories? Maybe we should set up an isolated control group and check if they come up with the same theories.

Given that we don't really yet know how or why this stuff works as it does, but we do know that it does...

It's becoming painfully obvious that we don't live in a universe that has simplistic Newtonian style causality. Step 1 does not "cause" step 2, which does not "cause" step 3, in a tidy time-forwards linear fashion. The stuff of our universe simply does not work that way. We need to get over our macroscopic perspective misconceptions about cause and effect, time, space, etc.. The universe is not a strictly deterministic, reducible mechanism. Matter simply doesn't work that way, as every next experiment clearly demonstrates. Causality and time are probabilistic and blurred at best, and any illusion to the contrary is an emergent macroscopic generalization of convenience.

Yes, space is an entangled-superposition. The only question that remains unanswered is what the heck is mass.

That's easy! If you mean "mass" as in the common everyday sense of how much inertia or weight an object has, then mass is just energy. Not a kind of energy, but all the energy of the thing being weighed or pushed about. Systems with more energy in them weigh more than ones with less.

If you mean "mass" as in the "intrinsic mass", the mass that certain particles appear to have even when not moving, then it's the potential energy with respect to the Higgs field. It's a kind of energy, therefore a kind of mass. But it contributes only a tiny amount of the total energy (mass) of things around you -- only ~1% of a proton's mass comes from the intrinsic mass of its quarks. The rest is the binding energy of the strong nuclear force between those quarks.
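
A quick back-of-the-envelope check of that last claim, using rough current-quark masses (illustrative, PDG-scale values assumed here, not precise figures):

```python
# Illustrative current-quark masses in MeV/c^2 (rough values for the estimate)
m_up, m_down = 2.2, 4.7
m_proton = 938.3

intrinsic = 2 * m_up + m_down      # proton = up + up + down
fraction = intrinsic / m_proton
print(f"{fraction:.1%}")           # -> 1.0%; the other ~99% is binding energy
```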

Sometimes I think we have got it all wrong. They keep making up more and more complex theories, just to fit the observations. What if some key misunderstandings with basic physics and mathematics have sent us down a path of complex yet wrong theories?

This is quantum theory: If 99.999% of the interference pattern is smaller than the effective aperture of the measuring device, the entire universe is full of nothing but particles.

A virtual gold star to the first person who can say what is wrong with this science.

Yes, space is an entangled-superposition. The only question that remains unanswered is what the heck is mass.

That's easy! If you mean "mass" as in the common everyday sense of how much inertia or weight an object has, then mass is just energy. Not a kind of energy, but all the energy of the thing being weighed or pushed about. Systems with more energy in them weigh more than ones with less.

If you mean "mass" as in the "intrinsic mass", the mass that certain particles appear to have even when not moving, then it's the potential energy with respect to the Higgs field. It's a kind of energy, therefore a kind of mass. But it contributes only a tiny amount of the total energy (mass) of things around you -- only ~1% of a proton's mass comes from the intrinsic mass of its quarks. The rest is the binding energy of the strong nuclear force between those quarks.

E² = (mc²)² + (pc)²

where E is the energy, m is the mass, c is the speed of light, and p is the momentum.

But p = mv, so:

E² = (mc²)² + (mvc)² = m²c²(c² + v²)

Provided, of course, that intrinsic mass is the same as the momentum mass.
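
A note on that last caveat: p = mv is the non-relativistic momentum, so the expression above is only approximate. Using the relativistic momentum instead, the energy-momentum relation closes exactly:

```latex
E^2 = (mc^2)^2 + (\gamma m v c)^2
    = \gamma^2 m^2 c^4 \left(\frac{1}{\gamma^2} + \frac{v^2}{c^2}\right)
    = \gamma^2 m^2 c^4,
\qquad\text{so}\qquad
E = \gamma m c^2,
\qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}} .
```

Here m is the intrinsic (rest) mass throughout; no separate velocity-dependent "momentum mass" is needed.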

Polarized photons are sent one at a time into a device, where they first encounter a beamsplitter, which has a 50/50 chance of sending them down one of two paths... Mirrors then send the photons toward an intersection flanked by two detectors.

So there is one photon, and it has a 50% chance of being polarized, then it goes to an intersection.

Quote:

If the intersection is empty, the photons will act as particles, meaning that they'll travel down one of the two paths, and the two detectors will click with equal frequencies.

OK, if intersection 1 is empty, the particle is detected and the wavefunction decoheres to a particle.

Quote:

If, however, a second beamsplitter is present, the two photons can recombine and interfere with each other. In this case, the wave-like nature of the single photon takes over.

Now suddenly there are two photons? Is this like the double slit experiment where photons build up an interference pattern one by one if they are not detected earlier? There should still just be one photon.

Quote:

It travels down both paths, interferes with itself, and, because of this interference, can only strike one of the two detectors. The presence of the second beamsplitter determines whether a photon will act as a wave or a particle.

Now it seems it is back to one. Seems like the second beamsplitter is allowing the interference pattern to build up somehow.

Quote:

... This being quantum mechanics, it turned out to be easier to violate causality instead.

I think this is a rash thing to say.

Quote:

Through the use of a delayed choice experiment, it's even possible to perform this measurement after the first photon has been through the entire apparatus and hit the detectors. In other words, the results of the measurement of photon two dictate what photon one must have already done.

Why is the word "must" there? If it is retrocausality, you are changing the past from the future. The past was one way, now it is another way after you did X. If you say "what it must have done", it sounds like you are discovering information, not dictating the past.

Quote:

So that's precisely what the authors of one paper did. By entangling the polarization of two photons and sending one through the device, then measuring the second, they could determine whether the beamsplitter was present after the first photon encountered it.

Does that mean they could find out whether the beamsplitter is present, or does it mean that they caused the beamsplitter to either be present or not after the photon had been detected?

Quote:

By changing how they handled the second photon, they could also control the probabilities of this superposition, making it more wave or particle-like on demand. As with the other experiment, they did this all after the first photon had been through the device and been measured by the two detectors.

How did they measure the probabilities of the superposition of the first particle if it had already hit the detector? It shouldn't still be in superposition if it has been detected. Does this mean they read a result off a detector, changed the superposition of the second particle, and then went back and the detector gave a different reading? Clearly not. But what then?

When a photon encounters the two slits, it splits into two entangled, superposed photons. Because they are entangled, they interfere with each other. When one is measured, the superposition collapses, allowing it to be measured as a particle. (And the other disappears completely.)

Given that we don't really yet know how or why this stuff works as it does, but we do know that it does...

It's becoming painfully obvious that we don't live in a universe that has simplistic Newtonian style causality. Step 1 does not "cause" step 2, which does not "cause" step 3, in a tidy time-forwards linear fashion. The stuff of our universe simply does not work that way. We need to get over our macroscopic perspective misconceptions about cause and effect, time, space, etc.. The universe is not a strictly deterministic, reducible mechanism. Matter simply doesn't work that way, as every next experiment clearly demonstrates. Causality and time are probabilistic and blurred at best, and any illusion to the contrary is an emergent macroscopic generalization of convenience.

I think we have a long and deeply counterintuitive road ahead of us to overcome the mechanistic perspectives we get from living at our scale and truly understand reality.

No. No no no. Seriously, if we throw out causality unconditionally, literally every single thing we know about the universe, including the entirety of quantum mechanics (which, yes, relies on causality for even the most basic claims) is completely destroyed and invalid. Now, things can certainly appear to violate causality, but that means, with near 100% certainty, we don't understand what is actually happening yet (which is the point I already made above).

It's not just that our understanding of the universe would be destroyed if it was truly non-causal; it's that the universe would be inherently, and at every level, unintelligible. And I am not prepared to accept that (and I can show it isn't true, simply because we can understand the universe to a certain degree). That means it is causal.

It is possible, on some levels, for it to be, or at least appear to be, non-causal. If, for example, the causal relationship we observe is the result of a probabilistic outcome of non-causal effects at a sub-atomic level, that could work (or in other words, the macroscopic universe would be causal, but the nano-scale wouldn't be entirely so), maybe. It would be similar to how, at our level, space appears perfectly continuous, yet could (in some theories) be discrete. It would be continuous for most practical purposes, but the continuity would be the result of an infinitesimal discreteness. Similar, not identical, but you get the idea.

Quantum mechanics, so far, appears very much like this. But that could just be an appearance; we don't know quite yet.

Sometimes I think they just make this quantum physics stuff up as they go along. I can barely grasp the distilled and dummified description of the papers in this article. How many people on the planet really understand the papers themselves?

The beauty of it is, no one has to. These scientists have been entangled with other scientists that are in cryogenic sleep. When they awake in 100 years, all of this stuff is explained to them, making the present-day scientists mysteriously perform all the right tests to move our understanding forward.

The really neat part is, the science 100 years from now is all based on the work being done today.

No, I think you meant that the science today is all based on the work being done 100 years from now. ;~)

By changing how they handled the second photon, they could also control the probabilities of this superposition, making it more wave or particle-like on demand. As with the other experiment, they did this all after the first photon had been through the device and been measured by the two detectors.

How did they measure the probabilities of the superposition of the first particle if it had already hit the detector? It shouldn't still be in superposition if it has been detected. Does this mean they read a result off a detector, changed the superposition of the second particle, and then went back and the detector gave a different reading? Clearly not. But what then?

My understanding was the manipulation of photon 2 is pre-determined by the experimenter, and the measurements of photon 1 always magically match what was planned for photon 2, even though photon 1 is measured before photon 2 is actually manipulated.

One way this makes sense (relatively speaking) is if wavefunctions have a timelike component, at least in the dimensions involved with entangling. Imagine a complex series of waves on a guitar string that superimpose to create an almost-standing wave near one end that rides down the string at some consistent velocity based on the string's tuning. That would be the timelike wavefunction, which maybe gets messed up relativity-style when the whole system is moving near c. In such a system, interacting with (or literally "fiddling with") the end of the string would cause changes in the almost-standing wave at the other end. Such a manipulation would be felt by any other strings entangled with this one.

Opens a can of worms for causality, but seems as good an explanation for delayed choice experiments as any. Extrapolating, the use of quantum memory to stretch the delay into human-scale time makes the string billions of times longer and therefore much more difficult to manipulate meaningfully.

(Note: using a string as an analogy, not specifically hitching my wagon to string theory.)


This article reads like our current theories are somehow wrong. They aren't; these experiments further confirm basic QM and further disprove our intuitive understanding of the world. There's nothing new here, just the experimental setup making the point clearer.

My understanding was the manipulation of photon 2 is pre-determined by the experimenter, and the measurements of photon 1 always magically match what was planned for photon 2, even though photon 1 is measured before photon 2 is actually manipulated.

You mean they just think about it in their heads and it happens? Sounds doubtful.


Update to prior post: Fuck (excuse the language; working within the constraints of the language of Melville and Milton)... with all these posts and postulations coming forth, it appears the beer is not working.