on the so-called “loophole free” Bell entanglement test and other hot developments in QM theory/ applications

⭐ 😮 😀 😎 hi all, it's a very lively time for physics news and breakthroughs, and it's hard to keep up. it's been over half a year since the last physics post on this blog and im swimming, almost drowning, in topics/ links.

the very big news is a so-called “loophole free” Bell entanglement test, freshly published in Nature and covered in the NYT.[a][a5][a9] and of course lauded almost effusively by the experts/ authorities.[a4] 🙄

they say that they have closed the detector efficiency loophole aka the “fair sampling” assumption.

but wikipedia (currently) says under “fair sampling assumption”:

The “detection efficiency”, or “fair sampling” problem is the most prevalent loophole in optical experiments.
…
There is no way to test experimentally whether a given experiment does fair sampling, as the number of emitted but undetected pairs is by definition unknown.[citation needed]

huh! there seems to be a disconnect. there is both some blurring between the “detector efficiency” loophole and the “fair sampling” assumption, and wikipedia asserts the fair sampling assumption can never be tested experimentally. but the Nature article just asserted they closed it! is anyone else experiencing some cognitive dissonance here? (so the big question is, who are ya gonna believe, Nature or Wikipedia? and of course wikipedia alone nearly contradicts itself, or at best can't make up its mind!)
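to make the fair sampling issue concrete, here's a toy local hidden variable model (my own illustrative sketch, in the spirit of known detection-loophole models such as Pearle's; emphatically not a description of the Nature experiment) where a purely local detection rule biases the coincidence sample enough to fake a CHSH violation:

```python
import math, random

random.seed(1)  # reproducible run

def local_trial(a, b, lam):
    """one particle pair; each side uses ONLY its own setting and shared lam."""
    alice = 1 if math.cos(lam - a) >= 0 else -1
    # bob's detector fires with probability |cos(lam - b)| -- a local rule,
    # but one that biases which pairs make it into the coincidence counts
    if random.random() < abs(math.cos(lam - b)):
        bob = 1 if math.cos(lam - b) >= 0 else -1
    else:
        bob = None  # undetected; pair silently dropped from the sample
    return alice, bob

def corr(a, b, n=200_000):
    """correlation computed over DETECTED pairs only ("fair sampling")."""
    tot = cnt = 0
    for _ in range(n):
        lam = random.uniform(0, 2 * math.pi)
        alice, bob = local_trial(a, b, lam)
        if bob is not None:
            tot += alice * bob
            cnt += 1
    return tot / cnt

# CHSH settings
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4
S = corr(a1, b1) - corr(a1, b2) + corr(a2, b1) + corr(a2, b2)
print(S)  # ≈ 2.83 on the post-selected sample, despite fully local rules
```

the point: every rule above is local (each side sees only its own setting and the shared λ), yet the detected subsample gives S ≈ 2.83 > 2, because bob's detector fires only ~64% (2/π) of the time and *which* events it drops depends on λ. that is exactly why the fair sampling assumption matters, and why the new experiments push detector efficiency so high.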

ive been trying to understand the bell test now for close to 2 decades. it is indeed one of the most subtle areas of physics and quantum mechanics.

now, for the contrarian position. again maybe approaching the unofficial motto of this blog:

the devil is in the details! 😈

my feeling is that a new theory is right on the threshold of being uncovered. the difficult aspect is that it meshes almost perfectly with quantum mechanics. in other words, Einstein was right! he gave up on trying to prove any contradictions in quantum mechanics and finally fell back on the position that it's incomplete.

this is a very subtle difference that eludes even many professional physicists. it is not the same as saying QM is wrong. at heart it's saying, simply, there's something more to it. and somewhat paradoxically, physics experiments designed around an understanding of QM will never find this incompleteness. therefore in some sense, just as testing the sampling assumption is impossible as asserted by wikipedia, it is impossible to “prove einstein wrong” as popsci accounts of the experiment in the media are claiming! aka “you can't get there from here.”

which reminds me of a semifamous quote by einstein:

No problem can be solved from the same level of consciousness that created it.

bell's test is indeed a zen-like “what is the sound of one hand clapping” type of experiment… but here's a few of my own (rhetorical!) zen-like questions for QM and bell-oriented experiments!

why is it so hard to determine, experimentally and theoretically, whether QM is local? why has this debate persisted so long, now roughly a century after the theory's inception?
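as a tiny illustration of what the bell/ CHSH test actually pins down (standard textbook material, nothing specific to the new experiment): brute-force enumeration shows no deterministic local strategy can beat S = 2, while the quantum singlet prediction reaches 2√2:

```python
import itertools, math

# a deterministic local strategy: alice fixes outputs (A1, A2) for her two
# settings and bob fixes (B1, B2), each +/-1, with no communication
best = 0
for A1, A2, B1, B2 in itertools.product([-1, 1], repeat=4):
    S = A1*B1 + A1*B2 + A2*B1 - A2*B2      # the CHSH combination
    best = max(best, abs(S))
print(best)        # 2 -- the local ("classical") CHSH bound

# quantum singlet prediction E(a, b) = -cos(a - b), at the optimal angles
E = lambda a, b: -math.cos(a - b)
a1, a2, b1, b2 = 0.0, math.pi/2, math.pi/4, -math.pi/4
S_qm = abs(E(a1, b1) + E(a1, b2) + E(a2, b1) - E(a2, b2))
print(S_qm)        # 2*sqrt(2), about 2.828 -- beyond any local strategy
```

(mixing strategies with shared randomness doesn't help the local side, since the bound holds for every deterministic quadruple and averages can't exceed it. all the experimental subtlety is in ruling out the loopholes around this clean little inequality.)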

why are there so many “loopholes” in bell tests as cited on wikipedia? [a11]

my feeling is that a new theory is right on the horizon and that glimpses/ bits and pieces of it are lying around as we speak. this is a very difficult position to take in light of the massive intellectual edifice(s) at stake (verging on dogma!).

however there is some support. there are nearly weekly announcements of major shifts in QM experimentation techniques. these roughly fall under 4 basic headings/ themes:

weak measurement

“steering” (the “wavefunction collapse”)

assertions/ “proofs” of the reality of the wavefunction

“retrodiction” of measurements

it's very subtle how all these interplay; they are all interconnected and crosscutting.[b] internationally there are a few laboratories specializing in each.
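for a taste of the weak measurement theme, here's the standard weak-value formula A_w = ⟨φ|A|ψ⟩/⟨φ|ψ⟩ in a few lines of python (the particular states are my own arbitrary choices for illustration). with nearly orthogonal pre/ post-selection the weak value lands far outside the observable's eigenvalue spectrum, which is part of why these experiments feel so strange:

```python
import math

def weak_value(phi, A, psi):
    """weak value A_w = <phi|A|psi> / <phi|psi> for 2-d state vectors."""
    Apsi = [sum(A[i][j] * psi[j] for j in range(2)) for i in range(2)]
    num = sum(phi[i].conjugate() * Apsi[i] for i in range(2))
    den = sum(phi[i].conjugate() * psi[i] for i in range(2))
    return num / den

sz = [[1, 0], [0, -1]]          # pauli sigma-z, eigenvalues +1 and -1

# pre-select |psi> at angle pi/4 (the |+> state), post-select |phi>
# almost orthogonal to it (off by theta = 0.1 rad)
theta = 0.1
a_psi = math.pi / 4
a_phi = a_psi + math.pi / 2 - theta
psi = [math.cos(a_psi), math.sin(a_psi)]
phi = [math.cos(a_phi), math.sin(a_phi)]

print(weak_value(phi, sz, psi))   # about -9.97, way outside [-1, +1]
```

(analytically this comes out to -cos(θ)/sin(θ), so it diverges as the post-selection approaches orthogonality. individual weak measurements barely disturb the state, which is what lets experimenters do things like “steer” or retrodict it.)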

so far nobody is so bold as to say that the theory needs revising and that “textbooks need to be rewritten”. oh wait! Daniel Sank, a Google/ Martinis lab researcher who works on quantum qubits and occasionally hangs out in the physics chatroom on stackexchange, has said nearly exactly that in the chat room (eg about weak measurements and steering).[abc] which is exactly the sort of thing kuhn described as a paradigm shift. Sank is a coauthor on an amazing paper mixing bell tests and weak measurement.[b4] … oh, and by the way, it demonstrates/ measures an interplay between weak measurements and a bell nonlocality metric exactly as one would wonder/ guess about a new theory…! 😮 ❗

➡ ⭐ 💡 ❓ ❗ my prediction: the 4 “new features” of QM above, now being observed and increasingly manipulated (along with others closely intertwined/ related), will eventually show that QM is in fact NOT nonlocal, and that the extreme subtleties of the formalism coupled with experiment have created that conceptual illusion in the mathematics/ ideology. a purely local version of QM will be formulated, and it will mesh with the existing formalism in a nearly magical/ miraculous-looking way. of all current theories it will most closely resemble Bohmian mechanics, but at heart it will be an entirely new theory of solitons. it will take the Einstein of the 21st century to work out the extremely tricky/ intricate nuances/ details and “connect the nearly-invisible dots”, and some of it is already lying scattered around in current papers!

⭐ ⭐ ⭐

whew! bell tests are an entire universe of their own; an entire textbook could easily be written on them, and quite a few have been. (my all-time favorite is still The Meaning of Quantum Theory by Baggott.)

the next cool/ amazing items on the list of physics developments are QM computing advances. there is just so much going on in this area, and it seems maybe 2015 is the year it all hits critical mass. the bits and pieces of devices are now being demonstrated in dramatic announcements worldwide, eg control of individual qubits, error correction, silicon-based qubits, photonics etc.[d]

big news on the NSA worrying about quantum computing's threat to cryptography, and continued angst over meshing QM and gravity.[c]

section [e] is a list of recent experiments and theoretical ideas about the universe being a “hologram” or “simulation”. this ties in a lot with the theme of digital physics explored on this blog, an idea going back decades but slowly coming to much more fruition.

[f] is a big deal to me personally: advances in the soliton theory of reality, after introducing it early on in this blog. very notably there was a review in Nature, essentially a breakthrough.[f13] skepticism of string theory, the only major contender for a “theory of everything” in physics, in a way seems to help along openness to considering other theories. so do “officially acknowledged” anomalies and “reevaluations” in the massive edifice of mainstream physics.[h] early pioneer (…“with all the arrows in his back”…) Ross Anderson's fiery advocacy/ defense of soliton theory appeared and was inspiring.[f22] and John Bush, the MIT applied mathematician, is in my opinion a spectacularly brilliant/ leading visionary in this area, and i assure you, i do not say that lightly around here![f1]

another big news item/ massive milestone in the area: D-Wave announced their new 1-K-qubit chip.[g] but so far, unlike a few years ago with the ½-K-qubit chip, news of its analysis/ performance has been very scarce/ quiet.