
Friday, June 24, 2016

Where can new physics hide?

[Image caption: Also an acronym for “Not Even Wrong.”]

The year is 2016, and physicists are restless. Four years ago, the LHC confirmed the Higgs boson, the last outstanding prediction of the standard model. The chances were good, so they thought, that the LHC would also discover other new particles – naturalness seemed to demand it. But their hopes were disappointed.

The standard model and general relativity do a great job, but physicists know this can’t be it. Or at least they think they know: The theories are incomplete, not only disagreeable, staring each other in the face without talking, but inadmissibly wrong, giving rise to paradoxes with no known cure. There has to be more to find, somewhere. But where?

The hiding places for novel phenomena are getting smaller. But physicists haven’t yet exhausted their options. Here are the most promising areas where they currently search:

1. Weak Coupling

Particle collisions at high energies, like those reached at the LHC, can produce all existing particles up to the energy of the colliding particles. How many of the new particles are produced, however, depends on the strength of their coupling to the particles that were brought to collision (for the LHC that’s protons, or their constituents, quarks and gluons, respectively). A particle that couples very weakly might be produced so rarely that it could have gone unnoticed so far.
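The scaling behind this is simple: at leading order the production cross section goes with the square of the coupling, so the expected event count drops quadratically as the coupling shrinks. A minimal sketch (all numbers are hypothetical, chosen only for illustration):

```python
def expected_events(sigma_ref_pb, coupling_ratio, lumi_ifb):
    """Expected event count for a hypothetical new particle.

    sigma_ref_pb   -- reference cross section (picobarn) at coupling_ratio = 1
    coupling_ratio -- new coupling relative to the reference coupling
    lumi_ifb       -- integrated luminosity in inverse femtobarn
    """
    sigma_pb = sigma_ref_pb * coupling_ratio**2  # leading-order coupling scaling
    return sigma_pb * lumi_ifb * 1000.0          # 1 pb * 1 fb^-1 = 1000 events

# A particle produced with 1 pb at full-strength coupling in a 30/fb run:
print(expected_events(1.0, 1.0, 30.0))   # 30000.0 events
# The same particle with a coupling 100 times weaker:
print(expected_events(1.0, 0.01, 30.0))  # ~3 events, easily lost in background
```

A coupling ten times weaker thus costs a factor of a hundred in rate, which is why weakly coupled particles can hide even below the collider's energy reach.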

Physicists have proposed many new particles which fall into this category because weakly interacting stuff generally looks a lot like dark matter. Most notably there are the weakly interacting massive particles (WIMPs), sterile neutrinos (that is, neutrinos which don’t couple to the known leptons), and axions (proposed to solve the strong CP problem and also a dark matter candidate).

These particles are being looked for both by direct detection measurements – monitoring large tanks in underground mines for rare interactions – and by looking out for unexplained astrophysical processes that could make for an indirect signal.

2. High Energies

If the particles are not of the weakly interacting type, we would have noticed them already, unless their mass is beyond the energy that we have reached so far with particle colliders. In this category we find all the supersymmetric partner particles, which are much heavier than the standard model particles because supersymmetry is broken. Excitations of particles that exist in models with compactified extra dimensions could also hide at high energies. These excitations are similar to the higher harmonics of a string and show up at discrete energy levels which depend on the size of the extra dimension.
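For one flat extra dimension of size R, the n-th excitation appears at roughly m_n ≈ n·ħc/R, so the level spacing directly encodes the size of the dimension. A back-of-the-envelope sketch (the choice of R here is purely illustrative):

```python
HBAR_C_GEV_M = 1.97327e-16  # hbar * c in GeV * meter

def kk_mass_gev(n, radius_m):
    """Approximate mass of the n-th Kaluza-Klein excitation for one flat
    extra dimension of size radius_m, using m_n ~ n * hbar*c / R."""
    return n * HBAR_C_GEV_M / radius_m

# An extra dimension of 10^-19 m puts the first excitation near 2 TeV,
# with evenly spaced higher levels above it:
for n in (1, 2, 3):
    print(n, round(kk_mass_gev(n, 1e-19)))  # ~1973, 3947, 5920 GeV
```

The even spacing of the levels is the signature: finding one resonance is ambiguous, but a tower of them at integer multiples would point to a compactified dimension.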

Strictly speaking, it isn’t the mass that is relevant to the question of whether a particle can be discovered, but the energy necessary to produce the particles, which includes binding energy. An interaction like the strong nuclear force, for example, displays “confinement,” which means that it takes a lot of energy to tear quarks apart even though their masses are not all that large. Hence, quarks could have constituents – often called “preons” – that have an interaction – dubbed “technicolor” – similar to the strong nuclear force. The most obvious models of technicolor ran into conflict with data decades ago. The idea however isn’t entirely dead, and though the surviving models aren’t presently particularly popular, some variants are still viable.

These phenomena are being looked for at the LHC and also in highly energetic cosmic ray showers.

3. High Precision

High precision tests of standard model processes are complementary to high energy measurements. They can be sensitive to the tiniest effects stemming from virtual particles with energies too high to be produced at colliders, but which still make a contribution at lower energies due to quantum effects. Examples of this are proton decay, neutron-antineutron oscillations, the muon g-2, the neutron electric dipole moment, and kaon oscillations. There are existing experiments for all of these, searching for deviations from the standard model, and the precision of these measurements is constantly increasing.
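To get a feel for why precision is complementary to energy, one can invert the naive scaling of a heavy virtual particle's contribution to the muon's anomalous magnetic moment, Δa ~ (m_μ/Λ)². Deliberately dropping all couplings and loop factors (so this is an order-of-magnitude guide, not a model prediction):

```python
import math

M_MUON_GEV = 0.1057  # muon mass in GeV

def probed_scale_gev(delta_a):
    """Naive new-physics scale probed by a deviation delta_a in the muon g-2,
    inverting delta_a ~ (m_mu / Lambda)^2. Couplings and loop factors are
    deliberately dropped, so this is an order-of-magnitude estimate only."""
    return M_MUON_GEV / math.sqrt(delta_a)

# Sensitivity to a deviation at the few-times-10^-9 level reaches
# scales comparable to what the LHC probes directly:
print(probed_scale_gev(3e-9))  # ~1.9 TeV
```

This is the sense in which a tabletop-scale precision measurement can compete with a collider: improving the precision by a factor of 100 extends the probed scale by a factor of 10.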

A somewhat different high precision test is the search for neutrinoless double-beta decay, which would demonstrate that neutrinos are Majorana particles, an entirely new type of particle. (When it comes to fundamental particles, that is. Majorana particles have recently been produced as emergent excitations in condensed matter systems.)

4. Long ago

In the early universe, matter was much denser and hotter than we can hope to ever achieve in our particle colliders. Hence, signatures left over from this time can deliver a bounty of new insights. The temperature fluctuations in the cosmic microwave background (B-modes and non-Gaussianities) may be able to test scenarios of inflation or its alternatives (like phase transitions from a non-geometric phase), whether our universe had a big bounce instead of a big bang, and – with some optimism – even whether gravity was quantized back then.

5. Far away

Some signatures of new physics appear at long distances rather than short ones. An outstanding question, for example, is what’s the shape of the universe: Is it really infinitely large, or does it close back onto itself? And if it does, then how does it do this? One can study these questions by looking for repeating patterns in the temperature fluctuations of the cosmic microwave background (CMB). If we live in a multiverse, it might occasionally happen that two universes collide, and this too would leave a signal in the CMB.

New insights might also hide in some of the well-known problems with the cosmological concordance model, such as galaxy cusps that are more pronounced than observed, or the prediction of more dwarf galaxies than are actually seen. It is widely believed that these problems are numerical issues or due to a lack of understanding of astrophysical processes, and not pointers to something fundamentally new. But who knows?

Another novel phenomenon that would become noticeable on long distances is a fifth force, which would lead to subtle deviations from general relativity. This might have all kinds of effects, from violations of the equivalence principle to a time-dependence of dark energy. Hence, there are experiments testing the equivalence principle and the constancy of dark energy to ever higher precision.

6. Right here

Not all experiments are huge and expensive. While tabletop discoveries have become increasingly unlikely simply because we’ve pretty much tried all that could be done, there are still areas where small-scale lab experiments reach into unknown territory. This is notably the case in the foundations of quantum mechanics, where nanoscale devices, single-photon sources and detectors, and increasingly sophisticated noise-control techniques have enabled previously impossible experiments. Maybe one day we’ll be able to settle the dispute over the “correct” interpretation of quantum mechanics simply by measuring which one is right.

So, physics isn’t over yet. It has become more difficult to test new fundamental theories, but we are pushing the limits in many currently running experiments. [This post previously appeared on Starts With a Bang.]

Another source of new physics is the problem of neutrino mass. From oscillation data, with two mass-squared differences, the minimal neutrino mass model is the one with zero smallest mass: m1 = 0, m2 = 0.009 eV, m3 = 0.05 eV. Experimental indications in cosmological, astrophysical and terrestrial data on the sum of neutrino masses, Σ ≈ 0.06 eV, may be a test of this model.
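The numbers in the comment above follow directly from the measured mass-squared differences. Taking Δm²₂₁ ≈ 7.5×10⁻⁵ eV² and Δm²₃₁ ≈ 2.5×10⁻³ eV² (approximate 2016 best-fit values, normal ordering) with the smallest mass set to zero, a quick check:

```python
import math

# Approximate oscillation best-fit values (normal ordering), in eV^2:
DM21_SQ = 7.5e-5
DM31_SQ = 2.5e-3

m1 = 0.0                  # smallest mass set to zero in the minimal model
m2 = math.sqrt(DM21_SQ)   # ~0.009 eV
m3 = math.sqrt(DM31_SQ)   # ~0.05 eV
total = m1 + m2 + m3      # ~0.059 eV, close to the quoted sum of 0.06 eV

print(round(m2, 4), round(m3, 3), round(total, 3))
```

Since oscillations fix only the differences of squared masses, the sum of masses from cosmology is exactly the complementary measurement that could confirm or exclude the minimal model.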

I am biased because I like gravity. One set of 'table top' experiments that I find interesting are the 'big G' - Newton's constant experiments. They continue to gain accuracy yet different teams end up with different conflicting results. Perhaps there is something going on. https://physics.aps.org/synopsis-for/10.1103/PhysRevLett.111.101102 for example.

This isn't a post about open problems in physics, but about parameter ranges where new physics can hide. I also don't know what problem you are referring to. The mass term or the masses themselves or the mixing matrix? Best,

A most excellent question... May I postpone the answer to a future post? The reason is that (leaving aside an experiment I proposed myself) I don't presently have a good overview. But it's something I wanted to look into anyway. Best,

I think that the high-powered atom smashing experiments have gone beyond the point of diminishing returns. I am still unsure what they discovered when they identified a spike in their sensor readings at some billions of eV. It was a rather anticlimactic find, and gave no clue as to the makeup of quarks as far as I know. In a sense I think we have got the cart before the horse; I think we have to take a step back and punt to make progress on what God's mind is like. For example, how do we know that gas molecules, molecules widely spaced from each other and free to spin and rotate, need to push off each other in order to gain or lose momentum? There is a theory that gas molecules do not need to push off anything to gain momentum: just add energy and they will turn energy into momentum directly and propagate away faster at the higher energy level. It does not conform to Lagrangian physics, which has Newton's third law as a cornerstone, but no one has ever put the theory to the test either.

But what if there is no new physics up to the Cohen-Kaplan threshold of about 100 TeV? What if the puzzles confronting the Standard Model are there because we do not have the right analytic tools or because we are asking the wrong questions?

I think you missed the most unexpected one, even if it can come from point 6.

Where can new physics hide?

7. Right where the theory states it cannot be, beginning with its own free parameters. Why would you need a new zoo to explain this? It is pretty arrogant to think that the old physics is not "Not Even Wrong" (and of course the opposite as well). If the new physics hides in a form that is not particles, then all I've read from mainstream routes to physics is just to be forgotten. Why should we deny this possibility?

"Have people already proposed experiments which can distinguish between different interpretations of QM?"

Pretty much by definition, if experiments can distinguish between two "approaches", then they are not different interpretations, but different theories.

Of course, this might mean that what has previously been considered an interpretation is actually a different theory. About 15 years ago, Penrose was talking to Zeilinger, but I don't know what came of this.

With regard to hiding places, what are your views on the “transactional interpretation” of quantum mechanics? Feynman said he worked on a similar notion for years, but “I never solved it.” It involves what seems to be a significant paradigmatic shift with its notion of a "sub-empirical" or “pre-spatiotemporal” contextual level of reality. Further, it seems compatible with experiments being done on hydrodynamic quantum analogues if that is at all relevant.

Another place to look for new physics is "theory space", often aided by computers. As we come to believe that we have the right fundamental equations of the universe, tools like lattice QCD and massive N-body simulations of the universe can explore aspects of complex systems in particle physics and cosmology that are otherwise unobservable with the detail necessary to gain real understanding.

For example, no one has ever done a fully relativistic details N-body simulation of a spiral galaxy with a number of bodies even remotely approaching the number of stars in one, because the calculations are too daunting. Instead, they do Newtonian gravitational N-body simulations and then use analytic methods for reassurances that the deviations due to GR shouldn't be too great. But, as numerical simulations become more powerful with more computing power, more realistic models could address seemingly anomalous data like the fraction of LIthium in the universe, and learn that bad models rather than new physics could drive the results (or that results that seemed to fit existing physics don't when more realistic models are considered).

Theory space can also generate explanations for relationships, for example, between constants in the Standard Model, that were previously treated as merely arbitrary, and can determine that theories that were once thought to be competitors are actually equivalent to each other mathematically.

I've worked on a similar notion for years, but never solved it... I'm a big fan of backwards causation, which basically removes the measurement problem. This may or may not be the same as what you are talking about. Unfortunately I can't find anyone to test my idea. (It's fairly model independent, hence not necessary to develop all the theoretical details.) Best,

"Theory space can also generate explanations for relationships, for example, between constants in the Standard Model, that were previously treated as merely arbitrary, and can determine that theories that were once thought to be competitors are actually equivalent to each other mathematically."

Can you link to specific references showing this capability?...thanks!

No, nothing published except for this. I kind of hint at it in the paper, but the most obvious form of superdeterminism is backward causation. It means, basically, that the prepared state knows from the onset that it will be detected (future boundary condition). There's hence no mystery in the measurement process because it doesn't have to happen spontaneously.

It's possible to make up some kind of model for this, though it's somewhat pointless because, by construction, anything that happens before the measurement isn't measured. And I have made up some things, but I'm not really happy with any of them. It's one of these little fun projects I do on the side when nobody is looking, so I can't say progress has been fast. But I hope that sooner or later something will come out of it. Best,

Sabine, very nice post (as usual). We do have one piece of evidence in pure particle physics that there is physics beyond the SM, which is neutrino oscillations. However, despite last year's Nobel prize, it seems to me that see-saw models and others which explain neutrino mass are completely decoupled from other BSM physics theories at TeV scales (unless I am wrong). Anyhow, based on non-zero neutrino mass, any hints on where we might expect new deviations among the possibilities you listed? Shantanu

Ah, yes, the neutrino masses and the endless discussion of whether or not they are part of the standard model...

My guess is 3) high precision and 6) right here. Re 3) There are actually various strange anomalies at high precision that we don't have good explanations for (eg neutron lifetime, proton radius). Re 6) As I've mentioned on various other occasions, I don't think quantization is fundamental, and I think it's possible to measure deviations from it. Best,

I have discovered the nature of dark matter: Uncle Al's comments. Think about it. By now there must be more than enough to account for all the dark matter. They show up everywhere but don't seem to interact with anything else, except by weighing things down. They are both macho and wimp at the same time.

One of the most famous examples is the discovery that different competing versions of string theories could be embedded into M-theory.

Another much earlier one was the discovery that Newtonian gravity implied Kepler's laws and more.

Theory space made it possible to explain all of thermodynamics as statistical mechanics, and made it possible to explain all of the quirks of electromagnetism and optics with QED.

Simulations are a leading method by which QCD is conducted and have been used to rule out myriad dark matter models.

The CKM matrix is a theory that reduces dozens of observations to just four parameters.

Similarly, theory space has made it possible to devise renormalization equations that make it possible, for example, to take measurements of the strength of the strong force at all sorts of momentum scales and reduce all of those measurements to a single fundamental constant (the strong coupling constant at some arbitrary scale, usually a weak-force boson mass).
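The reduction this describes can be made concrete with the one-loop renormalization-group running of the strong coupling: measurements at any scale Q collapse onto one number, α_s at a reference scale (here the Z mass). A sketch, keeping the number of quark flavors fixed at five and ignoring flavor thresholds and higher loops:

```python
import math

ALPHA_S_MZ = 0.118  # strong coupling at the Z mass (approximate)
M_Z_GEV = 91.19
N_FLAVORS = 5

def alpha_s(q_gev):
    """One-loop running of the strong coupling from its value at M_Z.
    Flavor thresholds and higher-loop terms are neglected."""
    b0 = 11.0 - 2.0 * N_FLAVORS / 3.0
    log_term = math.log(q_gev**2 / M_Z_GEV**2)
    return ALPHA_S_MZ / (1.0 + ALPHA_S_MZ * b0 / (4.0 * math.pi) * log_term)

print(round(alpha_s(10.0), 3))    # stronger at low scales: ~0.173
print(round(alpha_s(1000.0), 3))  # asymptotic freedom: ~0.088 at 1 TeV
```

Every measured value of the coupling, at whatever scale, is thus one function evaluation away from the single constant α_s(M_Z).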

Even in mathematics, numerical approximations and studies have been pivotal in hypothesis formation and in identifying conjectures that are worth trying to prove analytically because they are likely to be true.

Clearly, there is some confusion in terminology regarding what "theory space" is.

The notion of "theory space" in a four-dimensional gauge theory was introduced by Arkani-Hamed, Cohen and Georgi in 2001 and refers to graphs containing "sites" (gauge groups) and "links" (matter fields). See http://arxiv.org/pdf/hep-th/0109082v1.pdf

http://www.distance-cities.com/search?from=Tokyo%2C+Japan&to=Los+Angeles%2C+CA%2C+United+States 5,481.18 mi (Measured city coordinates, above, are close enough.)

Euclid cannot accurately draw geoid geodesics. 6.8% divergence is empirically wrong. Almost 50 years of gravitation theory are empirically sterile. You belittle the singular original thought that is outside failed theory, measurable in existing apparatus, and consistent with prior observation. Look. The worst it can do is succeed.

Ah, it grieves me to hear that! My hands would like to live in a universe where the future has some small modulus of plasticity.

I read somewhere (can’t find the quote now) that backward causation might make the universe a little more like an organism and less like a deterministic mechanism. It suggested a kind of recursion wherein the future could affect the past and that in turn would affect the future. It’s hard to think that one through.

Nevertheless, I am curious as to your thoughts on the notion of the physical universe occurring within a "sub-empirical" or “pre-spatiotemporal” context. (John Cramer refers to backward causation as being “atemporal”.)

Riffing on that notion one might float words picturing some sort of topological crucible wherein the flame-like physical world emerges. Perhaps also that every physical entity is entangled with a realm wherein there is no time-like or space-like distinction.

“From a more fundamental perspective, our results imply that there exist experiments whose outcomes are fully unpredictable. The only two assumptions for this conclusion are the existence of events with an arbitrarily small but non-zero amount of randomness and the validity of the no-signaling principle. Dropping the former implies accepting a super-deterministic view where no randomness exists, so that we experience a fully pre-determined reality. This possibility is uninteresting from a scientific perspective, and even uncomfortable from a philosophical one. Dropping the latter, in turn, implies abandoning a local causal structure for events in space-time. However, this is one of the most fundamental notions of special relativity, and without which even the very meaning of randomness or predictability would be unclear, as these concepts implicitly rely on the cause-effect principle.” -- Physical Randomness Extractors: Generating Random Numbers with Minimal Assumptions, Kai-Min Chung, Yaoyun Shi, Xiaodi Wu, March 9, 2015

I am not certain, but with regards to super-determinism it seems the ball is still in play. As to a larger “pre-spatiotemporal” context, I note that the above statement refers to “events in space-time”.

In any case it is a lot to digest and I wonder if you have any thoughts on the matter.

Cramer refers to atemporal, but as far as I know, the atemporal part of the transaction only addresses the state of a particle communicating with itself between two successive events involving it. I do not see anything wrong with SR there; it rather looks like the particle is a two-way phone wire between its interactions (measurements). I also understand that as long as the transaction does not break uncertainty (since it must be expressed as an action anyway, not only as phases), there is no problem with QM.

You may well be right on this. On a naive level you have "two successive events" implying time-like separation and "atemporal" is atemporal. In an interview Cramer described a photon reaching his eye from a distant star as an example of an atemporal connection.

The notion of the universe existing within a proto-physical frame is intriguing. It could not be physically probed, but might be inferentially decoded.

One paragraph that caught my attention in John Cramer's "The Transactional Interpretation of Quantum Mechanics and Quantum Nonlocality" http://arxiv.org/pdf/1503.00039v1.pdf spans the bottom of page 12, and the beginning of page 13. Quoting the last part of the final sentence "...in which the advanced and retarded waves before emission and after absorption largely cancel out, leaving little in the way of residue."

He seems to be saying that perfect cancellation between the advanced and retarded waves is not necessarily always the case. That suggests that detection of the effects, of say, a tiny residue of advanced waves might be possible in the laboratory.

So it hasn't been measured, as you say, by construction. But is it there waiting to be discovered? Is that what your backward causation would settle once and for all? It is there, it exists pre-measurement, and that's alright because an act of measurement in the future brought it into existence in the past, so that the future action doesn't end up with egg on its face after a no-show ("80% of success is showing up" - Woody Allen).

and this takes the spooky shit out of the measurement process?(!)

p.s. I'm 99.894% sure that the possibility of pre-existence - regardless of how it may come about - has now been ruled out in a test that finally eliminates all known loopholes (for a long time the loopholes have been immensely implausible, but yes, possible)

inMatrix - it really doesn't matter what we decide the hidden variable is, but what is the answer to whether it is there before it gets measured. Maybe it's backward causation, or retroactive temporal cud-chewing before the grass was eaten. I don't think it matters. The problem is simply being there.

I don't know what it means for something to be "there" if you can't measure it. Honestly, I'm not actually very interested in all these philosophical quantum debates (is it real or is it not), it just bugs me that nonlocal hidden variables have never been ruled out. I suspect it's because they please no side of the debate: yes, you have determinism back, but no, you still have a quantum state, and yes, it's non-local, but no, it's not Copenhagen either. So nobody is happy and nobody wants to even think about it. Best,

Sure, but the current state of play is that there is no superdeterministic theory, so a number of long-running questions are effectively settled by the fact that there are no more loopholes.

"I don't know what it means for something to be "there" if you can't measure it."

The meaningfulness is the closely intertwined relation between that and things like 'spooky action at a distance', including the matter of whether two entangled particles signal superluminally. After all, if you don't accept that they do this, then you have to believe the correlation was settled previously. This is because, despite what a lot of people are currently saying, 'non-locality' is exchangeable with 'superluminality' (if the correlation of the entangled particles is set by sub-luminal/luminal signalling, then the action is local.)

"Honestly, I'm not actually very interested in all these philosophical quantum debates (is it real or is it not)"

I totally agree, generally speaking. The matter above is scientific not philosophic for reasons given.

"it just bugs me that nonlocal hidden variables have never been ruled out. "

It's because the only use of hidden variables is to complete a sub-luminal/luminal mechanism, which intrinsically rules out non-locality.

The 99.894 figure was obviously tongue in cheek. As to the point of substance, I am willing to provide this, but I will do so the same way that you would for yourself - googling. Have a go, and if it doesn't work out, by all means respond here and I will be your google monkey :o)

@ Piein, 1. "Abstract: Bell's theorem asserts that predictions of standard quantum mechanics cannot be reproduced by other theories based on the concept of 'hidden variables', introduced by A. Einstein. It is argued that contrary to popular opinion, the mentioned theorem scopes only a narrow class of theories with 'hidden variables', but absolutely not the general case. A comparative analysis of the proposed 'hidden time' concept and related concepts such as the 'transactional interpretation of quantum mechanics' by John G. Cramer and the 'theory of elementary waves' by Lewis E. Little is supplied." --- 2. A superdeterministic one, in Russian, concerning the constant 137. Abstract:

Quantum fluctuations are seen as the result of subjective renormalization of objective world dynamics. On this basis, an attempt is made to understand the origin of the fine structure constant.

Hi InMatrix - arguments that there are classes outside Bell's scope that remain untested are basically talking about something else. There is no possibility of this in terms of the question the Bell theorem actually solved. We know this because the question itself is intrinsically bound by logic. There is no conventional 'outside'. Of course there is always the possibility of new theoretical paradigms that sweep the terms and working concepts of the original question that Bell solved off the table. But that's just a truism. For it to happen, QM would have to be encapsulated by a new theory (similarly to the way Newton's was).

the main question in my opinion is not about measurement, but what it would bring that is not about measurement. (The reason is that Cramer states that backward causation would cause an equipment breakdown, which he tries to obtain. It may be that he even measured it, because the percentage of entangled photons in his experiment is utterly low compared to similar setups on the same topic. But it's unprovable - or maybe try with 1000 lasers taken at random and check the backward dependence of the entanglement rate with respect to future usage of the lasers.)

It would bring that the wave function is composite, because there is a priori no reason for the measurement to trigger anything special. The up/down-time stuff needs to be permanent, and then there is no such thing as a particle - more like a thread. The remaining question being: what is the present time?

I understand your question is about the end of my message. So I'll try to explain better.

In Cramer's theory, time is fully symmetrical (like QED). Then you have to apply the symmetry completely. - In classical thinking (and also in Bell's hidden variables), a particle transports energy and information from T1 to T2, with T2 > T1. - Applying full time symmetry, it also transports information and (anti?) energy from T2 to T1, with T2 > T1. => That's a-temporal, but let us push a little more.

=> Time exists as a true dimension, so T1 and T2 exist concurrently and currently, whatever T(now) is. - Then the wave function transports information both ways. - Then a particle is a pair of physical currents going up and also down the time. => That's what I call a thread (split the wave function into up and down parts; maybe just Psi and Psi_star, if I remember Cramer correctly).

=> The measurement problem vanishes (most of QM interpretations become fairy tales because the problem is nonexistent).

Now for instance consider two entangled photons.

P1 goes from A to B1 = (x, y, z, t), P2 from A to B2 = (x', y', z', t'). The entangled photon pair transports information (but only some state synchronization) between B1 and B2 (whatever the dates therein.) Information can be transferred via A; then we do not care which observer is considered as long as B1 and B2 are within the light cone of A. So, there is no problem with Mr Lorentz.

The next problem, as I conclude, is to understand what the present is... because there is nothing special about "now" in this reasoning. But (as per Cramer) if there is an initial event (big bang), then the sequence of events becomes oriented. I think it is workable to assume that the initial event propagates at light speed.

"Clearly, there is some confusion in terminology regarding what "theory space" is. The notion of "theory space" in a four-dimensional gauge theory was introduced by Arkani-Hamed, Cohen and Georgi in 2001 and refers to graphs containing "sites" (gauge groups) and "links" (matter fields). See http://arxiv.org/pdf/hep-th/0109082v1.pdf"

As one can discern from any standard dictionary, it isn't uncommon for a word or phrase to have multiple meanings that are not identical, even in the same general field. For example, even within business law, "convey" can mean either "communicate information" or "transfer property." Many words or phrases have half a dozen senses.

Obviously, I'm not referring to the technical sense of the term that Arkani-Hamed, et al. chose to use in a fifteen-year-old journal article. Instead, I use the term to refer to the abstract concept of the set of all possible well-formulated physical theories that could explain Nature.

A couple of days ago I talked to a physicist regarding the issue of gravitationally coupled electroweak monopoles, see: https://arxiv.org/pdf/1605.08129.pdf

The physicist's answer made me think that the discovery of such a particle would have a big influence on physics.

I am asking myself: How deeply would the detection of such a particle influence quantum gravity? Would it cause a fundamental crisis? Monopoles don't seem to have an important role in quantum gravity, or at least they are normally not mentioned in popular articles.

I think new physics could hide behind a gravitationally coupled electroweak monopole.