The Fundamental Physics prize continues to be bad for physics.

Ashutosh (Ash) Jogalekar is a chemist interested in the history and philosophy of science. He considers science to be a seamless and all-encompassing part of the human experience. Follow on Twitter @curiouswavefn.


Particle tracks from a bubble chamber, one of many indications that physics is a tool-driven revolution relying on hard experiment (Image: Naturphilosophie)

The Fundamental Physics prize has again been awarded to sophisticated mathematical speculation disconnected from experimental evidence. The 2012 Fundamental Physics prize was shared among nine physicists, most of whom were string theorists. String theorists continue to dominate the awardees of this year’s New Horizons and Frontiers in Physics prizes. Another string theorist, Alexander Polyakov, received the 2013 prize. The 2014 prize will be announced in December, and the trend so far gives me little reason to be optimistic about who the recipients will be.

The brilliance, mathematical abilities and dedication of these men are not in question, but the real-world validity of their theories is anything but certain. The first award had already sparked criticism, and this year’s award will do little to quell it. While the money is Milner’s and he can spend it the way he wants to, I find it troubling that the prize sends the wrong message to young scientists and to the public. It says that the most financially lucrative scientific prize in the world is being awarded to speculative ideas that may or may not have anything to do with reality. It’s also worth noting that a special prize was awarded in 2012 to Stephen Hawking for his “discovery” of Hawking radiation, a phenomenon for which experimental evidence is at best suggestive.

There is another feature of the prize that troubles me even more: it has been awarded to experimental physicists only once, and even then only as part of a special category. Experiment is the bedrock of science; in Feynman’s words, if a theory does not agree with experiment it is wrong, period. It does not matter how beautiful it sounds, it does not matter how smart the person who came up with it is; all that matters is how it compares to numbers from well-designed experiments. Unfortunately, recognitions like the Fundamental Physics prize seem to cherish mathematical elegance, reams of equations, raw IQ and “star power” much more than hard numerical prediction, let alone hard empirical verification.

Sadly this omission seems consistent with a more general phenomenon that I bemoaned in a previous post – the downplaying of experiment relative to theory in the popular imagination, a flaw in the perception of science that the Milner prize can only accentuate. Ask most laymen to name a famous physicist and their choices will be both obvious and predictable: Einstein, Feynman, Bohr, Heisenberg, perhaps followed by Dirac, Fermi and Bardeen. Rutherford, Faraday and Curie might be the only experimental physicists widely recognized by the public. Better-informed laymen may know of Witten, Hawking and Penrose. But few if any would readily remember Millikan, Michelson or Compton, let alone Aspect, Zeilinger, C. W. F. Everitt, Mather and Smoot, Pound and Rebka, or even the 2012 Nobel Prize-winning duo of Wineland and Haroche.

Yet what these scientists have done qualifies every bit as much as “fundamental science”. Take the work of Aspect, Zeilinger and others who have tested and verified the astonishing phenomenon of quantum entanglement. It’s a prediction of quantum mechanics so bizarre that it led some of the greatest physicists of the century, including Einstein and Schrödinger, to suspend their belief in the theory. Only experiment could have resolved the question one way or another. The other experimentalists cited above also made immense contributions. Everitt led the Gravity Probe B mission, which tested a highly fundamental prediction of general relativity (frame-dragging). Mather and Smoot are leaders in the awe-inspiring art of precision cosmology. Three years ago the Nobel Prize was awarded for a startling, bona fide experimental finding that theory alone could not have settled – the accelerating expansion of the universe. And going back to the turn of the twentieth century, by finding that there was no ether through which light propagated, Michelson paved the way to relativity and a revision of our view of the universe.

All these experimental discoveries are as deeply in the realm of “fundamental physics” as any major theoretical advance, and yet prizes like the Milner prize ignore them. Instead recognition goes to speculation charted out on pen and paper, not the hard fruits of grease, interferometers and charge-coupled devices. The Fundamental Physics prize almost completely ignores the fact that physics is as much a tool-driven revolution as an ideas-driven one, a fact that will be obvious to anyone who studies the discipline’s history.

This is not the way science is supposed to work. Almost a century ago scientists and philosophers formed the Vienna Circle. Their goal was to champion logical positivism, the belief that science can only be defined in terms of things we can directly observe. Its proponents took the philosophy too far, however, placing so great an emphasis on direct observation that they downplayed the somewhat indirect but compelling evidence that was then verifying the spectacular predictions of quantum mechanics. The Vienna Circle may well have been appalled by the highly statistical nature of the proof that turns particles like the Higgs boson from squiggles into reality. And yet the pendulum now seems to have swung to the other extreme in physics. The frontier of the field, at least in the public imagination, now consists mainly of complicated symbols written on paper and tossed around in erudite debates. Whether these symbols have much to do with reality is at best a secondary concern and more typically a mere distraction.

Perhaps the world of physics needs another Ernest Rutherford. When asked what he thought of theorists he famously retorted, “You theorists play games with your symbols, but we are the ones who discover the secrets of the universe”.


Yes, I think Steven Weinberg has echoed similar sentiments. He said that there was nothing unexpected in finding the Higgs boson at the LHC, and that the most important discovery at the LHC would be something completely unexpected – which is the way science ideally progresses. Naturally this would require energies that even the LHC may not be able to achieve. Weinberg thinks there are important experiments to be done at such high energies, but he also thinks it may be impossible to get funding and support for the kind of accelerator that would make them possible. In that sense it’s probably true that experiments are being severely constrained. However, this is true mainly in particle physics; in astrophysics and cosmology there still seems to be ample opportunity.

Ash, I agree with your assessment – elegantly correct mathematics does not necessarily describe physical reality correctly, especially when its objectives are misconceived.
… The standard model of Particle Physics may be perceived by theorists as having shortcomings, but those ‘shortcomings’ may not be justifiable.

For example, one perceived fundamental issue is that it does not include gravitation. However, this perception is based solely on the presumption that it is a particle interaction. GR successfully describes the effects of gravitation as the relationship between localized potential mass-energy and a geometrical aspect of spacetime. As John Wheeler put it in “Geons, Black Holes, and Quantum Foam”, “Spacetime tells matter how to move; matter tells spacetime how to curve.”
… In this sense, gravitation is not a quantum particle-particle interaction directly mediated by a force carrying particle as are the ‘other’ three particle interactions but an interaction between condensed potential mass-energy and some unidentified physical aspect of spacetime successfully described in geometric terms. It is interesting to note that it is largely the accretion of material mass-energy that, by extraction, produces and potentially energizes/distorts the geometry of the surrounding vacuum…
… Conversely, there is no evidence that gravitation even operates at the scales of quantum particles. The idea that gravitation must be a ‘fourth’ particle interaction is most likely a misconception, much like the ‘attractive force’ that seemed necessary to cause bodies to interact in Newton’s description of gravitation. While quantum theorists may like to envision that their theory explains everything, gravitation may be outside the scope of any theory of quantum particle interactions.

Next, there’s the issue that the standard model/Higgs mechanism does not explain why specific particles acquire particular inherent rest masses. Briefly, any explanation for particle mass disposition in relation to decay processes that was consistent with the standard model would require the existence of a mediating boson particle.
… The discovery of the predicted Higgs boson does not confirm the Higgs mechanism – proposed to explain how all particles acquire their inherent rest mass. As I understand it, the mechanism requires that all fundamental particles, including bound components of compound particles such as protons and neutrons, must continuously interact with an ambient Higgs field that permeates all spacetime to reacquire their rest mass.
… However, in that case there should be plenty of Higgs bosons easily detectable throughout spacetime – yet they are absent. Only when bound composite particles are disintegrated in very high energy collisions can an occasional Higgs boson be detected.
… A simpler alternative is that fundamental particles acquire their rest masses only when they are emitted. In this case, stable, persistent particles such as quarks and electrons would have acquired their rest mass only in the exceedingly dense, hot conditions of the early universe. While some massive particles can be emitted in the recent universe, they are the product of very peculiar locally high energy conditions of nuclear processes and particle decay. Most particles emitted in the ambient conditions of the Higgs field today have little if any inherent rest mass…

Lastly, it’s thought that, to be consistent with the still-nascent ‘standard model of cosmology’, the standard model of particle physics should be modified! In my view, it is the cosmological model that should be required to be consistent with the model of particle physics!
… I won’t belabor this point, but I suggest that the inferred requirement for dark matter (or modified gravity) is the result of improperly applying methods of gravitational evaluation – intended for discrete bodies of mass – to large scale, extremely compound massive structures (especially those that most greatly deviate from necessary spherically symmetrical mass distribution), composed of hundreds of billions of locally interacting discrete masses spanning hundreds of thousands of light years. Please see the brief, informal essay http://fqxi.org/data/essay-contest-files/Dwyer_FQXi_2012__Questionin_1.pdf and its more formal references.
… As I understand it, the justification for the existence of exotic dark matter particles is a primary motivation for supersymmetry theory…

In summary, the perceived rationale for necessary extension or replacement of the standard model may all be based on misconceptions. What’s a quantum theorist to do? I think they should look beyond the apparent low hanging fruit – much of the standard model as it stands _is_ arbitrary and demands further definition! There may well be new physics involved in the unexplained aspects of the standard model!

Most will answer Einstein. The others you list will be named by only a minority of laymen. I think the US public imagination has been squashed by the enormous costs of high energy particle physics, and even the relatively modest costs of the space probes necessary to investigate astrophysics and cosmology.

Gravity Probe B is a wonderful example. Most laymen have never heard of it. Proposed to NASA in 1961. Nearly cancelled several times. Launched in 2004. After years of analysis, the final results were announced on 4/4/2011. From the Stanford web site:
“The experimental results are in agreement with Einstein’s theoretical predictions of the geodetic effect (0.28% margin of error) and the frame-dragging effect (19% margin of error).”
Total cost: about $750 million. Many would consider that just too much for taxpayers to support.
Most people are not aware that Einstein’s general theory of relativity needed further confirmation.

The accelerating expansion of the universe is another great example. No one expected it. No theory predicted it. The astronomers involved did not believe their results at first. Public money spent: virtually none. Nobel Prize! Now that is something “Joe Six-Pack” can approve of.

As near as I can make out the search for a unified theory will remain dominated by theory and philosophy. I guess it is not beyond imagination to suppose an experiment might eventually be proposed but I suspect the costs of such an experiment will be enormous.

I think physics could learn from biochemistry, which is almost entirely oriented toward experimental evidence. Even the computational and theoretical papers tend to make experimentally testable predictions and to analyze existing data in new ways. They just don’t make up new life forms out of whole, mathematically beautiful cloth. Also, the experiments all include a lot of positive and negative controls and are on the lookout for experimental artifacts. I assume physics experiments (particle colliders, etc.) do this too, but as a biochemist, I just don’t have that knowledge. Thank you.

You are just biased because you are an experimenter. Theory and experiment are both essential to science. Galileo, a great experimenter, said the book of nature is written in the language of mathematics. BTW, Rutherford also said that without theory, science is just stamp collecting.

IMO the greatest physicists are those who contributed most to applied physics and engineering, regardless of whether they were theorists or experimenters:

If the trouble is theoretical physics’ fame, the imbalance is irreconcilable. Capturing the human imagination will always beat bringing humans back down to earth with empirical evidence. One is attractive; the other is not. It’s like comparing Hollywood to real life. People want to believe in one while they live in the other.

If the trouble is the money associated with the prize, applied physics can be far more lucrative than a one-time $3 million prize. Historical examples are alternating current and the transistor, patents that were both fundamentally important and incredibly profitable. There are plenty of opportunities to make money in experimental physics; one simply must look in the right places.

All considered, I think both theoretical and experimental work are indispensable in physics; the difference is that theoretical work has no immediate tangible value. That’s why prizes like this exist in the first place. It is not bad for physics.