Wednesday, March 09, 2016

A new era of science

Here in basic research we all preach the gospel of serendipity. Breakthroughs cannot be planned, insights cannot be forced, geniuses cannot be bred. We tell ourselves – and everybody willing to listen – that predicting the outcome of a research project is more difficult than doing the research in the first place. And half of all discoveries are made while tinkering with something else anyway. Now please join me for the chorus, and let us repeat once again that the World Wide Web was invented at CERN – while studying elementary particles.

But in theoretical physics the age of serendipitous discovery is nearing its end. You don’t tinker with a 27 km collider and don’t coincidentally detect gravitational waves while looking for a better way to toast bread. Modern experiments succeed by careful planning over the course of decades. They rely on collaborations of thousands of people and cost billions of dollars. While we always try to include multipurpose detectors hoping to catch unexpected signals, there is no doubt that our machines are built for very specific purposes.

And the selection is harsh. For every detector that gets funding, three others don’t. For every satellite mission that goes into orbit, five others never get off the ground. Modern physics isn’t about serendipitous discoveries – it’s about risk/benefit analyses and impact assessments. It’s about enhanced design, horizontal integration, and progressive growth strategies. Breakthroughs cannot be planned, but you sure can call in a committee meeting to evaluate their ROI and disruptive potential.

There is no doubt that scientific research takes up resources. It requires both time and money, which is really just a proxy for energy. And as our knowledge increases, new discoveries have become more difficult, requiring us to pool funding and create large international collaborations.

This process is most pronounced in basic research in physics – cosmology and particle physics – because in this area we deal with the smallest and the most distant objects in the universe. Things that are hard to see, basically. But the trend towards Big Science can be witnessed also in other disciplines’ billion-dollar investments like the Human Genome Project, the Human Brain Project, or the National Ecological Observatory Network. “It's analogous to our LHC,” says Ash Ballantyne, a bioclimatologist at the University of Montana in Missoula, who has never heard of physics envy and doesn’t want to be reminded of it either.

These plus-sized projects will keep a whole generation of scientists busy – and the future will bring more of this, not less. The increasing cost of experiments in frontier research has slowly, but inevitably, changed the way we do science. And it is fundamentally redefining the role of theory development. Yes, we are entering a new era of science – whether we like that or not.

Again, this change is most apparent in basic research in physics. The community’s assessment of a theory’s promise must be drawn upon to justify investment in an experimental test of that theory. Hence the increased scrutiny that theory assessment has received recently. In the end it comes down to the question of where we should put our money.

We often act like knowledge discovery is a luxury. We act like it’s something societies can support optionally, to the extent that they feel like funding it. We act like it’s something that will continue, somehow, anyway. The situation, however, is much scarier than that.

At every level of knowledge we have the capability to exploit only a finite amount of resources. To unlock new resources, we have to invest the ones we have to discover new knowledge and develop new technologies. The newly unlocked resources can then be used for further exploration. And so on.

It has worked so far. But at any level in this game, we might fail. We might not succeed in using the resources we have smartly enough to upgrade to the next level. If we don’t invest sufficiently into knowledge discovery, or invest into the wrong things, we might get stuck – and might end up unable to proceed beyond a certain level of technology. Forever.

And so, when I look at the papers on hep-th and gr-qc, I don’t think about the next 3 years or 5 years, as my funding agency wants me to. I think about the next 3000 or 5000 years. Which of these research directions holds the promise of discovering knowledge necessary to get to the next level?
The bigger and more costly experiments become, the larger the responsibility of theorists who claim that testing a theory will uncover worthwhile new insights. Do we live up to this responsibility?

I don’t think we do. Worse, I think we can’t because funding pressures force theoreticians to overemphasize the promise of their own research. The necessity of marketing is now a reality of science. Our assessment of research agendas is inevitably biased and non-objective. For most of the papers I see on hep-th and gr-qc, I think people work on these topics simply because they can. They can get this research published and they can get it funded. It tells you all about academia and very little about the promise of a theory.

While our colleagues in experiment have entered a new era of science, we theorists are still stuck in the 20th century. We still believe our task is to fight for our own ideas, when we should instead be working together on identifying those experiments most likely to advance our societies. We still pretend that science is somehow self-correcting because a failed experiment will force us to discard a hypothesis – and we ignore the troubling fact that there are only so many experiments we can do, ever. We had better place our bets very carefully, because we won’t be able to bet arbitrarily often.

The reality of life is that nothing is infinite. Time, energy, manpower – all of this is limited. The bigger science projects become, the more carefully we have to direct our investments. Yes, it’s a new era of science. Are we ready?

24 comments:

"If we don’t invest sufficiently into knowledge discovery, or invest into the wrong things, we might get stuck – and might end up unable to proceed beyond a certain level of technology. Forever."

I can be more dystopian than that: we may end up killing ourselves off as a species. We may have already passed the tipping point. Some will say science got us into this mess, which is incorrect (it was the use of science by commerce and governments that did), but it doesn't matter. Science can get us out. NEW science in particle physics is more challenging than ever thanks to the unbridled success of the SMPP, but ongoing research in neutrinos and quark-gluon soup and all sorts of material science and photonics trickles us forward. What is wider open than ever before is cosmology. So I see the coming funding wars to be between those two camps. I also think each should be given ten times the funding they are currently getting. Our survival as a species depends on it.

You wrote: The community’s assessment of a theory’s promise must be drawn upon to justify investment in an experimental test of that theory.

In my own way, I understand the underlying question may be: Does the meme reign? Does the matrix have us? (I mean the theoretical physics community.)

This is not a comfortable question; I suggest that the community seems stuck in the belief that the sentence quoted above describes the only way of doing research. Theory -> test -> money???

But when I read Einstein, de Broglie, and even Feynman, I feel that those men were trying to understand "what it is" in order to solve the mysteries of the time. Not only to model it and make predictions. Feynman even spoke about the later works of Einstein being "just mathematics".

Is that what you (or the community) do in your research program? No offense meant of course, and absolutely no obligation to answer (except maybe for yourself, but that's personal).

I think this is a misunderstanding of what I mean by "investment". Investment means the direction of efforts by monetary means, not necessarily making profit with that investment. Basic research is rarely directly profitable. Payoffs are exceedingly indirect and often come only centuries later.

In my community we are all driven by trying to understand the fundamental laws of nature. And we have all read about Einstein, de Broglie, and Feynman too. But there isn't any one right path to understanding. Some put more emphasis on modeling and phenomenology. Some put more emphasis on mathematical consistency. We need both. But really this is not the point of my post. The point is that the more difficult it becomes to test new theories, the more careful we have to be in selecting these theories. And that presently just isn't the case. Best,

Therefore, the answer to the great question, "What is time?" becomes ...

Time = Energy - Money

Conclusion: To increase the needed time to produce advances in basic research we must increase the workload of researchers while decreasing the amount of money spent. Keep 'em hungry, and grateful to have a job! ;-)

Great essay, but I don't know what can be done to avoid this development. We might wreck our economy (short of racial suicide) such that we simply cannot afford to advance experimentation in physics. This is becoming likely as the years pass. As for serendipity, it still happens, but now at the component engineering level rather than at the level of the experiment overall.

It is a nuanced situation: to assess whether a theory or hypothesis is promising enough to fund an experiment, we rely on our current knowledge and paradigm. But progress, as history shows, often comes out of left field.

Dry statistics: "If you want to win a lottery 100% sure, you have to buy all the tickets and check them." True, but unaffordable. ;)

So aside from 'the big experiments', there should be room to fund smaller experiments to test hypotheses that carry a greater risk, or that seem more unlikely according to our current understanding, IF they entail a path as yet not taken. Because that improves the probabilities.

"The US War on Islam burns about $1.5 trillion/year. The National Science Foundation burns less than $8 billion/year. 1% less war, 4 days, triples NSF budget. Idiots."

I second Al's motion. For that price and assuming we had the plutonium we could have sent 2000 New Horizons missions (which are basic research) into the Kuiper belt.

And Al, I think you know this, because I heard you were US Military in Vietnam as a grunt, yes? Bloody stupid war, but war is essentially stupid; all monies for war should be diverted to basic research in science, IMO. And thank you for your service, Al. Salute.

"Therefore, to answer to the great question, 'What is time?' becomes ...

Time = Energy - Money"

Time is money. Knowledge is power. Power is energy/time. Substituting, we have knowledge = energy/money. Solving for money, we get money = energy/knowledge. So, as knowledge goes to zero, money becomes infinite, regardless of the energy expended. Conclusion: doing science is not the best way to get rich.

Steve Colyer said: "US Military in Vietnam as a grunt, yes?" Point of fact, no. I carry my "1A" draft card always, lest I confuse country and government. US in 'Nam was South China Sea oil exploration. They looked too far south.

WWII was Allies' smash and grab of German industrial and science universes, e.g., Ruhrchemie's Otto Roelen/1938 patent. 1940 US was an agricultural backwater. By 1948, the oxo-process (hydroformylation) moved the world. The US created military-industrial Big Science and flourished. Crude science has been displaced by elegant management - more and more of less and less.

It's possible a lot of you folks are too pessimistic or dour. There is so much scientific research going on these days that it is easy to miss the vast amount of low-budget, small-scale research that is giving useful insights. Big-budget science is prominent only in publications and funding grabs. But I think if you examine the statistics, what we find is that more science is being done today on low budgets than was ever done in the past – and on high budgets as well. There's just more science all around. There are people who still sit down and do research with pen & paper (or electronic equivalents). And I am reminded of the early 2nd-generation, circa-2007 CMB probes, which had fairly simple detectors. There are loads more examples of simple desktop science labs doing new things. You just have to look around and see what people are doing below the big-budget radars.

No, not really. It was only this workshop last December that got me thinking about how the role of theory development has changed so dramatically since, say, the 70s. Anthony's idea with the prediction market is not bad - except that I am afraid people won't use it. No matter how I turn things around, I always come back to the same point, that it's an awareness problem in the community. Best,

"An increasing number of physicists, Ellis and Silk observed, have become strongly convinced of the viability of theories that have no empirical confirmation." No empirical falsification! Newtonian cyclotrons work to 100 keV but not at 600 keV. Physics assumed beta decay was S + T, not A + V.

Wow, at 1.5 trillion a year for military expenses in the Middle East, that's three times the estimated cost of a manned Mars mission. It's a shame that politicians prefer the destructiveness of war over the constructiveness, and knowledge increase, that comes with scientific programs.

I was just watching Kennedy's 1962 Rice University speech on YouTube, committing our country to a manned lunar landing. We sure had better leadership in those days.

I am personally inclined to think that serendipity is our best source of progress, so I ask myself what are the best ways to improve the chances of serendipity. Of course I am not an expert and don't know the answer, but I speculate that exploration in general gives the opportunity to see and learn new things. Probably for that very reason, exploration seems to be one of the fundamental drives that evolution has programmed into us. So I would vote for space exploration – a Moon colony, a mission to another solar system, that sort of thing. Incidentally, that might also be our best chance of surviving as a species, as we like to live beyond our means, and space offers more resources now that we are about to exhaust the renewable resources of our planet. (I read that Americans and Europeans use resources at about 1.5 times the rate at which they can be renewed. I for one have never owned a car or a house and am willing to restrict myself further, but how many of us are?)

Most here will probably not agree with my vote, which illustrates the problem with trying to plan. Good plans require expert planners (an expert being one who has made all the mistakes possible in a narrow field so that he or she will not repeat them), and we have no expert planners in serendipity, or not many.

Still, until one begins to try something and begins to make the possible mistakes, one does not get better at it. So, good luck with the planning.

"I was just watching Kennedy's 1962 Rice University speech on YouTube, committing our country to a manned lunar landing. We sure had better leadership in those days." The space race was about demonstrating mastery over ICBM technology. The atomic clocks which begat precision quantum metrology were developed to perfect ICBM telemetry, and the AMO quantum metrology people are now drawing funding for, among other things, promising precision accelerometers allowing submarines to navigate submerged for years at a time. Find a military application for your desired program of physics experiments and you may yet find funding.

"The community’s assessment of a theory’s promise must be drawn upon to justify investment in an experimental test of that theory." Ernest Orlando Lawrence's particle accelerator patent will soon be 100 years old. Interferometry, 150 years old. And then there's the AMO (and AMO-inspired) stuff. And then some neutrino telescopes. Gadgets aren't built to test theories; rather, theories are for the most part built to be tested by gadgets. Those gadgets which may be built will be built, theoretical significance be damned. Resources are finite; blaming mathematical physics (which supports itself largely through teaching) is bizarre. "Mathematics is the part of physics where experiments are cheap."

It's kind of hard to figure out where a technology is going to go in 50 or 100 years. Back in the early 1970s, I was working with a lab that wanted the latest and greatest in tablet technology so we could put together a free hand drawing system. Most of the new startups with new technologies that we interviewed were started by particle physicists who had been working on particle detection. One actually had an ultrasound based drawing tablet that gave me a splitting headache even though no one else could hear a thing. I nearly puked as it affected my inner ear.

Airbags for cars started with nuclear deterrence. ICBMs originally used liquid oxygen based rockets, but liquid oxygen evaporated. America's "Ace in the Hole" might be caught during an oxygen reload, and there goes mutually assured destruction. This led to a lot of work on solid fuel rockets, particularly in scaling, particle size management and so on. By the mid-1970s the idea was to use solid fuel rockets to stop automobiles before they crashed, though this really didn't make crashes safer when the passenger hit the inside of the car, so they went to innovation B, using a solid fuel rocket to fill an airbag in milliseconds.

You can go down a long list of unexpected spin offs. We write blogs using gear originally designed in reaction to the delayed US census of 1880. If I remember correctly, a major Nobel Prize in medicine was based on work funded by Frederick the Great which involved measuring the size of the prostate glands of moles. Computer security is based on the Sieve of Eratosthenes.

I don't think the LHC or the Cassini mission are going to pay off with more efficient car engines or farming colonies on ice giants, but the technologies that were pushed to make them work may turn out to be critical for the future.

I also don't think science is as stymied as one might think. We have entered the era of overgrown datasets. The astronomers, the physicists, the biologists, the geologists and others are now collecting ridiculous quantities of data. The entire LHC project had to grind through bales of bits to find a handful of Higgs particles. What else is in that dataset? Yes, the instrument design put a lot of constraints on what could be detected, but computers keep getting more powerful. It's not like a telescope or microscope where you can just look through it and let your visual system figure things out. You have to use a computer, and computers can only find what they are looking for, sort of. They can also find things you are not looking for, and that is where serendipity comes in.

For example, mathematicians watching a series compute pi noticed a regularity in where the error appeared: it wasn't spread evenly, but concentrated periodically in the approximation. That led to the first expression for the n-th digit of pi that doesn't involve simply computing pi first and then sampling it.
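The digit-extraction result alluded to here is presumably the Bailey–Borwein–Plouffe (BBP) formula, which yields the n-th hexadecimal digit of pi without computing any of the earlier digits. A minimal Python sketch (the function name and tail-length cutoff are my own choices):

```python
def pi_hex_digit(n):
    """Hex digit of pi at position n after the point (0-indexed),
    via the BBP formula: pi = sum_k 16^-k (4/(8k+1) - 2/(8k+4) - 1/(8k+5) - 1/(8k+6)).
    Multiplying by 16^n and keeping only the fractional part isolates digit n."""
    def series(j, n):
        # Fractional part of sum_k 16^(n-k) / (8k+j), using modular
        # exponentiation for the terms with non-negative exponent.
        s = 0.0
        for k in range(n + 1):
            s = (s + pow(16, n - k, 8 * k + j) / (8 * k + j)) % 1.0
        # A few terms of the rapidly vanishing tail (16^(n-k) < 1).
        t = sum(16 ** (n - k) / (8 * k + j) for k in range(n + 1, n + 15))
        return (s + t) % 1.0

    x = (4 * series(1, n) - 2 * series(4, n)
         - series(5, n) - series(6, n)) % 1.0
    return int(x * 16)
```

In hexadecimal, pi = 3.243F6A88..., so `pi_hex_digit(3)` returns 15 (the digit F). Floating-point round-off limits this sketch to positions in the low millions; production implementations carry extra guard digits.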

Good piece of thought. Also consider that a career in physics has a finite length. People who designed parts of the LHC and its detectors in their postdoc years are still able to see the results coming in, but by now they are too advanced in their careers to start a new "experiment" based on their experience and knowledge. On the theory side, Higgs and Englert had to wait 50 years before definitive evidence was available.