Neutrino Hunting in Antarctica

Monday, September 17, 2018

Once again, IceCube has shown that we can study high-energy neutrinos in their own right, rather than just as astrophysical probes. This analysis used a sample of starting tracks from 5 years of data: neutrinos that interacted within the detector, producing a hadronic cascade from the recoiling nuclear target and a muon from the lepton, in a reaction written as neutrino + nucleon (proton or neutron) -> muon + X, where X is the shower of particles produced by the recoiling nucleon. In these interactions, there are two quantities to measure: the energy of the muon and the energy of the shower. The inelasticity is the energy of the cascade divided by the total energy (the sum of the shower and muon energies). The inelasticity distribution is well predicted by the Standard Model of particle physics, but had not been measured at energies above 500 GeV (5*10^11 electron volts). With IceCube, we have extended the measurement to energies above 100 TeV (10^14 electron volts) - a factor of 200 upward in energy. This plot shows the measured average inelasticity, from our new IceCube preprint, available here, or directly as pdf.
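To make the definition concrete, here is a minimal sketch (my own illustration, with hypothetical event energies, not numbers from the paper):

```python
def inelasticity(cascade_energy, muon_energy):
    """Inelasticity y = E_cascade / (E_cascade + E_muon).

    Both energies in the same units (e.g. TeV); y ranges from 0
    (all energy to the muon) to 1 (all energy to the cascade)."""
    return cascade_energy / (cascade_energy + muon_energy)

# hypothetical starting-track event: 40 TeV cascade, 60 TeV muon
print(inelasticity(40.0, 60.0))  # -> 0.4
```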

The points show the inelasticity, while the blue and green curves show the Standard Model predictions for neutrinos and antineutrinos, respectively. The red curve shows the expectation for the mixture expected in IceCube. For aficionados, this calculation is done at next-to-leading-order accuracy, with BFKL evolution to low-x partons.

This measurement is sensitive to potential beyond-Standard-Model physics, which would likely produce a rather different inelasticity distribution than the expected interactions. Even the Standard Model cross-section is sensitive to the number of low-momentum quarks and antiquarks in the target nucleus.

Inelasticity is interesting in its own right. But it can also be used to probe a number of additional physics topics. Neutrinos and antineutrinos have different inelasticity distributions, so, by assuming the Standard Model values, we can measure the neutrino:antineutrino ratio. As can be inferred from the plot above, it is exactly as expected. Unfortunately, at the energies where IceCube is sensitive, we are mostly studying atmospheric neutrinos, not astrophysical ones.

We can also use inelasticity to probe astrophysical neutrinos. Although the neutrinos selected here are mostly muon neutrinos, some tau neutrinos make it into the fit, and it is possible to use similar criteria to select a matching set of cascades. The plot below shows the flavor triangle found from this study.

Each point on the flavor triangle corresponds to a unique mixture of electron, muon and tau neutrinos. The upper point is all muon neutrinos, with the lower-left and lower-right points corresponding to all tau neutrinos and all electron neutrinos, respectively. The colors show the relative likelihood, with the best-fit point (cross) corresponding to 83% tau neutrinos, 17% electron neutrinos and no muon neutrinos. Unfortunately, the errors are large, so none of the standard acceleration scenarios can be ruled out.
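For readers who want to plot points like these themselves, a flavor triangle is just a ternary plot. Here is a small sketch (my own, not the collaboration's plotting code) mapping flavor fractions onto the triangle layout described above:

```python
import math

def ternary_xy(f_e, f_mu, f_tau):
    """Map flavor fractions (which sum to 1) onto an equilateral
    triangle: all-nu_tau at the lower left (0, 0), all-nu_e at the
    lower right (1, 0), and all-nu_mu at the apex (0.5, sqrt(3)/2),
    matching the layout described in the text."""
    assert abs(f_e + f_mu + f_tau - 1.0) < 1e-9
    x = f_e + 0.5 * f_mu
    y = (math.sqrt(3) / 2.0) * f_mu
    return x, y

# the best-fit point quoted above: 17% nu_e, 0% nu_mu, 83% nu_tau
x, y = ternary_xy(0.17, 0.0, 0.83)  # lands on the bottom edge, near the tau corner
```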

Friday, July 13, 2018

Yesterday, in two papers published in Science, IceCube and collaborating experiments announced the observation of high-energy astrophysical neutrinos coming from a source, the blazar TXS 0506+056 (the numbers denote its position in the sky). This was announced at a press conference at National Science Foundation headquarters, accompanied by press releases from multiple institutions, including one from Berkeley Lab, and made the cover of Science (above).

Blazars are a type of active galactic nucleus (AGN); AGNs are galaxies with a supermassive black hole at the center. If the black hole is surrounded by a dust cloud (or other matter), it will gradually accrete that matter. In the process, it will eject a fraction of it in a relativistic jet perpendicular to the galaxy's axis of rotation. The jet is turbulent, and is thought to be a likely site for accelerating particles to extremely high energies. In blazars, this jet points nearly directly at Earth, giving us the best chance to see these ultra-energetic particles.

The story begins on Sept. 22, 2017, when IceCube observed a neutrino with an energy around 300 TeV (about 50 times the energy of the protons accelerated at CERN's Large Hadron Collider). The event display shows the neutrino; each colored dot shows the photons registered by one IceCube optical module, with color indicating relative time (from red to blue) and size indicating the number of photons.

IceCube has seen many neutrinos that were more energetic than this, but this one still had enough energy that it was likely of astrophysical origin. So, computers clacked and whirred, and 43 seconds later we sent out an automatic alert, telling many partner observatories that we had seen an energetic neutrino coming from a specific direction. Several of these observatories pointed telescopes in the direction of the neutrino, and (to make a long story short) the location coincided with a known blazar, TXS 0506+056, which was flaring. Blazar emission varies on timescales from minutes to years; this neutrino came during a time when TXS 0506+056 was emitting at particularly high levels, from radio waves up through (at least) 1 TeV photons. The first paper, by IceCube and the other experimental collaborations, discusses this coincidence in space and time. TXS 0506+056 is a relatively energetic, quite nearby blazar (with a redshift of 0.3365), so it is a likely candidate for a first observation.

Shortly after this observation, IceCube went back and looked at archival data, searching for excess emission from the source. We found an excess of neutrino events coming from that direction during the period from September 2014 to March 2015. This is reported in the second paper.

The exact statistical significance of these observations depends on some of the details of the analysis - the first paper gives a range of significances, depending on the preferred assumptions. But, taken together, this is strong evidence that we have seen neutrinos coming from a specific source: we have found at least one cosmic accelerator, far more powerful than CERN's LHC. Beyond the observed neutrinos, there is strong suspicion that AGNs also accelerate the ultra-energetic protons and/or heavier nuclei (the cosmic rays) that led us to look for neutrinos in the first place. Unfortunately, since protons and heavier nuclei are bent by interstellar magnetic fields, they do not point back to their sources.

One still-open question is whether blazars are responsible for all of the neutrinos that IceCube sees. In 2016, IceCube published a paper (freely available arXiv version here) which set limits on the fraction of the astrophysical neutrinos that could come from blazars - between 27% and 50%, depending on the spectral index. This paper studied 862 blazars, and had to make some assumptions about the relationship between the observed gamma-ray flux and the expected neutrino flux. As you can imagine, extensive work is ongoing to revisit this question.

Friday, December 8, 2017

Neutrinos are popularly known as the particles that go through anything and everything. Neutrinos from beta decay can escape from the best shielded nuclear reactor, and neutrinos from nuclear fusion escape from the center of the sun. Neutrinos interact only via the weak interaction, which is indeed weak. But, that doesn't mean that they can go through anything - the IceCube Neutrino Observatory recently demonstrated experimentally that it is possible to stop a beam of neutrinos, in a paper published in Nature (also freely available on the arXiv).

To do this, IceCube used two tricks.

First, it used extremely energetic neutrinos, with energies above 1 TeV (1 tera-electron volt, or 10^12 electron volts), extending up to 1 PeV (1 peta-electron volt, or 10^15 eV) - millions of times more energetic than neutrinos from nuclear fusion or radioactive decay. The cross-section (interaction probability) for neutrinos rises with energy (linearly at first, then moderating to scale roughly as Energy^0.3). So, at an energy of 30 TeV (the rough mid-point of the measurement), the cross-section is several million times higher than it is for neutrinos from radioactive decay. Of course, there aren't that many neutrinos this energetic, but, at 1 cubic kilometer in volume, IceCube is big enough to collect a good sample. The analysis used 10,784 energetic muons from neutrinos that passed through at least some of the Earth.
Second, it used a very thick absorber - the Earth. With this, the measurement was conceptually simple: compared to a baseline of near-horizontal neutrinos, which traversed only a relatively small amount of matter, energetic near-vertical neutrinos were absorbed going through the Earth. The figure above shows the predicted transmission probability (= 1 - absorption probability) as a function of neutrino energy and zenith angle; the latter determines how much Earth matter was traversed.
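To see roughly how the numbers work out, here is a sketch of the transmission probability through the Earth (my own illustration, not the analysis code). It uses a crude two-shell density model and an assumed cross-section of 1.5e-34 cm^2, roughly the right order of magnitude near a few tens of TeV; the real analysis uses a detailed Earth density profile and full Standard Model cross-sections, and accounts for the detector sitting near the surface.

```python
import math

N_A = 6.022e23          # Avogadro's number: nucleons per gram
R_EARTH = 6.371e8       # Earth radius, cm
R_CORE = 3.48e8         # approximate core radius, cm
RHO_MANTLE, RHO_CORE = 4.5, 11.0   # crude shell densities, g/cm^3

def column_depth(cos_zenith):
    """Column depth (g/cm^2) along a chord through the Earth for a
    neutrino arriving at the given zenith angle (|cos| = 1 is
    straight through the center, |cos| -> 0 is near-horizontal)."""
    c = abs(cos_zenith)
    chord = 2.0 * R_EARTH * c                 # total chord length
    b = R_EARTH * math.sqrt(1.0 - c * c)      # impact parameter to Earth's center
    core_path = 2.0 * math.sqrt(max(R_CORE ** 2 - b ** 2, 0.0))
    return RHO_MANTLE * (chord - core_path) + RHO_CORE * core_path

def transmission(cos_zenith, sigma_cm2):
    """P(no interaction) = exp(-X * N_A * sigma)."""
    return math.exp(-column_depth(cos_zenith) * N_A * sigma_cm2)

sigma = 1.5e-34   # assumed cross-section, cm^2 (illustrative, not from the paper)
print(transmission(-1.0, sigma))   # straight up through the Earth: substantially absorbed
print(transmission(-0.1, sigma))   # near-horizontal: mostly transmitted
```

Even this toy model reproduces the qualitative picture: neutrinos of a few tens of TeV are significantly attenuated on diameter-crossing paths, while near-horizontal paths are nearly transparent.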

There are, of course, many complications - experimental uncertainties on the neutrino energy; neutral-current interactions, where a neutrino may emerge from the Earth with a lower energy than it entered; modelling of the material within the Earth; etc. But the result clearly showed that neutrinos are absorbed at about the expected rate. More precisely, the best-fit cross-section was 1.3 +/- 0.5 times the prediction of the Standard Model, where I have combined the statistical and systematic uncertainties. It was not trivial to find a good definition of the neutrino energy range for which this measurement applies, because different methods give somewhat different energy ranges, but we settled on a method that returned a range from 6.3 TeV to 980 TeV. For comparison, the highest-energy measurements at an accelerator laboratory only reached 0.37 TeV - our measurement reaches more than an order of magnitude higher in energy. The figure below puts this in perspective, comparing our measurement with the previous accelerator work. The cross-sections (y axis) are divided by the neutrino energy so that everything fits on the graph; otherwise, it would span many orders of magnitude.

I have to mention that this was the dissertation work of my (now graduated) graduate student, Sandra Miarecki. Sandy had a very interesting preparation for graduate school - she was a career US Air Force pilot, serving in many roles, including as a test pilot, before retiring from the Air Force and coming to graduate school in Berkeley. After graduate school, she became an assistant professor at the US Air Force Academy. The LBNL news center has a very nice article about her.

The Nature article also received a fair amount of press coverage. I will just mention one article, in Symmetry magazine, which goes into more detail about the analysis than other press writeups.

Thursday, November 16, 2017

On October 16th, the combined LIGO/VIRGO collaborations announced the observation of gravitational waves from an event that occurred on August 17th. Unlike the previous observations, these waves came from relatively 'light' objects, reflecting the collision of two presumed neutron stars, with masses around 1.1 to 1.6 times the mass of the sun, forming a black hole with a mass around 2.74 times the mass of the sun. Previous gravitational wave events had come from the collisions of much heavier objects.

But, that's not all. Two seconds later, the FERMI observatory, a satellite containing a large gamma-ray detector, and the INTEGRAL satellite both observed pulses of gamma-rays coming from the same direction. This is the classical signature of a 'gamma-ray burst' (GRB). GRBs were first observed in the 1960s by the VELA satellites, built to monitor gamma-rays from possible atmospheric or space-based nuclear weapons tests. VELA did not observe these, but it did find mysterious bursts of gamma-rays coming from space. These bursts have been the subject of scientific speculation for decades, and the conventional wisdom was that some GRBs came from mergers of two neutron stars, or of a neutron star with a black hole. That theory has now been amply confirmed by the LIGO/VIRGO/FERMI/INTEGRAL observation. The graphic above, from the LIGO collaboration, shows the process.

Of course, this collision site was studied by many, many other astronomical instruments. IceCube looked, but we didn't see anything. However, the optical studies were very fruitful. Multiple telescopes observed an optical signal that lasted for a few days, plus an infrared signal that lasted for nearly two weeks. These signals were consistent with some predictions made by my LBNL colleague Dan Kasen and his collaborators. Kasen made a detailed model of the gravitational, nuclear and atomic processes that would occur in a collision of two neutron stars, and, from that, predicted the optical and infrared light emission. His model predicts considerable production of heavy elements (heavier than iron) via rapid neutron capture (the 'r-process'). The shorter-lived broadband optical emission comes from an initial ejection of lighter nuclei. The long-lived infrared component comes from a secondary emission powered by the radioactive decay of heavy elements, which heats the plasma that surrounds the newly formed black hole. Heavy elements (Z between 58 and 90) scatter the light strongly, so it takes longer to escape from the plasma.

This agreement is of great interest to nuclear physicists, since it may provide a new answer to the question: where do the heavy elements in the universe come from? Previously, it was thought that they were mostly produced in supernovae, explosions that occur when heavy stars reach the end of their lives and collapse. However, Dan's simulations show that GRBs produce heavy elements, and could account for much or all of the gold used in our jewelry, along with all of the other heavy elements.

Wednesday, July 19, 2017

One use for IceCube is to study solar flares, or, more generally, solar weather. Solar weather is important; solar flares are often accompanied by coronal mass ejections (CMEs), ejections of plasma which sometimes hit the Earth. This plasma can disrupt radio communications, disable satellites, and endanger astronauts. In 1859, a large CME, the Carrington event, disrupted telegraph communications in the U.S. and Europe, and produced enormous auroras which were visible over much of the globe. Modern electronics is far more susceptible to damage than simple telegraph systems were. A similar event occurring today would cause incredible ($1 trillion?) damage. With appropriate warning, we could reduce the damage significantly by turning off and/or shielding as much electronics as possible, grounding airplanes, etc. For this very practical reason, it is important to understand solar weather in more detail.

In addition, sunspots and solar flares contribute to the variation in the Sun's total output: sunspots directly reduce, and solar flares increase, the solar irradiance. So, the lack of sunspots in the current 11-year solar cycle (cycle 24) might be thought to reduce solar irradiance, and hence help combat global warming. However, the story is a bit more complicated than that. Both sunspots and solar flares are governed by the Sun's magnetic fields; there is a more direct correlation between the Sun's magnetic activity and irradiance, as can be seen from this plot from the Bartol Institute at the University of Delaware, which compares solar magnetic activity and the rate of neutrons from cosmic-rays reaching Earth. Before ~1950 (i.e., before global warming became significant), these magnetic field variations likely accounted for much of the observed climate variation. The evidence for this comes from comparing carbon-14 dating curves (carbon-14 measures cosmic-ray activity) with climatic data from tree rings and ice cores.

Most of the particles emitted in a solar flare or CME are, by IceCube standards, low energy: protons, neutrons (which mostly decay before reaching Earth), and photons with energies of at most a few billion electron volts (GeV). When the individual particles reach Earth, they leave relatively few direct traces at ground level; only a small fraction of them lead (via a small air shower) to particles reaching the Earth's surface. However, a CME contains a very large number of particles, and, if the density is high enough, it can raise the counting rate of terrestrial particle detectors. This is known as a ground level event (GLE). Space scientists have deployed neutron detectors at many sites around the globe to monitor these signals. Because of the total sensitive area and low background rates of the 162 IceTop surface-array tanks, IceTop is a very sensitive detector for GLEs, and we had expected to detect of order 1 GLE per year.
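The basic detection idea - a GLE shows up as a counting-rate excess over a steady Poisson background - can be sketched in a few lines. This is a generic illustration with made-up rates, not the actual IceTop analysis:

```python
import math

def gle_significance(baseline_rate_hz, observed_counts, interval_s):
    """Rough Gaussian z-score for a counting-rate excess over a
    Poisson baseline: (observed - expected) / sqrt(expected).
    Valid when the expected count is large."""
    expected = baseline_rate_hz * interval_s
    return (observed_counts - expected) / math.sqrt(expected)

# hypothetical numbers: a 2000 Hz baseline counting rate, a 60 s
# window, and a 1% rate increase during a candidate GLE
print(gle_significance(2000.0, 121200, 60.0))
```

The point of the large sensitive area is exactly this square root: the higher the baseline count, the smaller the fractional excess that can be detected significantly.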

Unfortunately, Nature has not cooperated. At this week's International Cosmic-Ray Conference, IceCube presented an analysis of GLEs from 2011 to 2016. Only three GLEs were observed, and they were 'quite small by historical standards.' The reason for the low rate is unknown, but it may be connected with the paucity of sunspots during the current solar cycle. This might be expected to lead to a reduction in solar irradiance. Data from the SORCE satellite presents a mixed picture, showing a small (~0.07%) increase in solar irradiance from 2009 to 2015, with a similar-sized decrease over the past two years.

We do not understand what is causing these changes; clearly the Sun still has many secrets.

Many thanks to Paul Evenson (Bartol Institute, Delaware) for useful discussions on the relationship between sunspots, flares, CME and magnetic fields. Of course, any errors here are my own.

Monday, December 19, 2016

Particle and nuclear physicists face a real dilemma. Our "Standard Model" explains most of what we observe at accelerator and non-accelerator experiments, IceCube included. The Standard Model has been around for about 40 years. Its three generations of quarks and leptons, four forces, and the Higgs boson come together to provide a good description of the processes we observe at the Large Hadron Collider (LHC) and other accelerators, like Brookhaven's Relativistic Heavy Ion Collider, not to mention underground neutrino detectors. The only clear crack in the Standard Model is the fact that neutrinos oscillate between the different flavors, and therefore should have mass. But, most of us don't feel like this is a huge crack.

So, we have been looking for holes in the Standard Model for the past 40 years. With the discovery of the Higgs boson in the bag, this is now the main rationale for the LHC. Each year the four LHC experiments put out hundreds of new results; the search for "New" (beyond the standard model) physics is a major focus. Unfortunately, they have not found any clear evidence for any new physics.

There are some good reasons we know that there must be physics beyond the Standard Model. The evidence for both dark matter and dark energy is clear and convincing. Many theories of dark matter model it as a new particle that could very well be discovered at the LHC. Dark energy is even more mysterious. It is beyond the reach of any as-yet proposed laboratory-scale experiments, but it is a clear reminder that the universe still has some deep secrets.

Although it is not our primary focus, IceCube is also searching for new physics, mostly involving neutrinos. As part of this search, we continue to study neutrino oscillations (see my previous post here) in more detail. One of the things that we are looking for is a new type of neutrino that does not interact; these are called 'sterile neutrinos.' If regular neutrinos oscillated into sterile neutrinos, it would look as if those neutrinos had disappeared. We search for sterile neutrinos by looking at how likely neutrinos produced in cosmic-ray air showers are to appear in IceCube; if sterile neutrinos exist, neutrinos traveling long distances through the Earth might disappear. In a recent study, published in Physical Review Letters (here, and also available here through the Cornell arXiv), we set strict limits on sterile neutrinos. We put limits on the possible existence of sterile neutrinos with certain characteristics; the main characteristics are the mass difference between sterile neutrinos and regular neutrinos, and the mixing angle (strength of coupling) between sterile and regular neutrinos.
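The role of those two characteristics can be seen in the textbook two-flavor vacuum oscillation formula. This is only a sketch of the idea - the actual IceCube analysis includes matter effects in the Earth, which can resonantly enhance sterile-neutrino disappearance - and the parameters below are illustrative, not the paper's values:

```python
import math

def survival_prob(L_km, E_GeV, sin2_2theta, dm2_eV2):
    """Two-flavor vacuum survival probability:
    P = 1 - sin^2(2*theta) * sin^2(1.27 * dm^2 [eV^2] * L [km] / E [GeV]).
    dm2_eV2 is the mass-squared difference, sin2_2theta the mixing strength."""
    return 1.0 - sin2_2theta * math.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2

# illustrative parameters: dm^2 = 1 eV^2, sin^2(2 theta) = 0.1,
# a 1 TeV neutrino crossing the Earth's diameter (~12742 km)
print(survival_prob(12742.0, 1000.0, 0.1, 1.0))
```

A larger mixing angle makes the disappearance deeper, while the mass difference (together with L/E) sets how fast the oscillation wiggles; that is why the limits are quoted in the (mass difference, mixing angle) plane.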
The IceCube limits are of particular interest because they rule out a region of parameter space (mass difference and mixing angle) that had been suggested by a couple of earlier experiments. These earlier results had attracted great attention, but we now know that they are unlikely to be correct.

So, we need to keep looking to find a different crack in the Standard Model, possibly including sterile neutrinos with different masses and couplings.

Monday, September 26, 2016

As a scientific experiment, IceCube is approaching maturity. We have collected 6 years of data with the complete detector. In one way, this is a lot. For most analyses, the statistical precision increases as the square root of the amount of data collected. So, a measurement with half the error requires four times as much data. It would take IceCube another 18 years to halve the statistical errors possible with the current data. 18 years is a long time, so for many of our studies, future increases in precision will be relatively slow.
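The arithmetic behind that 18-year figure is simple enough to put in one line:

```python
def years_to_reduce_error(years_so_far, factor):
    """With statistical errors scaling as 1/sqrt(t), reducing the
    error by `factor` requires factor**2 times the total data;
    return the *additional* years of running needed."""
    return years_so_far * (factor ** 2 - 1)

print(years_to_reduce_error(6, 2))  # -> 18: halving the errors on 6 years of data
```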

This is not true for everything. For many analyses, it is a matter of waiting for the right astrophysical event. Assuming that high-energy neutrinos are produced by episodic (non-constant) sources, one nearby gamma-ray burst, or supernova, or whatever is producing astrophysical neutrinos, would be a huge discovery. This is worth waiting for.

We continue to improve our analysis techniques, and we will be able to continue to make progress here for some time. And, there are some analyses that are only now becoming possible, either because they require software that is only now becoming available, or because they require a lot of data. So, we are still productively busy.

But, we are also thinking more intensively about follow-on experiments. There are several possibilities on the table.

PINGU would be a dense infill array, with a threshold of a few GeV, able to determine the neutrino mass ordering (which neutrino mass state is the lightest).

Gen2 (above) is a comprehensive IceCube upgrade, probably including PINGU, but focused on an array 10 times larger than IceCube. It would have a similar number of strings to IceCube, but be built with more sensitive optics. Because the strings would be more widely spaced than IceCube's, it would have a higher energy threshold, well matched to studies of astrophysical neutrinos. We (both the collaboration and the broader neutrino-astronomy community) think, but cannot completely demonstrate, that Gen2 will be able to find the sources of our cosmic neutrinos.

Gen2 will likely also include a large (10 square kilometer) surface air-shower array. One main purpose of the array will be to reject downward-going atmospheric neutrinos, improving our sensitivity to astrophysical sources which are above the horizon; the center of our galaxy is of prime interest.

There are several efforts to build a large radio-detection array, either as part of Gen2, or as a stand-alone project. Here, the main possibilities are ARIANNA, which I have discussed multiple times before, or ARA, a radio-detection project at the South Pole.

In Europe, there is also a large effort to build an optical detector array, KM3NeT, in the Mediterranean Sea. KM3NeT will eventually include a large (~5 cubic kilometers?) astrophysical array, and a smaller array, ORCA, which will have physics goals similar to PINGU. KM3NeT is starting to deploy test strings now, and ORCA might be complete in the next ~3 years. Construction of the astrophysical array is also starting, although the 5 km^3 array will not be complete until the mid-2020s.

On the U.S. side, these projects are well aligned with National Science Foundation priorities. NSF director France Cordova recently unveiled a 9-point R&D agenda; one of the 6 science themes was "multimessenger astronomy." Unfortunately, even despite this, these U.S. projects seem stalled, due to lack of funding now and in the near-term future. From my perspective, this is very unfortunate; if an excellent science case and a good match to the funding agency director's priorities aren't enough to move a project forward, then what is needed? Although the full Gen2 would not be inexpensive (comparable in cost to IceCube), one could build either of the radio-detection arrays or a moderate-sized surface array for under $50 million - not much by current particle/nuclear physics standards.

Some of the ideas presented here were presented in a comment I wrote for Nature: Invest in Neutrino Astronomy; it appeared in the May 25 issue.

Spencer Klein

I am a physicist in the Nuclear Science Division at Lawrence Berkeley National Laboratory, studying high energy neutrinos for astronomical purposes. Neutrinos are particles which interact very weakly, so they escape unimpeded from extremely dense sources. We want to pin down the sources (accelerators) for ultra-high energy cosmic rays.

This blog is about my trip to the Ross Ice Shelf in Antarctica, to test prototype hardware for ARIANNA, a proposed ultra-high-energy (above 100 PeV) neutrino detector that will be built there.

I also work on the IceCube neutrino detector, which is searching for neutrinos with somewhat lower energies - 100 GeV to 100 PeV. IceCube is smaller than ARIANNA, but studies a wider variety of physics, including cosmic-rays and a host of searches for exotic particles.

My travelling partner and colleague, Thorsten Stezelberger, is an engineer at LBNL.