Tag Archives: dark matter

Last month I finished a first draft of an article about AMS-02 for the Yearbook of Astronomy 2016. (As I write this, 2016 seems a long way away – but the lead time for this volume is long and, besides, I have other writing commitments this year. I wanted to get a draft version out of the way while I had the chance.) The target audience for the Yearbook consists of amateur astronomers – people who are deeply interested in astronomy and cosmology but who don’t necessarily have a scientific or mathematical training. Much of my article, therefore, is taken up by explaining the meaning of various technical words and phrases – “positron fraction”, “neutralino annihilation”, “primordial antimatter” and so on. Even the name of the experiment – Alpha Magnetic Spectrometer – requires explanation. It took me several thousand words to explain why Sam Ting and his AMS might, or might not, have seen something of great scientific interest.

When I heard that Roberto Trotta had written The Edge of the Sky – a book about cosmology using only the thousand most frequently used English words – I thought he must either be barking mad or else a member of the Oulipo (you know, those authors who decide to write novels while blindfolded and using only letters that appear on the left-half of a keyboard, or something equally arbitrary and constraining). I thought it would be a disaster. It turned out to be one of the most charming, fresh and inventive books of popular science that I’ve read.

In Trotta’s book, the Milky Way becomes the White Road, electrons are referred to as Very Small Drops and antimatter becomes Sister Drops. (So I guess the positron, which I write about in my Yearbook article, would be the Sister Drop of the Very Small Drop.) These and other word choices are wonderful and lead to a surprising clarity of expression.

But is it really possible to describe the complexity of modern science using this approach? Well, I tried to explain the importance of AMS-02 using polysyllabic words to replace other, more technical, polysyllabic words. In The Edge of the Sky a space-based detector such as AMS-02 becomes a flying Far-Seer in the sky. The heroine of Trotta’s book wonders whether dark matter will first show up in such a flying Far-Seer in the sky or in one of the big ears in the rock (in other words, one of the multitude of underground detectors such as LUX or DAMA), in the huge eye in the ice (the IceCube Neutrino Observatory) or in the Big Ring in the ground (the Large Hadron Collider). This approach is possible. It works, and it works beautifully.

The experience of reading The Edge of the Sky is strange and rather hypnotic. I think everyone can learn something from it. This book is a must-read. (A sentence that uses the 28th, 455th, 13th, 85th, 88th and 317th most-used English words according to Project Gutenberg.)

In New Eyes on the Universe I gave only passing mention to the Large Underground Xenon (LUX) dark matter experiment. LUX was clearly going to be an important player in the search for dark matter, but while I was writing the book the experiment was still in its commissioning phase. Yesterday, LUX presented results from its first three months of operation. (For those who haven’t read the book, LUX employs 370 kg of liquid xenon cooled to about 160 K and shielded by water in a search for WIMP dark matter. If a WIMP collides with a xenon nucleus then the photons and electrons emitted as the nucleus recoils can be detected. In order to shield the xenon from cosmic rays and other background radiation, experimenters have placed the detector a mile underneath the Black Hills of South Dakota.)

The first thing to note is that LUX is now the world’s most sensitive detector currently searching for WIMP dark matter. Richard Gaitskell, a spokesperson for LUX, described its sensitivity with a footballing analogy. Imagine a 75,000-strong crowd of football fans, each clapping twice a second: the number of claps is what the detector was hearing each second while it was on the surface. That’s a tremendous cacophony. When the detector was placed a mile underground it was as if the clapping fell to a rate of one clap per minute. That reduction in background is necessary: LUX is trying to ‘hear’ the equivalent of a sigh…
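Gaitskell’s analogy translates into a concrete number. A quick back-of-envelope calculation (the reduction factor below is my own arithmetic, not a quoted figure):

```python
# Rough arithmetic behind the clapping analogy (illustrative only).
surface_rate = 75_000 * 2   # claps per second: 75,000 fans, each clapping twice a second
underground_rate = 1 / 60   # one clap per minute, expressed in claps per second

reduction = surface_rate / underground_rate
print(f"Background reduction: a factor of about {reduction:,.0f}")
```

In other words, moving the detector a mile underground cut the background by roughly a factor of nine million.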

The second thing to note is that LUX is sensitive to WIMPs across a wide range of possible masses. There have been tantalizing hints by other dark matter experiments of WIMPs having a relatively low mass of around 8.6 GeV; many models based on supersymmetry, on the other hand, predict WIMPs with a mass of 35 GeV or more. LUX is sensitive to both low- and high-mass WIMPs.

And the results of the first 90 days of LUX operation? Well, the LUX data are consistent with the detector having seen zero dark matter particles during that time. As a LUX team member put it: “We’ve seen nothing better than anyone else.” The problem is, if an 8.6 GeV WIMP particle did indeed exist, as hinted at by CDMS, then LUX should have seen 1550 of them during those first 90 days. It seems impossible to reconcile these latest results with the existence of a low-mass WIMP.

The LUX results don’t prove the non-existence of dark matter, of course, and before reaching any conclusions we will really need to wait for the next LUX report: that will present an analysis of the first 300 days of operation. But the LUX results do put the dark matter mystery squarely in the spotlight: it’s becoming imperative that we learn just what dark matter is.

On 8 October 2013, the Nobel prize for physics was awarded to Francois Englert and Peter Higgs. In one sense this was a long time coming: the theoretical work that won the prize took place in 1964 (Englert, and his late colleague Robert Brout, working independently of Higgs, published first; a few weeks later Higgs published a paper that explicitly predicted the existence of a scalar boson; another group of physicists – Gerald Guralnik, Carl Hagen and Tom Kibble – published related work later in the same year). In another sense the prize was awarded remarkably quickly: experimental proof of the existence of a fundamental boson was announced on 4 July 2012, and it wasn’t until 14 March 2013 that it was confirmed to be a scalar (spin-0) boson. (If you want to learn more about the Higgs mechanism, you can find a variety of explanations here.)

To my mind, the discovery of the Higgs is one of the crowning achievements of human civilisation: it is the culmination of a process that began 2500 years ago with the Greeks. Physicists now have a standard model of fundamental particles: there exists a small number of spin-1/2 point particles (6 quarks; the electron, muon and tau, each with its associated neutrino) which interact via the exchange of spin-1 particles that mediate the electroweak and strong interactions (these exchange particles being the photon; W+, W– and Z0; 8 gluons). In the ‘pure’ theories underpinning this model the fundamental particles are massless; they acquire mass – and thus in a certain sense their very existence – by interacting with a spin-0 field that pervades the entire universe. This spin-0 field has an associated particle: the Higgs boson. And that’s it. End of story. Except…

We are really just at the beginning of the story. The theories underpinning the standard model are in conflict with the other central pillar of physics: general relativity. The standard model is based on quantum physics; general relativity is a classical theory. Physicists need to develop a quantum theory of gravity. Furthermore, we now know that the standard model applies to only 5% of the universe: 95% of the mass-energy content of the universe resides in the so-called ‘dark’ sector. We desperately need to understand the nature of dark matter and dark energy.

Now that the Large Hadron Collider has discovered the Higgs its next job, when it becomes operational again after its current upgrade, is to shed light on the dark sector.

At the time New Eyes on the Universe was published, the only confirmed sources of extraterrestrial neutrinos were the Sun and SN1987A. The view of the sky afforded by neutrino telescopes was rather dull.

That view of the neutrino sky is beginning to change. The IceCube South Pole Neutrino Observatory – a “telescope” consisting of particle detectors buried in one cubic kilometre of Antarctic ice – has detected 28 neutrinos with an energy in excess of 30 TeV (a teraelectronvolt is 10^12 eV). Two of these neutrinos, dubbed Bert and Ernie, had energies in excess of 1 PeV (that’s 10^15 eV) – far in excess of energies available at the Large Hadron Collider.

An artist’s impression of the array of optical sensors, buried in Antarctic ice, that form the IceCube telescope. If a high-energy neutrino interacts with an oxygen atom in the ice, a charged particle can be produced that will be moving through the ice faster than light itself can travel through the ice. A cone of Cerenkov radiation, with its characteristic blue hue, will be produced – and it’s this radiation that the sensors detect. (Credit: IceCube Collaboration/NSF)

It’s possible that Bert and Ernie were produced by high-energy cosmic rays smashing into Earth’s atmosphere, but an extraterrestrial origin for these neutrinos does seem more likely than not. And if IceCube has indeed detected high-energy neutrinos from the depths of space the question becomes: what was their source? That’s where things get interesting. If they came from some violent astrophysical source then astronomers now have a telescope that lets us study those sources. Or perhaps they came from the decay of dark matter particles – a suggestion made in a recent preprint by Arman Esmaili and Pasquale Serpico (Are IceCube neutrinos unveiling PeV-scale decaying dark matter?). Whatever the source of Bert and Ernie turns out to be, it seems certain that IceCube truly is giving us some new eyes through which to view the universe.

Yesterday Sam Ting, the 1976 Nobel prize winner, announced the first results from his Alpha Magnetic Spectrometer (AMS-02) experiment. The results hint at a possible dark matter signal. But it remains only a hint.

The story begins with the observation that the visible stuff in the universe – stars, galaxies, and so on – is all made of matter. We see no evidence for large amounts of antimatter. However, small amounts of antimatter are constantly being created when high-energy cosmic rays scatter off particles in the interstellar medium, or when the intense electromagnetic fields surrounding pulsars produce electron-positron pairs. So Earth is certain to be hit by antimatter particles, and in particular by positrons, that have been created by events within our galaxy.

Now, astrophysicists are interested in measuring the so-called positron fraction that hits Earth. (The positron fraction is the ratio of positrons to the total number of electrons and positrons.) If the main production mechanism for positrons is the scattering of high-energy cosmic rays off particles in the galactic disk, an assumption that until recently seemed quite reasonable, then the positron fraction would decrease with energy since there are other processes that generate high-energy electrons without accompanying positrons. In 2008, however, the PAMELA experiment measured a rise in the positron fraction between 10 GeV and 100 GeV. The Fermi satellite later confirmed this excess number of positrons, and it showed that the rise extended up to 200 GeV. So what is going on? Well, one explanation for the rising positron fraction is that positron generation by a few nearby pulsars could be the cause. But there’s another possibility.
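For concreteness, the positron fraction is just the number of positrons divided by the total number of electrons and positrons. A toy calculation (the counts here are invented, chosen only to match the rough 5% figure that AMS-02 reports near 10 GeV):

```python
def positron_fraction(n_positrons: int, n_electrons: int) -> float:
    """Positron fraction: positrons as a share of all e+ and e- detected."""
    return n_positrons / (n_positrons + n_electrons)

# Invented counts: 5 positrons among 100 total leptons gives a 5% fraction.
print(positron_fraction(5, 95))  # 0.05
```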

If dark matter exists then sometimes, simply because there’s so much of the stuff, dark matter particles and antiparticles will meet and annihilate. In some models, dark matter annihilation can give rise to excess numbers of positrons. Furthermore, the dark matter signal must take a particular form: it will be isotropic (in other words, the positrons will come equally from all directions in space) and the rising positron fraction will have a sharp cut-off after a certain energy is reached (an energy that is determined by the dark matter particle mass, since the particles can’t give rise to positrons that are more energetic than themselves). So after the PAMELA and Fermi results there was hope that we might be seeing hints of a dark matter signal, but the data were not clear enough to draw any conclusions. It wasn’t even certain that the positron excess was real.

Enter Sam Ting’s AMS-02 experiment.

AMS-02 is a cosmic ray detector on board the International Space Station. The detector was put in place by astronauts on 19 May 2011, and since then it has detected 30 billion cosmic rays. It has been able to measure the positron fraction to higher energies than any previous detector and with a precision that is much, much better than anything that has gone before. Yesterday, Ting announced the results of an analysis of the first 10% of data from AMS-02.

An artist’s view of AMS-02 installed on the attached S3 location on the main truss of the ISS (Credit: NASA/JSC)

The first result is that the excess seen by PAMELA and Fermi is real: the positron fraction increases from about 5% at 10 GeV to about 15% at 350 GeV. That in itself is significant, and requires an explanation. Second, the excess seems to be isotropic: if it turns out to be truly isotropic then that would tend to disfavour pulsars as being the source of the excess. Third, there are suggestions – and these are nothing more than tantalising hints – that AMS-02 might be seeing the start of a cut-off.

The positron fraction as a function of positron energy. Compare the AMS-02 error bars with previous experiments! (Credit: AMS collaboration)

So are we seeing the effects of dark matter? As the years go by, and AMS-02 improves and extends the measured positron-fraction spectrum to yet higher energies, astrophysicists might decide that the dark matter explanation is the only viable one. But at present it’s far too early to make any claims: the AMS-02 results announced yesterday are interesting, but nothing more.

This morning an ESA press conference presented results from an analysis of the first 15 months of data from the Planck mission. The results are exquisite, and it’s clear that Planck will be as important for cosmology as its predecessors COBE and WMAP. Cosmologists will be poring over the data for years to come.

I’ll give more detail in future posts, but for the moment here are just two items.

First, the most detailed picture yet of the early universe:

Planck’s stunning new map of the universe (Credit: ESA)

Second, some of the stand-out points from this morning’s presentation:

The universe is slightly older than we previously thought (about 80 million years older in fact): it’s 13.82 billion years old.

Planck measures the Hubble constant to be 67 km s⁻¹ Mpc⁻¹. This is slightly smaller than most other recent estimates. Curious!

The energy inventory of the universe isn’t quite what we thought it was: there’s slightly more dark matter than previously thought and slightly less dark energy. The universe is currently made up of 4.9% normal matter, 26.8% dark matter and 68.3% dark energy.

On small scales, the standard cosmological model (which includes inflation) agrees supremely well with the observed cosmic microwave background. The standard cosmological model is in good shape.

There are hints, based on observations of the largest angular scales, of physics beyond our current theories. In particular: (i) the sky in the southern hemisphere is ever so slightly warmer than the sky in the northern hemisphere; (ii) large-scale temperature fluctuations are weaker than expected; and (iii) there’s a cold spot in the universe, in the constellation Eridanus, that’s much larger than our models would predict. Gaining an understanding of these anomalies is going to lead to some really interesting ideas over the next few years.
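As a quick sanity check on the numbers above, the “Hubble time” 1/H0 gives a crude age scale for the universe. This is only a sketch – the true age comes from the full cosmological model, which is why Planck’s 13.82 billion years differs from the naive figure:

```python
# Naive age estimate from the Hubble constant: t_H = 1 / H0.
H0 = 67.0                # km/s/Mpc (the Planck value quoted above)
KM_PER_MPC = 3.0857e19   # kilometres in a megaparsec
SECONDS_PER_YEAR = 3.156e7

t_hubble_s = KM_PER_MPC / H0                      # Hubble time in seconds
t_hubble_gyr = t_hubble_s / SECONDS_PER_YEAR / 1e9
print(f"Hubble time: {t_hubble_gyr:.1f} Gyr")     # roughly 14.6 Gyr
```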

The Wilkinson Microwave Anisotropy Probe (WMAP) has been perhaps the most spectacularly successful cosmology experiment of all time. WMAP painstakingly mapped the cosmic microwave background and its results have allowed scientists to determine key parameters of the Universe – its age, its geometry and so on – with a precision that would have been unthinkable just ten years ago. In 2004, WMAP also found hints of something unusual happening closer to home: it found a weak signal corresponding to a haze of microwave radiation, roughly spherical in shape, centred around the centre of our Milky Way galaxy.

If such a haze exists then astronomers have an interesting challenge in trying to explain its origin. But does it exist? Some researchers argued that the haze might be nothing more than an artefact of data analysis, since they doubted whether WMAP had the sensitivity to distinguish such a weak signal from the general microwave background and the strong emission from galactic dust.

Well, the European Planck mission has a sensitivity that exceeds WMAP’s and it can observe the Universe over a greater range of frequencies. We can expect great things for cosmology when the Planck team releases its full results. One intermediate result of the Planck Collaboration, released recently on the arXiv server, is that the microwave haze found by WMAP does indeed exist. What is particularly interesting, however, is that the haze isn’t spherical; it’s stretched out like a cigar. Furthermore, the sharp edge to the haze suggests that whatever causes it is a sporadic rather than a continuous phenomenon (since a continuous process would lead to a diffuse haze).

Astronomers don’t yet have a good explanation for the haze. One initial suggestion for the WMAP observation was that dark matter annihilation created electrons and positrons which, as they spiralled in the Milky Way galaxy’s magnetic field, generated the observed microwave haze. But the Planck discovery of a squashed rather than spherical haze tends to discount that idea. Planck has found a mystery. I’m hoping that when the Planck Collaboration publishes its full results from the mission there’ll be several more discoveries to ponder.

In Measuring the Universe I talked about the Sunyaev-Zel’dovich effect (or the SZ effect, for short). It’s named after Rashid Sunyaev and Yakov Zel’dovich, who studied the concept in the late 1960s and early 1970s.

The SZ effect is a distortion in the observed cosmic microwave background radiation caused by high-energy electrons scattering off low-energy CMB photons. The collisions give the photons an energy boost – it’s the familiar inverse Compton scattering effect – and this in turn generates a slightly hotter patch in the microwave background. (‘Slightly’ is the operative word here: a microwave photon passing through a cloud of hot electrons on its journey towards Earth will appear hotter by just a few millionths of a degree.)
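In the standard textbook notation (this is the general form, not anything specific to my book), the thermal SZ temperature shift is governed by the Compton y-parameter, which integrates the electron pressure along the line of sight:

```latex
% Thermal SZ distortion: fractional temperature shift for a photon passing
% through hot electron gas (n_e: electron density, T_e: electron temperature,
% sigma_T: Thomson cross-section).
\[
  \frac{\Delta T}{T_{\mathrm{CMB}}} = f(x)\, y,
  \qquad
  y = \int \frac{k_B T_e}{m_e c^2}\, n_e \,\sigma_T \, dl
\]
% f(x) is a known frequency-dependent factor, with x = h\nu / k_B T_CMB.
```

Since y depends only on the gas along the photon’s path, not on how far away the cluster is, this is also why the effect is redshift-independent, as discussed below.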

The high-energy electrons that can cause the SZ effect are found in the extremely hot gas clouds at the centres of galaxy clusters. And, because the SZ effect is caused by scattering, its size doesn’t depend on redshift. In other words, the SZ effect in a high-redshift cluster can be detected just as easily (or, more truthfully, with just as much difficulty!) as in a cluster at low redshift. The SZ effect provides what is in essence a standard ruler – see Measuring the Universe for details – and so it can be used as a distance indicator over quite large reaches of the cosmos.

But there’s another type of SZ effect – the so-called kinematic SZ effect. I didn’t bother discussing it in the book because it is about 20 times fainter than the main (or thermal) SZ effect. Since the thermal SZ effect is hard enough to measure I didn’t think that anyone would be measuring the kinematic SZ effect anytime soon. Well, I was wrong. Cosmologists have now measured it.

The kinematic SZ effect arises because of the motion of galaxy clusters. Imagine a CMB photon passing through a cluster that’s moving away from us: when we observe the photon it will be slightly cooler (redder) than it otherwise would be due to the kinematic SZ effect. And if the photon moves through a cluster that’s approaching us then it will be slightly hotter (bluer). Sunyaev and Zel’dovich considered this from a theoretical point of view four decades ago, in 1972; but it’s taken until 2012 for researchers to measure it, such is the difficulty of teasing out the signal.

If a CMB photon passes through a galaxy cluster that’s moving away from Earth it becomes slightly redder and cooler (left part of the diagram). If it passes through a galaxy cluster that’s moving towards Earth then it becomes bluer and hotter (right part of the diagram). These wavelength shifts are extremely tiny, so this effect has only just been observed. (Credit: Sudeep Das, University of California-Berkeley)

The SZ effects can probe how clusters form and move around – something that depends critically on dark matter and dark energy. The SZ effects thus have the potential to deepen our understanding of the most mysterious elements of our Universe.

The search for dark matter is much more difficult than the search for the Higgs. At least with the Higgs we had an idea where to look: the Higgs is part of the Standard Model of particle physics, after all, and so physicists could guess at least some of its properties. If the Higgs existed, physicists knew they could catch glimpses of it at the LHC.

It’s different with dark matter.

If dark matter particles exist then they clearly and unambiguously relate to physics beyond the Standard Model. This in turn means there are very few clues that can help physicists in the search for dark matter (a search that is incredibly difficult anyway, since dark matter interacts so rarely with “normal” matter). Physicists don’t know precisely what it is they are looking for when they search for dark matter, and that complicates the search enormously: the sort of experiment that can look for axions (one candidate dark matter particle) is very different from the sort of experiment that can look for WIMPs (another candidate).

Perhaps the currently favoured explanation for dark matter is the WIMP explanation – that most of the material in the universe consists of Weakly Interacting Massive Particles. Many experiments are currently looking for WIMPs and, as explained in my book New Eyes on the Universe, the results are intriguing. Some experiments seem to have found tentative signs of a WIMP signal; other experiments have found nothing.

The XENON100 experiment uses arrays of photomultipliers, such as this one, to catch the brief flashes of light that would occur if a WIMP scattered off a xenon nucleus.(Credit: XENON100 Collaboration)

One of the biggest WIMP dark matter experiments to date has been the XENON100 collaboration, which is based deep underground at the Gran Sasso National Laboratory. The experiment employs 62 kg of extremely pure liquid xenon as a WIMP target. The idea is that, once in the proverbial blue moon, a WIMP will score a direct hit on a xenon nucleus and the collision will emit small amounts of light. Sensitive detectors surrounding the liquid can detect light from the collisions.

On 18 July 2012, the collaboration announced the results from an analysis of 13 months of searching. They found no evidence that a WIMP, in all that time, had interacted with a xenon nucleus in their target.

Well, today LUX took a step closer to its goal. For the past two years, scientists have been testing the 3-tonne detector (which contains 350 kg of liquid xenon) in a laboratory. Today, they transferred it to its permanent home 1500m below ground in the old Homestake gold mine – site of the famous Davis solar neutrino experiment. It was a delicate operation: the detector was taken down on air bearings in order to protect it from even minor bumps. But the operation was a success. The detector is in place, and it will start taking data later this year.

A schematic of the LUX detector (Credit: Symmetry magazine)

The mile or so of rock above the detector will shield it from cosmic rays, but of course dark matter particles will pass through the rock as if it weren’t there. The hope is that once in a while a dark matter particle will collide directly with a xenon nucleus in LUX. When a particle – whether a photon, a neutron or a dark matter particle – hits the liquid xenon, the xenon both scintillates and ionizes. By using sophisticated detectors that surround the xenon, physicists can measure the ratio of scintillation to ionization energy from the collision. And from that information they can determine the type of particle involved in the collision – whether photon, neutron or dark matter. That’s the hope, anyway. By this time next year we should know more.
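That particle-identification step can be sketched in a few lines of code. The threshold and numbers below are invented purely for illustration – real analyses use carefully calibrated discrimination bands, not a single cut – but the logic is the same: nuclear recoils (neutrons and WIMP candidates) produce a different charge-to-light ratio than electron recoils (photons):

```python
# Toy particle discrimination by the ionization/scintillation ratio.
# The cut value is invented for illustration, not a real LUX parameter.
CUT = 0.5

def classify_event(scintillation: float, ionization: float) -> str:
    """Crudely label an event by its ionization-to-scintillation ratio."""
    ratio = ionization / scintillation
    if ratio < CUT:
        return "nuclear recoil (neutron or WIMP candidate)"
    return "electron recoil (photon background)"

print(classify_event(scintillation=100.0, ionization=30.0))
print(classify_event(scintillation=100.0, ionization=80.0))
```

A real analysis would then have to separate the remaining nuclear recoils caused by stray neutrons from any genuine WIMP signal – which is exactly why the shielding described above matters so much.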