New research by cosmologists at the University of Chicago and Wayne State University confirms the accuracy of Type Ia supernovae in measuring the pace at which the universe expands. The findings support the widely held theory that the expansion of the universe is accelerating and that the acceleration is attributable to a mysterious force known as dark energy. They counter recent headlines claiming that Type Ia supernovae cannot be relied upon to measure the expansion of the universe.

Using light from exploding stars as bright as entire galaxies to determine cosmic distances led to the 2011 Nobel Prize in physics. The method relies on the assumption that, like lightbulbs of a known wattage, all Type Ia supernovae have nearly the same maximum brightness when they explode. Such consistency allows them to be used as beacons to measure the heavens: the weaker the light, the farther away the star. But the method has been challenged in recent years because of findings that the light given off by Type Ia supernovae appears more inconsistent than expected.
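The standard-candle logic can be made concrete with the distance modulus relation from introductory astronomy. The sketch below is illustrative only, not the analysis in the paper; the fiducial peak absolute magnitude of about -19.3 for Type Ia supernovae is an assumed textbook value.

```python
# Illustrative sketch of the standard-candle idea (assumed values,
# not the paper's analysis). Type Ia supernovae peak near a common
# absolute magnitude, roughly M ~ -19.3.
ABS_MAG_TYPE_IA = -19.3  # assumed fiducial peak magnitude

def luminosity_distance_parsecs(apparent_mag, absolute_mag=ABS_MAG_TYPE_IA):
    """Distance modulus: m - M = 5*log10(d / 10 pc), solved for d."""
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

# The fainter the supernova appears (larger apparent magnitude m),
# the farther away it must be -- a supernova seen at m = 24 lies
# billions of parsecs away.
d_pc = luminosity_distance_parsecs(24.0)
print(f"{d_pc / 1e9:.2f} Gpc")
```

This is why a systematic difference in true peak brightness between supernova subclasses would matter: it would translate directly into a systematic error in inferred distance.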

"The data that we examined are indeed holding up against these claims of the demise of Type Ia supernovae as a tool for measuring the universe," said Daniel Scolnic, a postdoctoral scholar at UChicago's Kavli Institute for Cosmological Physics and co-author of the new research published in Monthly Notices of the Royal Astronomical Society. "We should not be persuaded by these other claims just because they got a lot of attention, though it is important to continue to question and strengthen our fundamental assumptions."

One of the latest criticisms of Type Ia supernovae as measurement tools concluded that the brightness of these supernovae seems to fall into two different subclasses, which could lead to problems when trying to measure distances. In the new research, led by David Cinabro, a professor at Wayne State, with Scolnic, Rick Kessler, a senior researcher at the Kavli Institute, and others, the team found no evidence of two subclasses of Type Ia supernovae in data from the Sloan Digital Sky Survey Supernovae Search and the Supernova Legacy Survey. The recent papers challenging the effectiveness of Type Ia supernovae for measurement used different data sets.

A secondary criticism has focused on the way Type Ia supernovae are analyzed. When scientists found that distant Type Ia supernovae were fainter than expected, they concluded the universe is expanding at an accelerating rate. That acceleration is explained through dark energy, which scientists estimate makes up 70 percent of the universe. The enigmatic force pulls matter apart, keeping gravity from slowing down the expansion of the universe.

Yet a substance that makes up 70 percent of the universe but remains unknown is frustrating to a number of cosmologists. One result was a reevaluation of the mathematical tools used to analyze supernovae, which gained attention in 2015 by arguing that Type Ia supernovae don't even show that dark energy exists in the first place.

Scolnic and colleague Adam Riess, who won the 2011 Nobel Prize for the discovery of the accelerating universe, wrote an article for Scientific American on Oct. 26, 2016, refuting the claims. They showed that even if the mathematical tools to analyze Type Ia supernovae are used "incorrectly," there is still a 99.7 percent chance the universe is accelerating.

The new findings are reassuring for researchers who use Type Ia supernovae to gain an increasingly precise understanding of dark energy, said Joshua A. Frieman, senior staff member at the Fermi National Accelerator Laboratory who was not involved in the research.

"The impact of this work will be to strengthen our confidence in using Type Ia supernovae as cosmological probes," he said.

Citation: "Search for Type Ia Supernova NUV-Optical Subclasses," by David Cinabro and Jake Miller (Wayne State University); Daniel Scolnic and Ashley Li (Kavli Institute for Cosmological Physics at the University of Chicago); and Richard Kessler (Kavli Institute for Cosmological Physics and the Department of Astronomy and Astrophysics at the University of Chicago). Monthly Notices of the Royal Astronomical Society, November 2016. DOI: 10.1093/mnras/stw3109

A small discrepancy in the value of a long-sought number has fostered a debate about just how well we know the cosmos.

There is a crisis brewing in the cosmos, or perhaps in the community of cosmologists. The universe seems to be expanding too fast, some astronomers say. Recent measurements of the distances and velocities of faraway galaxies don't agree with a hard-won "standard model" of the cosmos that has prevailed for the past two decades. The latest result shows a 9 percent discrepancy in the value of a long-sought number called the Hubble constant, which describes how fast the universe is expanding. But in a measure of how precise cosmologists think their science has become, this small mismatch has fostered a debate about just how well we know the cosmos. "If it is real, we will learn new physics," said Wendy Freedman of the University of Chicago, who has spent most of her career charting the size and growth of the universe.
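The size of the mismatch can be illustrated with Hubble's law, which relates a galaxy's recession velocity to its distance. The two Hubble constant values below are assumed round numbers for illustration, not figures taken from the article's sources.

```python
# Rough illustration of the Hubble constant discrepancy
# (values assumed for illustration, not from the article's sources).
H0_early = 67.0  # km/s/Mpc, inferred from the early universe (CMB)
H0_local = 73.0  # km/s/Mpc, from local distance-ladder measurements

# Fractional mismatch between the two measurements, roughly 9 percent.
discrepancy = (H0_local - H0_early) / H0_early
print(f"{discrepancy:.1%}")

# Hubble's law: recession velocity grows linearly with distance,
# v = H0 * d, so the two H0 values predict different velocities
# for the same galaxy.
def recession_velocity_km_s(distance_mpc, hubble_constant):
    return hubble_constant * distance_mpc
```

For a galaxy 100 megaparsecs away, the two values predict recession velocities of 6,700 versus 7,300 km/s, which is why increasingly precise distance measurements can distinguish between them.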

Michael S. Turner of the University of Chicago said, "If the discrepancy is real, this could be a disruption of the current highly successful standard model of cosmology and just what the younger generation wants: a chance for big discoveries, new insights and breakthroughs."

Five UChicago faculty members have earned 2017 Sloan Research Fellowships: Bryan Dickinson, assistant professor of chemistry; Suriyanarayanan Vaikuntanathan, assistant professor of chemistry; Joseph Vavra, associate professor of economics at the University of Chicago Booth School of Business; Abigail Vieregg, assistant professor of physics; and Alessandra Voena, associate professor of economics.

Abigail Vieregg is interested in answering questions about the nature of the universe at its highest energies through experimental work in particle astrophysics and cosmology. In particle astrophysics, she focuses on searching for the highest energy neutrinos that come from the most energetic sources in the universe. In cosmology, Vieregg works with a suite of telescopes at the South Pole to help determine what happened during the first moments after the Big Bang by measuring the polarization of the cosmic microwave background.

Vieregg was a NASA Earth and Space Sciences Graduate Fellow at UCLA and a National Science Foundation Office of Polar Programs Postdoctoral Fellow at the Harvard-Smithsonian Center for Astrophysics.

The PICO Collaboration is excited to announce that the PICO-60 dark matter bubble chamber experiment has produced a new dark matter limit after analysis of data from the most recent run. This new result is a factor of 17 improvement in the limit for spin-dependent WIMP-proton cross-section over the already world-leading limits from PICO-2L run-2 and PICO-60 CF3I run-1 in 2016.

The PICO-60 experiment is currently the world's largest bubble chamber in operation; it is filled with 52 kg of C3F8 (octafluoropropane) and is taking data in the ladder lab area of SNOLAB. The detector keeps the target fluid in a superheated state, such that a dark matter particle interacting with a fluorine nucleus causes the fluid to boil and creates a telltale bubble in the chamber.

The PICO experiment uses digital cameras to see the bubbles, and acoustic pickups to improve the ability to distinguish between dark matter particles and other sources when analyzing the data.

The superheated detector technology has been at the forefront of spin-dependent (SD) searches, using various refrigerant targets including CF3I, C4F10 and C2ClF5, and two primary types of detectors: bubble chambers and droplet detectors. PICO is the leading experiment in the direct detection of dark matter for spin-dependent couplings and is developing a much larger version of the experiment with up to 500 kg of active mass.

The PICO Collaboration would like to acknowledge the support of the Natural Sciences and Engineering Research Council of Canada (NSERC) and the Canada Foundation for Innovation (CFI) for funding.

This work was also supported by the U.S. Department of Energy Office of Science and the U.S. National Science Foundation under Grants PHY-1242637, PHY-0919526, PHY-1205987 and PHY-1506377, and in part by the Kavli Institute for Cosmological Physics at the University of Chicago through grant PHY-1125897 and an endowment from the Kavli Foundation and its founder Fred Kavli.

Researchers Provide New Insight Into Dark Matter Halos

April 19, 2017

An image of a simulated galaxy cluster showing evidence for a boundary, or "edge" from a 2015 paper in the Astrophysical Journal ("The Splashback Radius as a Physical Halo Boundary and the Growth of Halo Mass", The Astrophysical Journal, Volume 810, Issue 1, article id. 36, 16 pp., 2015) by Surhud More, Benedikt Diemer and Andrey Kravtsov.

Many scientists now believe that more than 80 percent of the matter of the universe is locked away in mysterious, as yet undetected, particles of dark matter, which affect everything from how objects move within a galaxy to how galaxies and galaxy clusters clump together in the first place.

This dark matter extends far beyond the reach of the furthest stars in the galaxy, forming what scientists call a dark matter halo. While stars within the galaxy all rotate in a neat, organized disk, these dark matter particles are like a swarm of bees, moving chaotically in random directions, which keeps them puffed up to balance the inward pull of gravity.

Bhuvnesh Jain, a physics professor in Penn's School of Arts & Sciences, and postdoc Eric Baxter are conducting research that could give new insights into the structure of these halos.

The researchers wanted to investigate whether these dark matter halos have an edge or boundary.

"People have generally imagined a pretty smooth transition from the matter bound to the galaxy to the matter between galaxies, which is also gravitationally attracted to the galaxies and clusters," Jain said. "But theoretically, using computer simulations a few years ago, researchers at the University of Chicago showed that for galaxy clusters a sharp boundary is expected, providing a distinct transition that we should be able to see through a careful analysis of the data."

Using a galaxy survey called the Sloan Digital Sky Survey, or SDSS, Baxter and Jain looked at the distribution of galaxies around clusters. They formed a team of experts at the University of Chicago and other institutions around the world to examine thousands of galaxy clusters. Using statistical tools to do a joint analysis of several million galaxies around them, they found a drop at the edge of the cluster. Baxter and collaborator Chihway Chang at the University of Chicago led a paper reporting the findings, accepted for publication in the Astrophysical Journal.

Virtual Earth-sized telescope aims to capture first image of a black hole

April 21, 2017

Illustration of the environment around the supermassive black hole Sagittarius A*, located some 26,000 light years away at the center of the Milky Way.

UChicago-led South Pole Telescope part of international effort to study event horizon

A powerful network of telescopes around the Earth is attempting to create the first image of a black hole, an elusive gravitational sinkhole that Albert Einstein first predicted in 1915.

The UChicago-led South Pole Telescope is part of the Event Horizon Telescope, which combines eight observatories in six locations to create a virtual Earth-sized telescope so powerful it could spot a nickel on the surface of the moon. Scientists spent ten days in April gathering data on Sagittarius A*, a black hole at the center of the Milky Way, as well as a supermassive black hole about 1,500 times heavier at the center of galaxy M87.

Each radio-wave observatory collected so much data that it could not be transmitted electronically. Instead, it was downloaded onto more than 1,000 hard drives and flown to the project's data analysis centers at the MIT Haystack Observatory in Westford, Mass., and the Max Planck Institute for Radio Astronomy in Bonn, Germany.

Over the next year, supercomputers will correlate, combine and interpret the data using very long baseline interferometry, a procedure common in astronomy but never implemented on such an enormous scale. The goal is to produce an image of the event horizon, the boundary of a black hole where luminous gases burn at tens of millions of degrees and from which nothing escapes, not even light.

"It all came together for us: telescopes with higher resolutions, better experiments, more computer power, bright ideas, good weather conditions and so on," said John Carlstrom, the Subramanyan Chandrasekhar Distinguished Service Professor of Astronomy and Astrophysics at UChicago, who leads the South Pole Telescope collaboration. "I'm very confident that we'll come up with not only a good image, but a better understanding of black holes and gravity."

The telescopes in the network employ radio dishes that can detect very short wavelengths, even less than a millimeter -- the shorter the wavelength, the higher the resolution. Water, dust and clouds of gas can block radio waves, so the telescopes in Event Horizon were selected, in part, for being located in deserts, dry plateaus and mountaintops. Nevertheless, a storm or high winds could have ruined data collection.
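The wavelength-resolution tradeoff follows from the diffraction limit, where the smallest resolvable angle scales as the observing wavelength divided by the telescope (or baseline) diameter. The numbers below are back-of-envelope assumptions for illustration, not figures from the Event Horizon Telescope analysis.

```python
import math

# Back-of-envelope diffraction limit: theta ~ lambda / D
# (assumed illustrative numbers, not the EHT's published figures).
wavelength_m = 1.3e-3  # 1.3 mm observing wavelength
baseline_m = 1.27e7    # roughly the diameter of the Earth

theta_rad = wavelength_m / baseline_m

# Convert radians to microarcseconds (1 degree = 3600 arcseconds).
theta_microarcsec = math.degrees(theta_rad) * 3600 * 1e6
print(f"{theta_microarcsec:.0f} microarcseconds")
```

An Earth-sized baseline at 1.3 mm gives a resolution of roughly 20 microarcseconds, which is why combining observatories across the globe, rather than enlarging any single dish, is what makes imaging an event horizon plausible.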

Astronomers have taken aim at black holes before, but the big difference this time comes from incorporating the new Atacama Large Millimeter/submillimeter Array and the South Pole Telescope into the virtual network. Located high in the mountains of Chile, ALMA is the most complex astronomical observatory ever built, using 66 high-precision dish antennas with a total collecting area of more than 71,000 square feet. The South Pole Telescope provides the critical north-south resolving power to pick apart the details of Sagittarius A*.

"ALMA is the key to this experiment," Carlstrom said. "It gives us great sensitivity at the incredibly short wavelength of 1.3 millimeters. But next year we'll repeat this experiment at 0.8 millimeters to get an even higher resolution."

Three UChicago faculty members have been elected to the American Philosophical Society, the oldest learned society in the United States.

They are Lorraine Daston, visiting professor in the John U. Nef Committee on Social Thought; Neil H. Shubin, the Robert R. Bensley Distinguished Service Professor of Organismal Biology and Anatomy; and Michael S. Turner, the Bruce V. and Diana M. Rauner Distinguished Service Professor.

Michael S. Turner is a theoretical cosmologist who helped to pioneer the interdisciplinary field that combines particle astrophysics and cosmology. His research focuses on the earliest moments of creation, and he has made seminal contributions to theories surrounding dark matter, dark energy and inflation. A former chair of UChicago's Department of Astronomy & Astrophysics, Turner currently serves as director of the Kavli Institute for Cosmological Physics.

Turner chaired the National Research Council's Committee on the Physics of the Universe, which published the influential report, "Connecting Quarks with the Cosmos." He previously served as assistant director for mathematical and physical sciences at the National Science Foundation, the chief scientist of Argonne National Laboratory and the president of the American Physical Society.

Turner is a member of the National Academy of Sciences and the American Academy of Arts and Sciences. He has received numerous honors, including the 2010 Dannie Heineman Prize for pioneering cosmological physics research from the American Astronomical Society and the American Institute of Physics, and was selected by the University of Chicago to deliver the 2013 Ryerson Lecture.

World's most sensitive dark matter detector releases first results

May 18, 2017

XENON1T installation in the underground hall of Laboratori Nazionali del Gran Sasso. The three story building on the right houses various auxiliary systems. The cryostat containing the LXeTPC is located inside the large water tank on the left.Photo by Roberto Corrieri and Patrick De Perio

Scientists behind XENON1T, the largest dark matter experiment of its kind ever built, are encouraged by early results, describing them as the best so far in the search for dark matter.

Dark matter is one of the basic constituents of the universe, five times more abundant than ordinary matter. Several astronomical measurements have corroborated the existence of dark matter, leading to an international effort to observe it directly. Scientists are trying to detect dark matter particles interacting with ordinary matter through the use of extremely sensitive detectors. Such interactions are so feeble that they have escaped direct detection to date, forcing scientists to build detectors that are ever more sensitive and have extremely low levels of radioactivity.

On May 18, the XENON Collaboration released results from a first, 30-day run of XENON1T, showing the detector has a record low radioactivity level, many orders of magnitude below surrounding materials on Earth.

"The care that we put into every single detail of the new detector is finally paying back," said Luca Grandi, assistant professor in physics at the University of Chicago and member of the XENON Collaboration. "We have excellent discovery potential in the years to come because of the huge dimension of XENON1T and its incredibly low background. These early results already are allowing us to explore regions never explored before."

The XENON Collaboration consists of 135 researchers from the United States, Germany, Italy, Switzerland, Portugal, France, the Netherlands, Israel, Sweden and the United Arab Emirates, who hope to one day confirm dark matter's existence and shed light on its mysterious properties.

Located deep below a mountain in central Italy, XENON1T features a 3.2-ton xenon dual-phase time projection chamber. This central detector sits fully submerged in the middle of a large water tank, in order to shield it from natural radioactivity in the cavern. A cryostat helps keep the xenon at a temperature of minus-95 degrees Celsius without freezing the surrounding water. The mountain above the laboratory further shields the detector, preventing it from being perturbed by cosmic rays.

But shielding from the outer world is not enough, since all materials on Earth contain tiny traces of natural radioactivity. Thus extreme care was taken to find, select and process the materials making up the detector to achieve the lowest possible radioactive content. This allowed XENON1T to achieve record "silence" necessary to detect the very weak output of dark matter.

A particle interaction in the one-ton central core of the time projection chamber leads to tiny flashes of light. Scientists record and study these flashes to infer the position and the energy of the interacting particle -- and whether it might be dark matter.

Despite the brief 30-day science run, the sensitivity of XENON1T has already surpassed that of any other experiment in the field, probing unexplored dark matter territory.

"For the moment we do not see anything unexpected, so we set new constraints on dark matter properties," Grandi said. "But XENON1T just started its exciting journey and since the end of the 30-day science run, we have been steadily accumulating new data."

UChicago central to international collaboration

Grandi's group is very active within XENON1T, and it is contributing to several aspects of the program. After its initial involvement in the preparation, assembly and early operations of the liquid xenon chamber, the group shifted its focus in the last several months to the development of the computing infrastructure and to data analysis.

"Despite its low background, XENON1T is producing a large amount of data that needs to be continuously processed," said Evan Shockley, a graduate student working with Grandi. "The raw data from the detector are directly transferred from Gran Sasso Laboratory to the University of Chicago, serving as the unique distribution point for the entire collaboration."

The framework, developed in collaboration with a group led by Robert Gardner, senior fellow at the Computation Institute, allows for the processing of data, both on local and remote resources belonging to the Open Science Grid. The involvement of UChicago's Research Computing Center including Director Birali Runesha allows members of the collaboration all around the world to access processed data for high-level analyses.

Grandi's group also has been heavily involved in the analysis that led to this first result. Christopher Tunnell, a fellow at the Kavli Institute for Cosmological Physics, is one of the two XENON1T analysis coordinators and corresponding author of the result. Recently, UChicago hosted about 25 researchers for a month to perform the analyses that led to the first results.

"It has been a large, concentrated effort and seeing XENON1T back on the front line makes me forget the never-ending days spent next to my colleagues to look at plots and distributions," Tunnell said. "There is no better thrill than leading the way in our knowledge of dark matter for the coming years."

Chicago Ideas Week: "Space Exploration: What's After The Final Frontier?"

May 23, 2017

Chicago Ideas Week: "Space Exploration: What's After The Final Frontier?"

chicagoideas.com

Reach for the stars with some of the country's leading astronomers. Human beings have wondered about the universe for centuries, but it is only within the last 70 years that we've begun venturing into space. Should we continue that effort? How are experts working toward the next era of space exploration? From NASA to private enterprises to citizen scientists, find out what humanity's next frontier of space exploration will be.

What Does the Universe Actually Look Like? Humans can only see a small spectrum of wavelengths, but the universe contains much more than we can actually see. Angela Olinto, chair of the Department of Astronomy and Astrophysics at the University of Chicago, is working to bridge that gap.

Angela Olinto is the Homer J. Livingston Distinguished Service Professor and chair of the Department of Astronomy and Astrophysics at the University of Chicago. Olinto received her B.S. from PUC, Rio de Janeiro, and her Ph.D. from MIT. She has made significant contributions to a number of topics in astrophysics and is the PI of the EUSO-SPB mission (Extreme Universe Space Observatory on a Super-Pressure Balloon) and a member of the Pierre Auger Observatory, both designed to discover the origin of the highest energy cosmic rays.

Astrophysics and Unlocking the Universe When it comes to scientific discovery of how the universe works, what we know is just as important as what we thought we knew. Rocky Kolb and Hakeem Oluseyi sit down to discuss the most compelling research in quantum physics going on today.

Edward W. Kolb (known to most as Rocky) is the Arthur Holly Compton Distinguished Service Professor of Astronomy & Astrophysics and Dean of the Physical Sciences at the University of Chicago, as well as a member of the Enrico Fermi Institute and the Kavli Institute for Cosmological Physics. In 1983, he was a founding head of the Theoretical Astrophysics Group and in 2004 the founding director of the Particle Astrophysics Center at Fermi National Accelerator Laboratory in Batavia, Illinois.

Kolb is a Fellow of the American Academy of Arts and Sciences and a Fellow of the American Physical Society. He was the recipient of the 2003 Oersted Medal of the American Association of Physics Teachers for notable contributions to the teaching of physics, the 1993 Quantrell Prize for teaching excellence at the University of Chicago and the 2009 Excellence in Teaching Award from the Graham School of the University of Chicago. His book for the general public, "Blind Watchers of the Sky," received the 1996 Emme Award of the American Aeronautical Society.

The field of Rocky's research is the application of elementary-particle physics to the very early Universe. In addition to over 200 scientific papers, he is a co-author of "The Early Universe," the standard textbook on particle physics and cosmology.

LIGO detects colliding black holes for third time

June 1, 2017

Reconstructions of the three confident and one candidate gravitational wave signals that LIGO has detected to date, including the most recent detection (GW170104). The signals are believed to be millions of years long; only the portion that LIGO was sensitive to is shown here: the final seconds leading up to the black hole merger.

UChicago scientists: Results help unveil diversity of black holes in the universe

The Laser Interferometer Gravitational-Wave Observatory has made a third detection of gravitational waves, providing the latest confirmation that a new window in astronomy has opened. As was the case with the first two detections, the waves -- ripples in spacetime -- were generated when two black holes collided to form a larger black hole.

The latest findings by the LIGO observatory, described in a new paper accepted for publication in Physical Review Letters, build upon the landmark 2015 discovery of gravitational waves, which Albert Einstein predicted a century earlier in his theory of general relativity.

"The UChicago LIGO group has played an important role in this latest discovery, including helping to discern what emitted the gravitational waves, testing whether Einstein's theory of general relativity was correct, and exploring whether electromagnetic radiation -- such as visible light, radio or X-rays -- was also emitted by the black hole collision," said Daniel Holz, associate professor in Physics and Astronomy & Astrophysics, and head of UChicago's LIGO group.

The new detection occurred during LIGO's current observing run, which began Nov. 30, 2016, and will continue through the summer. The newfound black hole formed by the merger has a mass about 49 times that of our sun. The discovery fills in a gap between the systems previously detected by LIGO, with masses of 62 and 21 times that of our sun for the first and second detections, respectively.

"We continue to learn more about this population of heavy stellar-mass black holes, with masses over 20 solar masses, that LIGO has discovered," said LIGO collaborator Ben Farr, a McCormick Fellow at UChicago's Enrico Fermi Institute. "LIGO is making the most direct and pristine observations of black holes that have ever been made, and we're taking large strides in our understanding of how and where these black holes are formed."

LIGO made the first direct observation of gravitational waves in September 2015 during its first observing run. The second detection was made in December 2015, and the third detection, called GW170104, was made on Jan. 4, 2017.

In all three cases, each of the twin detectors of LIGO observed gravitational waves from the tremendously energetic mergers of black hole pairs. The collisions produce more power than is radiated by all of the stars in all of the galaxies in the entire observable universe. The recent detection is the farthest one yet, with the black holes located about 3 billion light-years away. The black holes in the first and second detections were located 1.3 billion and 1.4 billion light-years away, respectively.

"It is truly remarkable that, 100 years after the formulation of general relativity, we are now directly observing some of the most interesting predictions of this theory," said LIGO collaborator Robert Wald, the Charles H. Swift Distinguished Service Professor in Physics at UChicago. "LIGO has opened an entirely new window on our ability to observe phenomena involving strong gravitational fields, and we can look forward to its providing us with many further observations of great astrophysical and cosmological significance in the coming years."

'Looks like Einstein was right'

The LIGO Scientific Collaboration is an international collaboration whose observations are carried out by twin detectors -- one in Hanford, Wash., and the other in Livingston, La. -- operated by the California Institute of Technology and the Massachusetts Institute of Technology with funding from the National Science Foundation.

The discoveries from LIGO are once again putting Albert Einstein's theories to the test. For example, the researchers looked for an effect called dispersion, in which light waves in a physical medium travel at different speeds depending on their wavelength -- the same way a prism creates a rainbow.

Einstein's general theory of relativity forbids dispersion from happening in gravitational waves as they propagate from their source to Earth, and LIGO's latest detection is consistent with this prediction.

"It looks like Einstein was right -- even for this new event, which is about two times farther away than our first detection," said Laura Cadonati, associate professor of physics at Georgia Institute of Technology and deputy spokesperson for the LIGO Scientific Collaboration. "We can see no deviation from the predictions of general relativity, and this greater distance helps us to make that statement with more confidence."

The LIGO team working with the Virgo Collaboration is continuing to search the latest LIGO data for signs of space-time ripples from the far reaches of the cosmos. They also are working on technical upgrades for LIGO's next run, scheduled to begin in late 2018, during which the detectors' sensitivity will be improved.

"With the detection of GW170104, we are taking another important step toward gravitational-wave astronomy," Holz said. "We now have three solid detections, and these provide our first hints about the diversity of black hole systems in the universe."

LIGO is funded by the National Science Foundation. More than 1,000 scientists from around the world participate in the effort through the LIGO Scientific Collaboration and Virgo Collaboration.

This is the third black-hole smashup that astronomers have detected since they started keeping watch on the cosmos back in September 2015, with LIGO, the Laser Interferometer Gravitational-Wave Observatory. All of them are more massive than the black holes that astronomers had previously identified as the remnants of dead stars.

...

As for the original stellar identities of these dark dancers, the consensus, said Daniel Holz of the University of Chicago, is that they were probably very massive and primitive stars at least 40 times heavier than the sun.

According to theoretical calculations, stars composed of primordial hydrogen and helium and lacking heavier elements like oxygen and carbon, which astronomers with their knack for nomenclature call "metals," can grow monstrously large. They could collapse directly into black holes when their brief violent lives were over without the benefit of a supernova explosion or other cosmic fireworks.

Dr. Holz said in an email: "It is indeed odd to think that some of the most dramatic stellar collapses do not result in massive stellar explosions outshining galaxies, but instead just involve a star winking out of existence. But that's what the theory says should happen."

Tiny scientists mobilized to study solar eclipse

July 26, 2017

Jason Henning (left), a post-doctoral fellow at the Kavli Institute for Cosmological Physics at the University of Chicago, talks about eclipses with children Tuesday at the Bright Horizons at Lakeview, a Chicago pre-school on Lincoln Avenue. Credit: Neil Steinberg/Sun-Times

Jason Henning is a postdoctoral fellow at the Kavli Institute for Cosmological Physics at the University of Chicago. He's been to the South Pole three times, working on the university's 10-meter telescope there.

On Tuesday morning, he found himself advancing science in a place it doesn't frequently go: sitting on a too-small chair in a basement classroom with the lights dimmed.

"Who's ready for an eclipse?" he asked a group of 4- and 5-year-olds sitting around a table at Bright Horizons at Lakeview, a preschool.

The youngsters didn't exactly squeal "Yes!" in unison, but they at least cast their attention in his general direction. Henning proceeded, using a small model Earth, moon and, as a light source, a lamp with a dinosaur base.

"Does anybody know how you make night and day?" asked Henning. "Does anybody remember?"Learn more >>

A map of dark matter covering about one-thirtieth of the entire sky and spanning several billion light years. Red regions have more dark matter than average; blue regions have less. Courtesy of Chihway Chang, the DES collaboration

Result supports view that dark matter, dark energy make up most of cosmos

Imagine planting a single seed and, with great precision, being able to predict the exact height of the tree that grows from it. Now imagine traveling to the future and snapping photographic proof that you were right.

If you think of the seed as the early universe, and the tree as the universe the way it looks now, you have an idea of what the international Dark Energy Survey collaboration has just done. Scientists unveiled their most accurate measurement of the present large-scale structure of the universe at a meeting Aug. 3 at the University of Chicago-affiliated Fermi National Accelerator Laboratory. UChicago, Argonne and Fermilab scientists are members of the international Dark Energy Survey collaboration.

These measurements of the amount and "clumpiness" (or distribution) of dark matter in the present-day cosmos were made with a precision that, for the first time, rivals that of inferences from the early universe by the European Space Agency's orbiting Planck observatory. The new Dark Energy Survey result (the tree, in the above metaphor) is close to "forecasts" made from the Planck measurements of the distant past (the seed), allowing scientists to understand more about the ways the universe has evolved over 14 billion years.

"This result is beyond exciting," said Fermilab's Scott Dodelson, a professor in the Department of Astronomy and Astrophysics at UChicago and one of the lead scientists on this result, which was announced at the American Physical Society Division of Particles and Fields meeting. "For the first time, we're able to see the current structure of the universe with the same clarity that we can see its infancy, and we can follow the threads from one to the other, confirming many predictions along the way."

Most notably, this result supports the theory that 26 percent of the universe is in the form of mysterious dark matter and that space is filled with an also-unseen dark energy, which makes up 70 percent and is causing the accelerating expansion of the universe.

Paradoxically, it is easier to measure the large-scale clumpiness of the universe in the distant past than it is to measure it today. In the first 400,000 years following the Big Bang, the universe was filled with a glowing gas, the light from which survives to this day. The Planck observatory's map of this cosmic microwave background radiation gives us a snapshot of the universe at that very early time. Since then, the gravity of dark matter has pulled mass together and made the universe clumpier over time. But dark energy has been fighting back, pushing matter apart. Using the Planck map as a start, cosmologists can calculate precisely how this battle plays out over 14 billion years.

"These first major cosmology results are a tribute to the many people who have worked on the project since it began 14 years ago," said Dark Energy Survey Director Josh Frieman, a scientist at Fermilab and a professor in the Department of Astronomy and Astrophysics at UChicago. "It was an exciting moment when we unveiled the results to ourselves just last month, after carrying out a 'blind' analysis to avoid being influenced by our prejudices."

The Dark Energy Survey is a collaboration of more than 400 scientists from 26 institutions in seven countries. Its primary instrument is the 570-megapixel Dark Energy Camera, one of the most powerful in existence, which is able to capture digital images of light from galaxies eight billion light years from Earth. The camera was built and tested at Fermilab, the lead laboratory on the Dark Energy Survey, and is mounted on the National Science Foundation's four-meter Blanco telescope, part of the Cerro Tololo Inter-American Observatory in Chile. The DES data are processed at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign.

Scientists are using the camera to map an eighth of the sky in unprecedented detail over five years. The fifth year of observation will begin this month. The new results draw only from data collected during the survey's first year, which covers one-thirtieth of the sky.

Scientists used two methods to measure dark matter. First, they created maps of galaxy positions as tracers, and second, they precisely measured the shapes of 26 million galaxies to directly map the patterns of dark matter over billions of light years, using a technique called gravitational lensing.

To make these ultra-precise measurements, the team developed new ways to detect the tiny lensing distortions of galaxy images - an effect not even visible to the eye, enabling revolutionary advances in understanding these cosmic signals. In the process, they created the largest guide to spotting dark matter in the cosmos ever drawn. The new dark matter map is ten times the size of the one that the Dark Energy Survey released in 2015 and will eventually be three times larger than it is now.

"The Dark Energy Survey has already delivered some remarkable discoveries and measurements, and they have barely scratched the surface of their data," said Fermilab Director Nigel Lockyer. "Today's world-leading results point forward to the great strides DES will make toward understanding dark energy in the coming years."Learn more >>


UChicago physicists play leading role in confirming theory predicted four decades ago

In 1974, a Fermilab physicist predicted a new way for ghostly particles called neutrinos to interact with matter. More than four decades later, a UChicago-led team of physicists built the world's smallest neutrino detector to observe the elusive interaction for the first time.

Neutrinos are a challenge to study because their interactions with matter are so rare. Particularly elusive has been what's known as coherent elastic neutrino-nucleus scattering, which occurs when a neutrino bumps off the nucleus of an atom.

The international COHERENT Collaboration, which includes physicists at UChicago, detected the scattering process by using a detector that's small and lightweight enough for a researcher to carry. Their findings, which confirm the theory of Fermilab's Daniel Freedman, were reported Aug. 3 in the journal Science.

"Why did it take 43 years to observe this interaction?" asked co-author Juan Collar, UChicago professor in physics. "What takes place is very subtle." Freedman did not see much of a chance for experimental confirmation, writing at the time: "Our suggestion may be an act of hubris, because the inevitable constraints of interaction rate, resolution and background pose grave experimental difficulties."

When a neutrino bumps into the nucleus of an atom, it creates a tiny, barely measurable recoil. Making a detector out of heavy elements such as iodine, cesium or xenon dramatically increases the probability for this new mode of neutrino interaction, compared to other processes. But there's a trade-off, since the tiny nuclear recoils that result become more difficult to detect as the nucleus grows heavier.

"Imagine your neutrinos are ping-pong balls striking a bowling ball. They are going to impart only a tiny extra momentum to this bowling ball," Collar said.

To detect that bit of tiny recoil, Collar and colleagues figured out that a cesium iodide crystal doped with sodium was the perfect material. The discovery led the scientists to jettison the heavy, gigantic detectors common in neutrino research for one similar in size to a toaster.

No gigantic lab

The 4-inch-by-13-inch detector used to produce the Science results weighs only 32 pounds (14.5 kilograms). In comparison, the world's most famous neutrino observatories are equipped with thousands of tons of detector material.

"You don't have to build a gigantic laboratory around it," said UChicago doctoral student Bjorn Scholz, whose thesis will contain the result reported in the Science paper. "We can now think about building other small detectors that can then be used, for example to monitor the neutrino flux in nuclear power plants. You just put a nice little detector on the outside, and you can measure it in situ."

Neutrino physicists, meanwhile, are interested in using the technology to better understand the properties of the mysterious particle.

"Neutrinos are one of the most mysterious particles," Collar said. "We ignore many things about them. We know they have mass, but we don't know exactly how much."

Through measuring coherent elastic neutrino-nucleus scattering, physicists hope to answer such questions. The COHERENT Collaboration's Science paper, for example, imposes limits on new types of neutrino-quark interactions that have been proposed.

The results also have implications in the search for Weakly Interacting Massive Particles. WIMPs are candidate particles for dark matter, which is invisible material of unknown composition that accounts for 85 percent of the mass of the universe.

"What we have observed with neutrinos is the same process expected to be at play in all the WIMP detectors we have been building," Collar said.

Neutrino alley

The COHERENT Collaboration, which involves 90 scientists at 18 institutions, has been conducting its search for coherent neutrino scattering at the Spallation Neutron Source at Oak Ridge National Laboratory in Tennessee. The researchers installed their detectors in a basement corridor that became known as "neutrino alley." This corridor is heavily shielded by iron and concrete from the highly radioactive neutron beam target area, only 20 meters (less than 25 yards) away.

This neutrino alley solved a major problem for neutrino detection: It screens out almost all neutrons generated by the Spallation Neutron Source, but neutrinos can still reach the detectors. This allows researchers to more clearly see neutrino interactions in their data. Elsewhere they would be easily drowned out by the more prominent neutron detections.

The Spallation Neutron Source generates the most intense pulsed neutron beams in the world for scientific research and industrial development. In the process of generating neutrons, the SNS also produces neutrinos, though in smaller quantities.

"You could use a more sophisticated type of neutrino detector, but not the right kind of neutrino source, and you wouldn't see this process," Collar said. "It was the marriage of ideal source and ideal detector that made the experiment work."

Two of Collar's former graduate students are co-authors of the Science paper: Phillip Barbeau, AB'01, SB'01, PhD'09, now an assistant professor of physics at Duke University; and Nicole Fields, PhD'15, now a health physicist with the U.S. Nuclear Regulatory Commission in Chicago.

The development of a compact neutrino detector brings to fruition an idea that UChicago alumnus Leo Stodolsky, SM'58, PhD'64, proposed in 1984. Stodolsky and Andrzej Drukier, both of the Max Planck Institute for Physics and Astrophysics in Germany, noted that a coherent detector would be relatively small and compact, unlike the more common neutrino detectors containing thousands of gallons of water or liquid scintillator. In their work, they predicted the arrival of future neutrino technologies made possible by the miniaturization of the detectors.

Scholz, the UChicago graduate student, saluted the scientists who have worked for decades to create the technology that culminated in the detection of coherent neutrino scattering.

"I cannot fathom how they must feel now that it's finally been detected, and they've achieved one of their life goals," Scholz said. "I've come in at the end of the race. We definitely have to give credit to all the tremendous work that people have done before us."Learn more >>

Ever-Elusive Neutrinos Spotted Bouncing Off Nuclei for the First Time

August 4, 2017

SNS's Beamline 13, which carries neutrons from the SNS accelerator to experimental stations. The same process that produces the neutrons also spits out neutrinos, which enter the COHERENT detector in the SNS basement. Credit: Jean Lachat/University of Chicago

A new technology for detecting neutrinos represents a "monumental" advance for science.

Juan Collar, a professor in physics at the University of Chicago, with a prototype of the world's smallest neutrino detector used to observe for the first time an elusive interaction known as coherent elastic neutrino-nucleus scattering.

Neutrinos are famously antisocial. Of all the characters in the particle physics cast, they are the most reluctant to interact with other particles. Among the hundred trillion neutrinos that pass through you every second, only about one per week actually grazes a particle in your body.

That rarity has made life miserable for physicists, who resort to building huge underground detector tanks for a chance at catching the odd neutrino. But in a study published today in Science, researchers working at Oak Ridge National Laboratory (ORNL) detected never-before-seen neutrino interactions using a detector the size of a fire extinguisher. Their feat paves the way for new supernova research, dark matter searches and even nuclear nonproliferation monitoring.

Under previous approaches, a neutrino reveals itself by stumbling across a proton or neutron amidst the vast emptiness surrounding atomic nuclei, producing a flash of light or a single-atom chemical change. But neutrinos deign to communicate with other particles only via the "weak" force -- the fundamental force that causes radioactive materials to decay. Because the weak force operates only at subatomic distances, the odds of a tiny neutrino bouncing off of an individual neutron or proton are minuscule. Physicists must compensate by offering thousands of tons of atoms for passing neutrinos to strike.

The new experimental collaboration, known as COHERENT, instead looks for a phenomenon called CEvNS (pronounced "sevens"), or coherent elastic neutrino-nucleus scattering. CEvNS relies on the quantum mechanical equivalence between particles and waves, comparable to ocean waves. The high-energy neutrinos sought by most experiments are like short, choppy ocean waves. When such narrow waves pass under floating debris, they can pick out one leaf or twig at a time to toss around. Similarly, a high-energy neutrino typically picks out individual protons and neutrons with which to interact. But just as a long, slow wave would pick up a whole patch of debris at once, a low-energy neutrino sees the entire atomic nucleus as one "coherent" whole. This dramatically improves the odds of an interaction: as the number of neutrons in the nucleus increases, the effective target the neutrino sees grows not in lockstep with that number, but with its square.
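That quadratic scaling is the whole trick. If the cross section grows roughly as N², while scattering off each of N neutrons independently would only grow as N, the coherent gain is about a factor of N. A minimal sketch, using the neutron numbers of the detector's cesium and iodine nuclei purely as round illustrative inputs:

```python
# Coherent enhancement: the CEvNS cross section scales roughly as N^2
# (N = neutron number), whereas incoherent scattering off individual
# neutrons scales as N. The coherent gain is therefore about a factor N.

def coherent_gain(n_neutrons):
    """Ratio of coherent (N^2) to incoherent (N) scattering rates."""
    return n_neutrons**2 / n_neutrons  # simplifies to N

# Approximate neutron numbers for a cesium iodide target (illustrative):
for name, n in [("Cs-133", 78), ("I-127", 74)]:
    print(f"{name}: coherent rate ~{coherent_gain(n):.0f}x incoherent")
```

A gain of order 75 for each nucleus is what lets a toaster-sized crystal stand in for thousands of tons of conventional detector material.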

The fate of the universe just became a little less certain. That's due to a disagreement between a map of the early universe and a new map of today's universe. If the mismatch stands the test of future measurements, we might have to rewrite physics. But that is a pretty big if.

The new results, which are part of the ongoing Dark Energy Survey (DES), charted the distribution of matter across 26 million galaxies in a large swathe of the southern sky.

"This is one of the most powerful pictures of the universe today that we've ever had," says Daniel Scolnic at the University of Chicago, who is a part of the 400-person DES collaboration but wasn't involved in this work.

It is so powerful because knowing the distribution, or clumpiness, of galaxies helps us better understand the cosmic game of tug of war as dark energy - a mysterious force that causes the expansion of the universe to accelerate - pushes the galaxies apart, and dark matter - a theoretical but still unseen form of matter - pulls them together.

Eclipses have fascinated people since the earliest days of recorded history.

These rare astronomical events have helped explain the world around us -- from ancient Mesopotamia, where they were believed to foretell the deaths of kings, all the way to the 20th century, when they helped prove Einstein's theory of general relativity.

Such interest hasn't dimmed. People across the United States will have an opportunity on Aug. 21 to witness the first total solar eclipse from coast to coast in 99 years. UChicago faculty and students are among the hordes of enthusiasts traveling across the country toward the area of "totality," the 70-mile-wide stripe stretching from Oregon to South Carolina in which the moon will fully block the sun.

Ahead of this historic event, UChicago News asked scholars in fields ranging from theoretical cosmology to Islamic studies to discuss eclipses and their power.

"Astronomers have learned a lot from eclipses, including one in 1919 that proved Einstein was right.

At the time, only a handful of people were aware of general relativity; Sir Arthur Eddington was one of them. He led an eclipse expedition into the Atlantic to find out whether gravity would bend starlight, as predicted by general relativity. What you want to do is look at stars very close to the sun, and see whether the light coming toward us is bent by the sun's gravity. With the moon blocking the sun, you can get that measurement, and it was exactly what Einstein predicted. The scientific community was agog. It instantly put general relativity on the map, and made Einstein a rockstar.

We're still learning things from eclipses. One thing people will study during this event is the corona of the sun, which is the glowing aura of gases that surrounds the sun. There are still things we don't understand about it -- such as exactly why it actually burns hundreds of times hotter than the surface of the sun itself.

A few years from now, NASA will launch a probe named after UChicago's own Eugene Parker that will explore the sun's corona -- closer than any probe has ever come to the sun."

Newly discovered "standard sirens" provide an independent, clean way to measure how fast the universe is expanding.

To many cosmologists, the best thing about neutron-star mergers is that these events scream into space an otherwise close-kept secret of the universe. Scientists combined the gravitational and electromagnetic signals from the recently detected collision of two of these stars to determine, in a cleaner way than with other approaches, how fast the fabric of the universe is expanding -- a much-contested number called the Hubble constant.

In the days since the neutron-star collision was announced, Hubble experts have been surprised to find themselves discussing not whether events like it could settle the controversy, but how soon they might do so.

Scientists have hotly debated the cosmic expansion rate ever since 1929, when the American astronomer Edwin Hubble first established that the universe is expanding -- and that it therefore had a beginning. How fast it expands reflects what's in it (since matter, dark energy and radiation push and pull in different ways) and how old it is, making the value of the Hubble constant crucial for understanding the rest of cosmology.

And yet the two most precise ways of measuring it result in different answers, with a curious 8 percent discrepancy that "is currently the biggest tension in cosmology," said Dan Scolnic of the University of Chicago's Kavli Institute for Cosmological Physics. The mismatch could be a clue that cosmologists aren't taking into account important details that have affected the universe's evolution. But to see if that's the case, they need an independent check on the measurements.

Neutron-star collisions -- newly detectable by the Laser Interferometer Gravitational-Wave Observatory (LIGO) and Virgo detectors -- seem to be just the thing.

"This first [collision] gives us a seat at the cosmology table," Daniel Holz, an astrophysicist with the University of Chicago and LIGO who was centrally involved in the new Hubble measurement, said in an email. "And as we get more, we can expect to play a major role in the field."

In an expanding universe, the farther away an astronomical object is, the faster it recedes. The Hubble constant says how much faster. Edwin Hubble himself estimated that galaxies move away from us 500 kilometers per second faster for each additional megaparsec of distance between us and them (a megaparsec is about 3.3 million light-years). This was a gross overestimate; by the 1970s, astrophysicists favored values for the Hubble constant around either 50 or 100 kilometers per second per megaparsec, depending on their methods. As errors were eliminated, these camps met near the middle. However, in the past year and a half, the Hubble trouble has reheated. This time, 67 stands off against 73.
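Hubble's law is simple enough to write down directly: v = H0 × d. A quick sketch showing how the article's two contested values of H0 (67 versus 73 km/s/Mpc) disagree about a galaxy at an illustrative distance of 100 megaparsecs:

```python
# Hubble's law: recessional velocity v = H0 * d.
# Compare the two contested values of H0 at an illustrative distance.

def recession_velocity(h0_km_s_mpc, d_mpc):
    """Recessional velocity (km/s) at distance d (Mpc) for a given H0."""
    return h0_km_s_mpc * d_mpc

d = 100.0  # Mpc, illustrative
for h0 in (67.0, 73.0):
    print(f"H0 = {h0}: v = {recession_velocity(h0, d):.0f} km/s")
```

The roughly 8 percent gap between the two H0 values translates directly into an 8 percent disagreement in predicted velocity at every distance, which is why the discrepancy cannot be waved away as a local quirk.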

The higher estimate of 73 comes from observing lots of astronomical objects and estimating both distance and velocity for each one. It's relatively easy to see how fast a star or galaxy is receding by looking at its "redshift" -- a reddening in color that happens for the same reason the sound of a receding ambulance's siren drops in pitch. Correct for an object's "peculiar velocity," caused by the gravitational pull of other objects in its neighborhood, and you're left with its recessional velocity due to cosmic expansion.
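For the small redshifts relevant to the nearby rungs of this measurement, the conversion from redshift to velocity is just v ≈ c·z. A one-line sketch (the redshift value is illustrative):

```python
# Low-redshift approximation: recessional velocity v = c * z (valid z << 1).

C_KM_S = 299_792.458  # speed of light in km/s

def velocity_from_redshift(z):
    """Approximate recessional velocity (km/s) for a small redshift z."""
    return C_KM_S * z

v = velocity_from_redshift(0.01)  # illustrative galaxy at z = 0.01
print(f"z = 0.01 -> v ~ {v:.0f} km/s")  # prints: z = 0.01 -> v ~ 2998 km/s
```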

Historically, however, it has proven much, much harder to measure the distance to an object -- the other data point needed to calculate the Hubble constant.

To gauge how far away things are, astronomers build up rungs on a "cosmic distance ladder" in which each rung calibrates more-distant rungs. They start by deducing the distances to stars in the Milky Way using parallax -- the stars' apparent motion across the sky over the course of the year. With this information, astronomers can deduce the intrinsic brightness of Cepheid stars, whose luminosity is tied to their pulsation rate, making them "standard candles." They then spot these Cepheid stars in nearby galaxies and use them to calculate how far away the galaxies must be. Next, the Cepheids are used to calibrate the distances to Type Ia supernovas -- even brighter (though rarer) standard candles that can be seen in faraway galaxies.
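The arithmetic behind every standard-candle rung is the distance modulus: m − M = 5·log10(d / 10 pc), so d = 10^((m − M + 5)/5) parsecs. A sketch using a commonly quoted peak absolute magnitude for Type Ia supernovae, M ≈ −19.3, and an invented apparent magnitude; both numbers are illustrative, not a real calibration:

```python
# Standard-candle distance from the distance modulus:
#   m - M = 5*log10(d / 10 pc)   =>   d = 10**((m - M + 5) / 5) parsecs.
# M ~ -19.3 is a commonly quoted peak absolute magnitude for Type Ia
# supernovae; the apparent magnitude here is illustrative.

def candle_distance_pc(m_apparent, m_absolute):
    """Distance in parsecs from apparent and absolute magnitudes."""
    return 10 ** ((m_apparent - m_absolute + 5.0) / 5.0)

d_pc = candle_distance_pc(19.3, -19.3)
print(f"distance ~ {d_pc / 1e6:.0f} Mpc")  # dimmer (larger m) means farther
```

The weakness of the ladder is visible right in the formula: any systematic error in the assumed M propagates into every distance built on top of it.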

Each jump from one rung to the next risks miscalculation. And yet, in 2016, a team known as SH0ES used the cosmic distance ladder approach to peg the Hubble constant at 73.2 with an accuracy of 2.4 percent.

However, in a paper published the same year, a team used the Planck telescope's observations of the early universe to obtain a value of 67.8 for the current expansion rate -- supposedly with 1 percent accuracy.

The Planck team started from the faint drizzle of ancient light called the cosmic microwave background (CMB), which reveals the universe as it looked at a critical moment 380,000 years after the Big Bang. The CMB snapshot depicts a simple, nearly smooth, plasma-filled young universe. Pressure waves of all different wavelengths rippled through the plasma, squeezing and stretching it and creating subtle density variations on different length scales.

At the moment recorded in the CMB, pressure waves with particular wavelengths would have undergone just the right fraction of an undulation since the Big Bang to all reach zero amplitude, momentarily disappearing and creating smooth plasma densities at their associated length scale. Meanwhile, pressure waves with other wavelengths undulated just the right amount to exactly peak in amplitude at the critical moment, stretching and squeezing the plasma to the full extent possible and creating maximum density variations at their associated scales.

These peaks and troughs in density variations at different scales, which can be picked up by telescopes like Planck and plotted as the "CMB power spectrum," encode virtually everything about the young universe. The Hubble constant, in particular, can be reconstructed by measuring the distances between the peaks. "It's a geometric effect," explained Leo Stein, a theoretical physicist at the California Institute of Technology: The more the universe has expanded, the more the light from the CMB has curved through expanding space-time, and the closer together the peaks ought to appear to us.

Other properties of nature also affect how the peaks end up looking, such as the behavior of the invisible "dark energy" that infuses the fabric of the cosmos. The Planck scientists therefore had to make assumptions about all the other cosmological parameters in order to arrive at their estimate of 67 for the Hubble constant.

The similarity of the two Hubble measurements "is amazing" considering the vastly different approaches used to determine them, said Wendy Freedman, an astrophysicist at the University of Chicago and a pioneer of the cosmic distance ladder approach. And yet their margins of error don't overlap. "The universe looks like it's expanding about eight percent faster than you would have expected based on how it looked in its youth and how we expect it to evolve," Adam Riess of Johns Hopkins University, who led the SH0ES team, told Scientific American last year. "We have to take this pretty darn seriously."

The 67-versus-73 discrepancy could come down to an unknown error on one side or both. Or it might be real and significant -- an indication that the Planck team's extrapolation from the early universe to the present is missing a cosmic ingredient, one that changed the course of history and led to a faster expansion rate than otherwise expected. If a hypothesized fourth type of neutrino populated the infant universe, for instance, this would have increased the radiation pressure and affected the CMB peak widths. Or dark energy, whose repulsive pressure accelerates the universe's expansion, might be getting denser over time.

Suddenly, neutron-star collisions have materialized to cast the deciding vote.

The crashing stars serve as "standard sirens," as Holz and Scott Hughes of the Massachusetts Institute of Technology dubbed them in a 2005 paper, building on the work of Bernard Schutz 20 years earlier. They send rushes of ripples outward through space-time that are not dimmed by gas or dust. Because of this, the gravitational waves transmit a clean record of the strength of the collision, which allows scientists to "directly infer the distance to the source," Holz explained. "There is no distance ladder, and no poorly understood astronomical calibrations. You listen to how loud the [collision] is, and how the sound changes with time, and you directly infer how far away it is." Because astronomers can also detect electromagnetic light from neutron-star collisions, they can use redshift to determine how fast the merged stars are receding. Recessional velocity divided by distance gives the Hubble constant.
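The final division is as simple as it sounds: H0 = recessional velocity / siren distance. A sketch using round numbers of the same order as the first detected neutron-star merger (roughly 3,000 km/s and some tens of megaparsecs); these are illustrative inputs, not the collaboration's published analysis, which also folds in the large orientation uncertainty:

```python
# Standard-siren Hubble constant: H0 = recessional velocity / distance.
# Illustrative round numbers of the same order as the first neutron-star
# merger; not the published analysis.

def hubble_constant(v_km_s, d_mpc):
    """H0 in km/s/Mpc from a velocity (km/s) and a siren distance (Mpc)."""
    return v_km_s / d_mpc

h0 = hubble_constant(3000.0, 43.0)
print(f"H0 ~ {h0:.0f} km/s/Mpc")  # prints: H0 ~ 70 km/s/Mpc
```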

From the first neutron-star collision alone, Holz and hundreds of coauthors calculated the Hubble constant to be 70 kilometers per second per megaparsec, give or take 10. (The major source of uncertainty is the unknown angular orientation of the merging neutron stars relative to the LIGO detectors, which affects the measured amplitude of the signal.) Holz said, "I think it's just pure luck that we're smack in the middle," between the cosmic-distance-ladder and cosmic-microwave-background Hubble estimates. "We could easily shift to one side or the other."

The measurement's accuracy will steadily improve as more standard sirens are heard over the next few years, especially as LIGO continues to ramp up in sensitivity. According to Holz, "With roughly 10 more events like this one, we'll get to 1 percent [of error]," though he stresses that this is a preliminary and debatable estimate. Riess thinks it will take more like 30 standard sirens to reach that level. It all depends on how lucky LIGO and Virgo got with their first detection. "I do think the method has the potential to be a game changer," said Freedman. "How fast this will occur [or] what the rate of these objects will be ... we don't yet know."

Scolnic, who was part of SH0ES, said his team's tension with Planck's measurement is so large that "the standard siren approach doesn't need to get to 1 percent to be interesting."

As more standard sirens resound, they'll gradually home in on the Hubble constant once and for all and determine whether or not the expansion rate agrees with expectations based on the young universe. Holz, for one, is exhilarated. "I've dedicated the last decade of my life in the hopes of making one plot: a standard siren measurement of the Hubble. I got to make my Hubble plot, and it is beautiful."