
Princeton University and Lund University researchers project that the recently launched European satellite Gaia could discover tens of thousands of planets during its five-year mission. In this image, the colored portions indicate the number of observations Gaia would make of a particular part of the sky during its mission; the scale at the bottom runs from zero (purple) to 200 (red). The total number of observations of any part of the sky ranges from about 60 at low ecliptic latitudes to about 80 at high ecliptic latitudes, with a maximum of about 150-200 at intermediate latitudes. From these many repeated observations of each star, the highly accurate Gaia measurements will reveal the tiny stellar motion, or “wobble,” that results from any orbiting planet. (Image by Lennart Lindegren, Lund University)

A recently launched European satellite could reveal tens of thousands of new planets within the next few years, and provide scientists with a far better understanding of the number, variety and distribution of planets in our galaxy, according to research published today.

Researchers from Princeton University and Lund University in Sweden calculated that the observational satellite Gaia could detect as many as 21,000 exoplanets, or planets outside of Earth’s solar system, during its five-year mission. If the mission is extended to 10 years, Gaia could detect as many as 70,000 exoplanets, the researchers report. The assessment has been accepted for publication in The Astrophysical Journal and was posted Nov. 6 on arXiv, a preprint server operated by Cornell University.

Exoplanets will be an important “by-product” of Gaia’s mission, explained first author Michael Perryman, who made the assessment while serving as Princeton’s Bohdan Paczyński Visiting Fellow in the Department of Astrophysical Sciences. Built and operated by the European Space Agency (ESA) and launched in December 2013, Gaia will capture the motion, physical characteristics and distance from Earth — and one another — of roughly 1 billion objects, mostly stars, in the Milky Way galaxy with unprecedented precision. The presence of an exoplanet will be determined by how its star “wobbles” as a result of the planet’s orbit around it.
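As a rough guide (this is the standard astrometric relation, not a number quoted from the paper), the angular size of that wobble, known as the astrometric signature, is

$$\alpha \;\approx\; \frac{M_p}{M_*}\,\frac{a}{d}\ \text{arcseconds},$$

where $M_p$ and $M_*$ are the planet and star masses in the same units, $a$ is the orbital semi-major axis in astronomical units, and $d$ is the distance in parsecs. A Jupiter-mass planet ($M_p/M_* \approx 10^{-3}$) orbiting a Sun-like star at $a = 5$ AU, viewed from 100 parsecs, produces a wobble of only about 50 microarcseconds, which is why astrometry of Gaia’s precision is needed to detect it.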

More important than the numbers of predicted discoveries are the kinds of planets that the researchers expect Gaia to detect, many of which — such as planets with multi-year orbits that pass directly, or transit, in front of their star as seen from Earth — are currently difficult to find, Perryman said. The satellite’s instruments could reveal objects that are considered rare in the Milky Way, such as an estimated 25 to 50 Jupiter-sized planets that orbit faint, low-mass stars known as red dwarfs. Unique planets and systems — such as planets that orbit in the opposite direction of their companions — can inspire years of research, he said.

One of the main objectives of the Gaia mission is to establish the currently uncertain distance from Earth to various stars using high-precision triangulation, which would allow a much better understanding of the properties of the stars and the planets orbiting them. Of the 1,163 confirmed transiting planets, which pass directly in front of their stars as seen from Earth, there are 644 distinct host stars; fewer than 200 have accurately known distances from Earth. This image shows the distances from Earth (center) to the stars (black dots) of transiting exoplanets. The inner dashed circle has a radius of 100 parsecs (about 326 light years) with the middle and outer circles corresponding to 500 parsecs (1,630 light years) and 1,000 parsecs (3,260 light years), respectively. The cluster of points to the lower right represents the transiting planets discovered by NASA’s Kepler satellite. For each star, the straight lines extending from the circle indicate the current uncertainty of its distance from Earth. (Image by Michael Perryman)
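The “triangulation” Gaia performs is trigonometric parallax: a star’s apparent position shifts slightly as the satellite orbits the Sun, and the distance follows directly from the size of that shift. In the standard convention (a textbook relation, not notation from the paper),

$$d\,[\text{parsecs}] = \frac{1}{p\,[\text{arcseconds}]},$$

so a measured parallax of 10 milliarcseconds corresponds to a distance of 100 parsecs. Because the relative distance error grows as $\sigma_d/d \approx \sigma_p \times d$ (with $p$ in arcseconds and $d$ in parsecs), a fixed parallax precision yields progressively less certain distances for more remote stars, which is why the distant hosts in the figure carry the largest uncertainties.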

“It’s not just about the numbers. Each of these planets will be conveying some very specific details, and many will be highly interesting in their own way,” Perryman said. “If you look at the planets that have been discovered until now, they occupy very specific regions of discovery space. Gaia will not only discover a whole list of planets, but in an area that has not been thoroughly explored so far.”

Ultimately, a comprehensive census allows scientists to more accurately determine how many planets and planetary systems exist, the detailed properties of those planets, and how they are positioned throughout the galaxy, Perryman said.

Perryman worked with Joel Hartman, an associate research scholar in Princeton’s astrophysical sciences department, Gáspár Bakos, an associate professor of astrophysical sciences, and Lennart Lindegren, a professor of astronomy at Lund University. Gaia is based on a satellite proposal led by Lindegren and Perryman that was submitted to the ESA in 1993.

Research on exoplanets has increased dramatically in the 15 years since Gaia was accepted by the ESA in 2000. The new estimate is based on a highly detailed model of how stars and planets are positioned in the Milky Way; more accurate details of Gaia’s measurement and data-analysis capabilities; and current estimates of exoplanet distributions, particularly those derived from NASA’s Kepler satellite, which has identified nearly 1,000 confirmed planets and more than 3,000 candidates. Crucial to conducting the assessment is the much-improved knowledge that now exists about distant planets, Perryman said, such as the types of stars that exoplanets orbit.

The first exoplanet orbiting a sun-like star was detected in 1995. Nearly 1,900 exoplanets have since been discovered. Bakos, who focuses much of his research on exoplanets, launched and oversees HATNet (Hungarian-made Automated Telescope Network) and HATSouth, planet-hunting networks of fully automated, small-scale telescopes installed on four continents that scan the sky every night for planets as they transit in front of their parent stars. The projects have discovered more than 50 planets since 1999.

“Our assessment will help prepare exoplanet researchers for what to expect from Gaia,” Perryman said. “We’re going to be adding potentially 20,000 new planets in a completely new area of discovery space. It’s anyone’s guess how the field will develop as a result.”

This work was partly supported by the National Science Foundation (grant no. 1108686) and NASA (grant no. NNX13AJ15G).

A study by Princeton researchers and European colleagues found that the positive effect that mortality can have on populations depends on the size and developmental stage of the creatures that die. The finding could aid the management of wildlife and fish such as Atlantic cod. (Image source: NOAA)

By Morgan Kelly, Office of Communications

In nature, the right amount of death at the right time might actually help boost a species’ population density, according to new research that could help in understanding animal populations, pest control and managing fish and wildlife stocks.

In a paper in the journal Trends in Ecology and Evolution, a Princeton University researcher and European colleagues conclude that the kind of positive population-level effect a species experiences from the loss of individuals, or mortality, depends on the size and developmental stage of the creatures that die.

If many juveniles perish, more adults are freed up to reproduce, but if more adults die, the number of juveniles that mature will increase because density dependence is relaxed, explained co-author Anieke van Leeuwen, a postdoctoral researcher in Princeton’s Department of Ecology and Evolutionary Biology. Van Leeuwen worked with first author Arne Schröder, a postdoctoral research fellow at the Leibniz-Institute of Freshwater Ecology and Inland Fisheries in Berlin, and Tom Cameron, a lecturer in aquatic community ecology at the University of Essex in the United Kingdom.

This dynamic, wherein the loss of individuals in one developmental stage translates into a more robust population in another stage, can be important to managing wildlife, pests or resource stocks, van Leeuwen said. For instance, targeting the adults of an invasive insect could have the counterproductive effect of making more food available to growing larvae, she said.
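The flavor of this effect can be reproduced with a minimal two-stage model. The sketch below is an illustrative toy with arbitrary parameters, not a model drawn from the paper: juveniles are produced by adults whose reproduction declines with crowding, and imposing extra mortality on adults raises the equilibrium number of juveniles even as the adults decline.

```python
# Toy two-stage model of stage-specific overcompensation.
# Arbitrary illustrative parameters -- not taken from the study under discussion.
import math

def equilibrium(mu_A, b=10.0, c=0.1, s_J=0.5, s_A=0.8, steps=2000):
    """Iterate to equilibrium under extra adult mortality mu_A.

    J' = b * A * exp(-c * A)             -- reproduction falls as adults crowd
    A' = s_J * J + s_A * (1 - mu_A) * A  -- maturation plus adult survival
    """
    J, A = 1.0, 1.0
    for _ in range(steps):
        J, A = b * A * math.exp(-c * A), s_J * J + s_A * (1.0 - mu_A) * A
    return J, A

for mu in (0.0, 0.25, 0.5):
    J, A = equilibrium(mu)
    print(f"extra adult mortality {mu:.2f}: juveniles ~{J:.1f}, adults ~{A:.1f}")
```

With these parameters, killing off up to half the adults each time step roughly doubles the standing number of juveniles: exactly the kind of counterintuitive response described above for an invasive insect.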

“It doesn’t matter which developmental stage you target, if you impose mortality on one you will get overcompensation on the opposite end of the size range,” van Leeuwen said. “This effect can be especially advantageous in situations where we want to manage resources we want to harvest. Knowing that there are potential effects that result in an increase in that segment of the population we want to encourage is highly relevant.”

At a certain point, of course, mortality becomes too high and the species as a whole declines, the researchers report.

The researchers compared existing theoretical and experimental work on the effect of mortality on population density to resolve various inconsistencies between the two. Existing mathematical models have predicted this phenomenon, and laboratory and field studies have shown that the effect holds true for a variety of animal species.

Many ecological theories and models, however, have ignored differences in body size and development, and predicted that a modest amount of mortality would result in an increase in the total number of individuals, the researchers wrote. On the other hand, experiments have predominantly shown, along with certain models, that mortality has a positive effect within certain life stages or size classes. The researchers concluded that the overlap of experimental and theoretical data indicates that the benefit of mortality is likely divided by developmental stage. In addition, the number of species in which the phenomenon has been observed suggests that it is commonplace in the natural world.

This work was supported by the Journal of Experimental Biology; the Swedish Research Council and the Leibniz-Institute of Freshwater Ecology and Inland Fisheries; the University of Leeds, the National Environment Research Council (grant no. NE/C510467/1) and the European Commission Intra-European Fellowship (FANTISIZE, #275873); and the National Science Foundation (grant no. 1115838).

Researchers, including physicists at the U.S. Department of Energy’s Princeton Plasma Physics Laboratory (PPPL), previously developed a method for reducing turbulence in bumpy donut-shaped experimental fusion facilities called stellarators. This week in Physical Review Letters, these authors describe an advanced application of the method that could help physicists overcome a major barrier to the production of fusion energy in such devices, and could also apply to their more widely used symmetrical donut-shaped cousins called tokamaks. This work was supported by the DOE Office of Science.

Turbulence allows the hot, charged plasma gas that fuels fusion reactions to escape from the magnetic fields that confine the gas in stellarators and tokamaks. This turbulent transport occurs at comparable levels in both devices, and has long been recognized as a challenge for both in producing fusion power economically.

“Confinement bears directly on the cost of fusion energy,” said physicist Harry Mynick, a PPPL coauthor of the paper, “and we’re finding how to reshape the plasma to enhance confinement.”

The new method uses two types of advanced computer codes that have only recently become available. The authors modified these codes to address turbulent transport, evolving the starting design of a fusion device into one with reduced levels of turbulence. The current paper applies the new method to the Wendelstein 7-X stellarator, soon to be the world’s largest when construction is completed in Greifswald, Germany.

Results of the new method, which has also been successfully applied to the design of smaller stellarators and tokamaks, suggest how reshaping the plasma in a fusion device could produce much better confinement. Equivalently, improved plasma shaping could produce comparable confinement with reduced magnetic field strength or reduced facility size, with corresponding reductions in the cost of construction and operation.

The simulations further suggest that a troublesome characteristic called “stiffness” could occur in reactor-sized stellarators. Stiffness, the tendency for heat to rapidly escape as the plasma temperature gradient rises above a threshold, has been observed in tokamaks but less so in stellarators. The possibility that stiffness might be present in reactor-sized stellarators, wrote the authors, could stimulate efforts “toward further optimizing stellarator magnetic fields for reduced turbulence.”

PPPL, on Princeton University’s Forrestal Campus in Plainsboro, New Jersey, is devoted to creating new knowledge about the physics of plasmas — ultra-hot, charged gases — and to developing practical solutions for the creation of fusion energy. Fusion takes place when light atomic nuclei merge and release a burst of energy. This contrasts with the fission reactions in today’s nuclear power plants, which operate by splitting atoms apart.

Results of PPPL research have ranged from a portable nuclear materials detector for anti-terrorist use to universally employed computer codes for analyzing and predicting the outcome of fusion experiments. The laboratory is managed by the University for the U.S. Department of Energy’s Office of Science, which is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.

Mazhar Ali, a fifth-year graduate student in the laboratory of Robert Cava, the Russell Wellman Moore Professor of Chemistry at Princeton University, has spent his academic career discovering new superconductors, materials coveted for their ability to let electrons flow without resistance. While testing his latest candidate, the semimetal tungsten ditelluride (WTe2), he noticed a peculiar result.

Ali applied a magnetic field to a sample of WTe2, one way to kill superconductivity if present, and saw that its resistance doubled. Intrigued, Ali worked with Jun Xiong, a student in the laboratory of Nai Phuan Ong, the Eugene Higgins Professor of Physics at Princeton, to re-measure the material’s magnetoresistance, which is the change in resistance as a material is exposed to stronger magnetic fields.

“He noticed the magnetoresistance kept going up and up and up; that never happens,” said Cava. The researchers then exposed WTe2 to a 60-tesla magnetic field, close to the strongest magnetic field mankind can create, and observed a magnetoresistance of 13 million percent. The magnetoresistance grew without any sign of leveling off, making WTe2 the only known material without a saturation point. The results were published online on September 14 in the journal Nature.

Crystal structure of WTe2 (Source: Nature)

Electronic information storage depends on the use of magnetic fields to switch between distinct resistivity values that correspond to either a one or a zero. The larger the magnetoresistance, the smaller the magnetic field needed to change from one state to another, Ali said. Today’s devices use layered materials with so-called “giant magnetoresistance,” with changes in resistance of 20,000 to 30,000 percent when a magnetic field is applied. “Colossal magnetoresistance” is close to 100,000 percent, so for a magnetoresistance percentage in the millions, the researchers hoped to coin a new term.
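For scale, magnetoresistance is conventionally quoted as the percentage change in resistance relative to the zero-field value. The snippet below uses hypothetical resistance values purely for illustration:

```python
def magnetoresistance_percent(r_field: float, r_zero: float) -> float:
    """Percentage change in resistance relative to the zero-field value."""
    return (r_field - r_zero) / r_zero * 100.0

# A magnetoresistance of 13 million percent means the in-field resistance
# is roughly 130,001 times the zero-field resistance (hypothetical units):
print(magnetoresistance_percent(130_001.0, 1.0))  # 13000000.0
```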

Their original choice was “ludicrous” magnetoresistance, which was inspired by “ludicrous speed,” the fictional form of fast-travel used in the comedy “Spaceballs.” They even included an acknowledgement to director Mel Brooks. After other lab members vetoed “ludicrous,” the researchers considered “titanic” before Nature editors ultimately steered them towards the term “large magnetoresistance.”

Terminology aside, the fact remained that the magnetoresistance values were extraordinarily high, a phenomenon that might be understood through the structure of WTe2. To look at the structure with an electron microscope, the research team turned to Jing Tao, a researcher at Brookhaven National Laboratory.

“Jing is a great microscopist. They have unique capabilities at Brookhaven,” Cava said. “One is that they can measure diffraction at 10 Kelvin (-441 °F). Not too many people on Earth can do that, but Jing can.”

Electron microscopy experiments revealed the presence of tungsten dimers, paired tungsten atoms, arranged in chains responsible for the key distortion from the classic octahedral structure type. The research team proposed that WTe2 owes its lack of saturation to the nearly perfect balance of electrons and electron holes, which are empty docks for traveling electrons. Because of its structure, WTe2 only exhibits magnetoresistance when the magnetic field is applied in a certain direction. This could be very useful in scanners, where multiple WTe2 devices could be used to detect the position of magnetic fields, Ali said.
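Why perfect electron-hole balance prevents saturation can be seen from the textbook semiclassical two-band model; the following is that standard result, not a formula taken from the Nature paper. For electron and hole densities $n_e$, $n_h$ with mobilities $\mu_e$, $\mu_h$, the magnetoresistance in a field $B$ is

$$\frac{\Delta\rho}{\rho_0} = \frac{n_e n_h \mu_e \mu_h (\mu_e + \mu_h)^2 B^2}{(n_e \mu_e + n_h \mu_h)^2 + (n_e - n_h)^2 \mu_e^2 \mu_h^2 B^2}.$$

If $n_e \neq n_h$, the $B^2$ term in the denominator eventually dominates and the magnetoresistance saturates at high field; in the perfectly compensated case $n_e = n_h$, that term vanishes and $\Delta\rho/\rho_0 = \mu_e \mu_h B^2$ keeps growing quadratically without limit.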

“Aside from making devices from WTe2, the question to ask yourself as a scientist is: How can it be perfectly balanced? Is there something more profound?” Cava said.

This research was supported by the Army Research Office, grants W911NF-12-1-0461 and W911NF-11-1-0379, and the NSF MRSEC Program Grant DMR-0819860. This work was supported by the US Department of Energy’s Basic Energy Sciences (DOE BES) project “Science at 100 Tesla.” The electron microscopy study at Brookhaven National Laboratory was supported by the DOE BES, by the Materials Sciences and Engineering Division under contract DE-AC02-98CH10886, and through the use of the Center for Functional Nanomaterials.

A new study examines the question of aggressive versus moderate drug treatment on the emergence of drug-resistant pathogens. Shown is a strain of bacteria known as methicillin-resistant Staphylococcus aureus (MRSA). Photo by James Gathany

By Catherine Zandonella, Office of the Dean for Research

In response to the rise of drug-resistant pathogens, doctors are routinely cautioned against overprescribing antimicrobials. But when a patient has a confirmed bacterial infection, the advice is to treat aggressively to quash the infection before the bacteria can develop resistance.

A new study questions the accepted wisdom that aggressive treatment with high drug dosages and long durations is always the best way to stem the emergence and spread of resistant pathogens. The review of nearly 70 studies of antimicrobial resistance, which was authored by researchers at Princeton and other leading institutions and published last week in the journal Proceedings of the Royal Society B, reveals the lack of evidence behind the practice of aggressive treatment in many cases.

“We found that while there are many studies that test for resistance emergence between different drug regimes, surprisingly few have looked at the topic of how varying drug dosage might affect the emergence and spread of resistance,” said Ruthie Birger, a Princeton graduate student who works with C. Jessica Metcalf, an assistant professor of ecology and evolutionary biology and public affairs at Princeton’s Woodrow Wilson School, and Bryan Grenfell, the Kathryn Briger and Sarah Fenton Professor of Ecology and Evolutionary Biology and Public Affairs in Princeton’s Woodrow Wilson School. Birger, Metcalf and Grenfell coauthored the paper with colleagues from 16 universities. “We are a long way from having the evidence for the best treatment decisions with respect to resistance for a range of diseases,” Birger said.

Microbes such as bacteria and parasites can evade today’s powerful drugs by undergoing genetic mutations that enable them to avoid being killed by the drug. For example, bacteria can develop enzymes that degrade certain antibiotics. The logic behind aggressive treatment goes something like this: kill off as many microbes as you can so that few will be around to evolve into resistant forms.

But some scientists have observed a different outcome in mice infected with both an already-resistant strain of malaria and a non-resistant strain. The high-dose drug treatment killed off the non-resistant malarial parasites, leaving the resistant strains to multiply and make the mice even sicker.

The idea that aggressive treatment may backfire against malarial parasites led the authors of the current study to comb the scientific literature to examine whether the same may be true for other types of microbes such as bacteria. The few studies that they found — mostly in laboratory cell cultures rather than animal models or patients — suggest that the picture is complicated, and depends on whether the resistance is new or existing, how many mutations are necessary for the pathogen to become resistant, and how long the drugs have been in use. “It’s remarkable how little we know about this topic,” said Metcalf. “The malaria study conducted by Silvie Huijben and colleagues at Pennsylvania State University is an inspiring step towards developing an evidence base for these important issues.”

In the current analysis, the study authors found that drug resistance is governed by two factors: the abundance of the pathogen and the strength of the selection pressure that drives the pathogen to evolve. Aggressive treatment deals with the first factor by killing off as much pathogen as possible. Moderate treatment may, for some pathogens, reduce the ability of the resistant pathogen to thrive (for example, by maintaining the competitive advantage of a co-infecting drug-sensitive strain) while still reducing total pathogen levels enough that the patient can recover.
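This competition logic can be made concrete with a toy within-host model in the spirit of the mouse malaria experiments described above. The sketch below is purely illustrative, with arbitrary parameters and a hypothetical dosing scale; it is not a model from the reviewed studies.

```python
# Toy model of "competitive release": a drug-sensitive strain S and an
# already-resistant strain R compete for a shared resource ceiling K.
# The drug adds a per-capita death rate to S only. All parameters are
# arbitrary illustrative values, not taken from the reviewed studies.

def treat(dose, days=14.0, dt=0.001):
    r_s, r_r = 2.0, 1.2   # per-day growth rates; resistance carries a cost
    K = 1e8               # shared carrying capacity
    S, R = 9e7, 1e2       # established infection; resistant strain rare
    for _ in range(int(days / dt)):
        crowding = 1.0 - (S + R) / K
        S += dt * S * (r_s * crowding - dose)  # drug kills only S
        R += dt * R * r_r * crowding           # R grows when resources free up
    return S, R

for label, dose in (("moderate", 1.0), ("aggressive", 4.0)):
    S, R = treat(dose)
    print(f"{label:>10} dose: sensitive ~{S:.2g}, resistant ~{R:.2g}")
```

In this toy run, the aggressive course wipes out the sensitive strain but hands its resources to the resistant one, which blooms toward the carrying capacity; the moderate course leaves more sensitive pathogen behind (for the immune system, not modeled here, to clear) while keeping the resistant strain orders of magnitude rarer.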

Finding the ideal dose and duration of treatment, one that cures the patient without aiding the spread of resistance, will likely have to be worked out on a disease-by-disease basis, the authors found.

One possibility is that moderate treatment might be best used against already-resistant microbes to prevent their spread. Moderate treatment may also be best for drugs that have been on the market for several years with plenty of time for resistant strains to develop.

Aggressive treatment might be best for pathogens that develop resistance slowly, over the course of several mutations. High doses early in the process could be effective at heading off the development of resistance.

The work emerged from two workshops held at Princeton University and funded by the RAPIDD program of the Science and Technology Directorate, Department of Homeland Security, and the Fogarty International Center, National Institutes of Health (contract HSHQDC-12-C-00058).

Orthographic projections of a cluster cut from the benzene crystal along the two directions (Image courtesy of Science/AAAS)

Two years after its release, the HIV-1 drug Ritonavir was pulled from the market. Scientists discovered that the drug had crystallized into a slightly different form—called a polymorph—that was less soluble and made it ineffective as a treatment.

The pattern that the atoms of a solid material adopt, called its crystal structure, can have a huge impact on the material’s properties. Being able to accurately predict the most stable crystal structure for a material has been a longstanding challenge for scientists.

“The holy grail of this particular problem is to say, I’ve written down this chemical formula for a material, and then just from the formula be able to predict its structure—a goal since the dawn of chemistry,” said Garnet K. L. Chan, the A. Barton Hepburn Professor of Theoretical Chemistry at Princeton University. One major bottleneck towards achieving this goal has been to compute the lattice energy—the energy associated with a structure—to sufficient accuracy to distinguish between several competing polymorphs.
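Concretely, the lattice energy is defined (in the standard convention; the notation here is not taken from the paper) as

$$E_{\text{latt}} = \frac{E_{\text{crystal}}}{Z} - E_{\text{molecule}},$$

the energy per molecule released on assembling the infinite crystal from isolated gas-phase molecules, where $E_{\text{crystal}}$ is the energy of a unit cell containing $Z$ molecules. Competing polymorphs of a typical organic molecule are often separated by only a few kilojoules per mole, which is why errors must be pushed below roughly one kilojoule per mole before a calculation can say which form is stable.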

Chan’s group has now accomplished this task, publishing their results in the journal Science on August 8. The research team demonstrated that new techniques could be used to calculate the lattice energy of benzene, a simple yet important molecule in pharmaceutical and energy research, to sub-kilojoule per mole accuracy—a level of certainty that allows polymorphism to be resolved.

Chan credited this success to the combined application of advances in the field of quantum mechanics over the last 15 years. “Some of these advances allow you to resolve the behavior of electrons more finely, do computations on more atoms more quickly, and allow you to consider more electrons at the same time,” Chan said. “It’s a triumph of the modern field of quantum chemistry that we can now determine the behavior of Nature to this level of precision.”

The group’s next goal is to shorten the time it takes to run the desired calculations. These initial calculations consumed several months of computer time, Chan said, but with some practical modifications, future predictions should take only a few hours.

Chan’s colleagues on the work included first author Jun Yang, an electronic structure theory specialist and lecturer in chemistry, and graduate student Weifeng Hu, both at Princeton University. Additional collaborators were Denis Usvyat and Martin Schütz of the University of Regensburg and Devin Matthews of the University of Texas at Austin.

The work was supported by the U.S. Department of Energy under grant no. DE-SC0008624, with secondary support from grant no. DE-SC0010530. Additional funding was received from the National Science Foundation under grant nos. OCI-1265278 and CHE-1265277. D.M. was supported by the U.S. Department of Energy through a Computational Science Graduate Fellowship, funded by grant no. DE-FG02-97ER25308.

“Antibiotic resistance is a problem of managing an open-access resource, such as fisheries or oil,” writes Ramanan Laxminarayan, a research scholar at Princeton University and the director of the Center for Disease Dynamics, Economics & Policy in Washington, D.C., in today’s issue of the journal Science. He goes on to say that individuals have little incentive to use antibiotics wisely, just as people have little incentive to conserve oil when it is plentiful.

As with many other natural resources, maintaining the effectiveness of antibiotics requires two approaches: conserving the existing resource and exploring new sources, Laxminarayan says. These two approaches are linked, however. “Just as incentives for finding new sources of oil reduce incentives to conserve oil,” Laxminarayan writes, “large public subsidies for new drug development discourage efforts to improve how existing antibiotics are used.” Yet new antibiotics tend to cost more than existing ones due to the expense of clinical trials and the fact that the easiest-to-find drugs may have already been discovered.

Laxminarayan’s analysis finds that the benefits of conserving existing drugs are significant. He argues that proposed increases in public subsidies for new antibiotics should be matched by greater spending on conserving antibiotic effectiveness through public education, research and surveillance.

Ramanan Laxminarayan is a research scholar at the Princeton Environmental Institute. His perspective, “Antibiotic effectiveness: Balancing conservation against innovation,” appeared in the September 12, 2014 issue of Science.