Friday, January 29, 2010

'Grid cells' that act like a spatial map in the brain have been identified for the first time in humans, according to new research by UCL scientists that may help explain how we create internal maps of new environments.

The study is by a team from the UCL Institute of Cognitive Neuroscience and was funded by the Medical Research Council and the European Union. Published today in Nature, it uses brain imaging and virtual reality techniques to try to identify grid cells in the human brain. These specialised neurons are thought to be involved in spatial memory and have previously been identified in rodent brains, but evidence of them in humans has not been documented until now.

Grid cells represent where an animal is located within its environment, which the researchers liken to having a satnav in the brain. They fire in patterns that show up as geometrically regular, triangular grids when plotted on a map of a navigated surface. They were discovered by a Norwegian lab in 2005 whose research suggested that rats create virtual grids to help them orient themselves in their surroundings, and remember new locations in unfamiliar territory.

Study co-author Dr Caswell Barry said: "It is as if grid cells provide a cognitive map of space. In fact, these cells are very much like the longitude and latitude lines we're all familiar with on normal maps, but instead of using square grid lines it seems the brain uses triangles."

Lead author Dr Christian Doeller added: "Although we can't see the grid cells directly in the brain scanner, we can pick up the regular six-fold symmetry that is a signature of this type of firing pattern. Interestingly, the study participants with the clearest signs of grid cells were those who performed best in the virtual reality spatial memory task, suggesting that the grid cells help us to remember the locations of objects."

Professor Neil Burgess, who leads the team, commented: "The parts of the brain which show signs of grid cells - the hippocampal formation and associated brain areas - are already known to help us navigate our environment and are also critical for autobiographical memory. This means that grid cells may help us to find our way to the right memory as well as finding our way through our environment. These brain areas are also amongst the first to be affected by Alzheimer's disease which may explain why getting lost is one of the most common early symptoms of this disease."

How did the lemurs, flying foxes and narrow-striped mongooses get to the large, isolated island of Madagascar sometime after 65 million years ago?

A pair of scientists say their research confirms the longstanding idea that the animals hitched rides on natural rafts blown out to sea.

Professors Matthew Huber of Purdue and Jason Ali of the University of Hong Kong say that the prevailing flow of ocean currents between Africa and Madagascar millions of years ago would have made such a trip not only possible, but fast, too. The findings, based on a three-year computer simulation of ancient ocean currents, will be published in the journal Nature on Feb. 4 and were posted on Nature’s Web site Jan. 20.

The idea that animals rafted to the island is not new. Since at least 1915, scientists have used it as an alternative theory to the notion that the animals arrived on Madagascar via a land bridge that was later obliterated by shifting continents. Rafting would have involved animals being washed out to sea during storms, either on trees or large vegetation mats, and floating to the mini-continent, perhaps while in a state of seasonal torpor or hibernation.

Huber and Ali's work supports a 1940 paper by George Gaylord Simpson, one of the most influential paleontologists and evolution theorists of the 20th century. Simpson introduced the concept of a "sweepstakes" process to explain the chance of raft colonization events taking place through vast stretches of geological time. Once the migrants arrived on the world’s fourth largest island, their descendants evolved into the distinctive, and sometimes bizarre forms seen today.

Anthropologists and paleontologists have good reason to be interested in Madagascar's animals. The island is located in the Indian Ocean roughly 300 miles east of Africa across the Mozambique Channel and is otherwise isolated from significant land masses. Its isolation and varied terrain make it a living laboratory for scientists studying evolution and the impact of geography on the evolutionary process.

Madagascar has more unique species of animals than any location except Australia, which is 13 times larger. The island's population includes 70 kinds of lemurs found nowhere else, and about 90 percent of its other mammals, amphibians and reptiles are unique to its 226,656 square miles.

The question has always been how the animals arrived there in the first place. Madagascar appears to have been an island for at least 120 million years, and its animal population began arriving much later, sometime after 65 million years ago.

The raft hypothesis, which scientists refer to as “dispersal,” has always presented one big problem, however. Currents and prevailing winds between Madagascar and Africa flow south and southwest, away from, not toward, the island.

Yet, the land bridge hypothesis also is problematic in that there is no geologic evidence that such a bridge existed during the time in question. Also, there are no large mammals, such as apes, giraffes, lions or elephants, indigenous to Madagascar. Only small species such as lemurs, the island's signature species; hedgehog-like tenrecs; rodents; mongoose-like carnivores; and similar animals populate the island.

The animals of Madagascar also appear to have arrived in occasional bursts of immigration by species rather than in a continuous, mixed migration. They likewise appear to have evolved from single ancestors, and their closest relatives are in Africa, scientists say. All of which suggests Simpson's theory was correct.

Ali, who has a research focus in plate tectonics -- the large-scale motions of the Earth's outer shell -- kept running across the land bridge hypothesis in the course of his work. The question intrigued him because the notion of a bridge between Madagascar and Africa appeared to break rules of plate tectonic theory. A background in oceanography also made him think ocean currents between Africa and Madagascar might have changed over time.

"Critically, Madagascar and Africa have together drifted more than 1,600 kilometers northwards and could thus have disrupted a major surface water current running across the tropical Indian Ocean, and hence modified flow around eastern Africa and Madagascar," says Ali, an earth sciences professor.

That led Ali to contact Huber, a paleoclimatologist who reconstructs and models the climate millions of years in the past. Huber, a Purdue earth and atmospheric sciences professor, has a particular interest and expertise in ocean currents, which have a significant impact on climate.

Huber models ancient conditions at a time when the planet was much warmer than it is today, and he specializes in lengthy, highly detailed simulations. He uses the modeling of a warmer Earth in the past -- warm enough for crocodiles to live in an ice-free Arctic -- to help understand conditions generated by today's global warming and to project what the warming trend may hold for the future.

When Ali contacted him about the Madagascar question, Huber had just finished running a three-year simulation on a supercomputer operated by Information Technology at Purdue (ITaP), Purdue's central information technology organization. The modeling produced 100 terabytes of output -- data useful for a variety of purposes, including a study of ancient ocean currents around Madagascar.

The Purdue professor was able to show that 20 million to 60 million years ago, when scientists have determined ancestors of present-day animals likely arrived on Madagascar, currents flowed east, toward the island. Climate modeling showed that currents were strong enough -- like a liquid jet stream in peak periods -- to get the animals to the island without dying of thirst. The trip appears to have been well within the realm of possibility for small animals whose naturally low metabolic rates may have been even lower if they were in torpor or hibernating.

Huber's computer modeling also indicates that the area was a hotspot at the time, just as it is today, for powerful tropical cyclones capable of regularly washing trees and tree islands into the ocean.

"It seems likely that rafting was a distinct possibility," the study concludes. "All signs point to the Simpson sweepstakes model as being correct: Ocean currents could have transported rafts of animals to Madagascar from Africa during the Eocene."

The raft hypothesis has always been the most plausible, says Anne Yoder, director of the Duke University Lemur Center. She specializes in using molecular biogenetic techniques and geospatial analysis to examine the evolutionary history of Madagascar. But Ali and Huber's study now puts hard data behind it, says the Duke professor of biology, biological anthropology and anatomy.

"I was very excited to see this paper," says Yoder, whom Nature asked to review the study prior to publication. "Dispersal has been a hypothesis about a mechanism without any actual data. This takes it out of the realm of storytelling and makes it science."

Ali says the study also is relevant to the movement of animal species elsewhere on the planet, lending support for dispersal over a competing idea that animals arrived at their positions on the drifting land masses of the continents as the Earth took its current form.

Moreover, the Madagascar study provided a test case confirming scientists' ability to model ocean and atmosphere interactions in a past greenhouse climate, Huber said. The National Science Foundation recently funded Huber to further simulate ocean currents in the Eocene epoch, roughly 39 million to 56 million years ago, using the methodology he applied to Madagascar.

For decades, astronomers have analyzed the impact that asteroids could have on Earth. New research by MIT Professor of Planetary Science Richard Binzel examines the opposite scenario: that Earth has considerable influence on asteroids — and from a distance much larger than previously thought. The finding helps answer an elusive, decades-long question about where most meteorites come from before they fall to Earth and also opens the door to a new field study of asteroid seismology.

By analyzing telescopic measurements of near-Earth asteroids (NEAs), or asteroids that come within 30 million miles of Earth, Binzel has determined that if an NEA travels within a certain range of Earth, roughly one-quarter of the distance between Earth and the moon, it can experience a "seismic shake" strong enough to bring fresh material called "regolith" to its surface. These rarely seen "fresh asteroids" have long interested astronomers because their spectral fingerprints, or how they reflect different wavelengths of light, match 80 percent of all meteorites that fall to Earth, according to a paper by Binzel appearing in the Jan. 21 issue of Nature. The paper suggests that Earth's gravitational pull and tidal forces create these seismic tremors.

By hypothesizing about the cause of the fresh surfaces of some NEAs, Binzel and his colleagues have tried to solve a decades-long conundrum about why these fresh asteroids are not seen in the main asteroid belt, which is between Mars and Jupiter. They believe this is because the fresh surfaces are the result of a close encounter with Earth, which obviously wouldn't be the case with an object in the main asteroid belt. Only those few objects that have ventured recently inside the moon's orbital distance and have experienced a "fresh shake" match freshly fallen meteorites measured in the laboratory, Binzel said.

Clark Chapman, a planetary scientist at the Southwest Research Institute in Colorado, believes Binzel's work is part of a "revolution in asteroid science" over the past five years that considers the possibility that something other than collisions can affect asteroid surfaces.

How they did it: Binzel's team used a large NASA telescope in Hawaii to collect information on NEAs, including a huge amount of spectral fingerprint data. Analyzing this data, the group examined where a sample of 95 NEAs had been during the past 500,000 years, tracing their orbits to see how close they'd come to Earth. They discovered that 75 NEAs in the sample had passed well inside the moon's distance within the past 500,000 years, including all 20 fresh asteroids in the sample.

Binzel next determined that an asteroid traveling within a distance equal to 16 times the Earth's radius (about one-quarter of the distance to the moon) appears to experience vibrations strong enough to create fresh surface material. He reached that figure based on his finding that about one-quarter of NEAs are fresh, as well as two known facts — that the space weathering process that ages regolith can happen in less than one million years, and that about one-quarter of NEAs come within 16 Earth radii in one million years.
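Binzel's threshold is essentially a fraction-matching argument: find the close-approach distance whose per-million-year pass rate equals the observed fraction of fresh asteroids. A minimal sketch of that logic, using made-up pass rates rather than the study's actual orbital data:

```python
# Sketch of the fraction-matching argument behind the 16-Earth-radii figure.
# All numbers here are illustrative, not Binzel's dataset.

def matching_threshold(fresh_fraction, close_pass_fractions):
    """Return the smallest distance (in Earth radii) at which the fraction
    of NEAs passing within that distance per million years first reaches
    the observed fraction of fresh-surfaced NEAs."""
    # close_pass_fractions maps distance (Earth radii) -> cumulative
    # fraction of NEAs passing within that distance per million years.
    for distance in sorted(close_pass_fractions):
        if close_pass_fractions[distance] >= fresh_fraction:
            return distance
    return None

# Hypothetical cumulative pass fractions per million years.
passes = {2: 0.02, 8: 0.10, 16: 0.25, 32: 0.55}

# e.g. 20 fresh asteroids out of 80 with well-determined histories.
fresh = 20 / 80

print(matching_threshold(fresh, passes))  # 16 under these assumed numbers
```

With these assumed inputs, the one-quarter fresh fraction lines up with the one-quarter pass rate at 16 Earth radii, mirroring the reasoning described above.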

Before now, people thought an asteroid had to come within one to two Earth radii to undergo significant physical change.

Next steps: Many details about the shaking process remain unknown, including what exactly it is about Earth that shakes the asteroids, and why this happens from a distance as far away as 16 Earth radii. What is certain is that the conditions depend on complex factors such as the velocity and duration of the encounter, the asteroid's shape and the nature of the preexisting regolith. "The exact trigger distance depends on all those seismology factors that are the totally new and interesting area for cutting edge research," Binzel said.

Further research might include computer simulations, ground observations and sending probes to look at the surfaces of asteroids. Binzel's next steps will be to try to discover counterexamples to his findings or additional examples to support it. He may also investigate whether other planets like Venus or Mars affect asteroids that venture close to them.

We have more in common with Dead Sea-dwelling microbes than previously thought. University of Florida researchers have found that one of the most common proteins in complex life forms may have evolved from proteins found in microbes that live in deadly salty environments.

The protein ubiquitin is so-called because it is ubiquitously active in all higher life forms on Earth. The protein is essential to the life cycle of nearly all eukaryotic cells — those that are complex enough to have a nucleus and other membrane-bound structures.

Haloferax volcanii microbes, on the other hand, are unique creatures. One of the most ancient species on the planet, they long ago adapted to conditions far too salty for other organisms — even surviving for thousands of years in dried-out salt lakes.

As they report in the Jan. 7 issue of the journal Nature, researchers for UF’s Institute of Food and Agricultural Sciences have found that two proteins in Haloferax are likely the simple evolutionary precursors of ubiquitin.

These two proteins, dubbed SAMP1 and SAMP2, seem to perform similar functions to ubiquitin without some of the enzymes that are needed for ubiquitin to function in eukaryotes, said Julie Maupin-Furlow, the study’s lead researcher and professor in UF’s department of microbiology and cell science.

The finding not only lends insight into how ubiquitin evolved, but it also reveals that this seemingly complex protein network may have some simple mechanisms that can be examined for use as potential medical treatments, Maupin-Furlow said.

Researchers are currently investigating ubiquitin’s role in a broad range of diseases such as cancer, viral infections, neurodegenerative disorders, muscle wasting, diabetes and various inflammatory conditions.

“This opens the door to a new avenue of study for this very important protein,” Maupin-Furlow said. “And it gives us a broader picture of some of the common aspects of life on Earth.”

The surprising increase in atmospheric methane concentrations millennia ago, identified in studies of continental glaciers, has long puzzled researchers. One prominent theory holds that it resulted from the commencement of rice cultivation in East Asia. However, a study conducted at the University of Helsinki's Department of Environmental Sciences and the Department of Geosciences and Geography shows that the massive expansion of the northern peatlands occurred around 5,000 years ago, coincident with rising atmospheric methane levels.

After water vapour and carbon dioxide, methane is the most significant greenhouse gas, accounting for about one fifth of the atmospheric warming caused by humans. Methane emissions come mainly from peatlands, animal husbandry, rice cultivation, landfill sites, fossil fuel production and biomass combustion.

Northern peatlands are immense sources of methane, but previous studies argued that they were established almost immediately after the Ice Age ended. Consequently, they could not explain the increase in methane, dated to have begun thousands of years later, since the methane emissions of peatlands decrease as they age.

William Ruddiman, Professor Emeritus in environmental sciences at the University of Virginia, has presented a widely publicized theory according to which humanity started to affect the climate thousands of years ago, not just since the start of the industrial revolution. According to the theory, rice cultivation, which began in East Asia more than 5,000 years ago, caused declining methane levels to rise again, which contributed to preventing the next ice age.

The new study, conducted under the supervision of Professor Atte Korhola, explains the emergence of peatlands in the northern hemisphere, and their development history, in a new way. The researchers compiled an extensive database of radiocarbon dates for the bottom peat in peatlands. A statistical and geospatial analysis of more than 3,000 dates showed that the expansion of northern peatlands accelerated significantly about 5,000 years ago. At the same time, the methane content of the atmosphere started to increase.

Peatland expansion resulted in the emergence of millions of square kilometres of young peatlands of the minerotrophic fen type, which released large amounts of methane into the air as their organic matter decomposed. According to the study, the early increase in methane levels was mainly caused by natural processes, and human activity is not necessarily required to explain it.

The expansion of peatlands was triggered by the climate turning moister and cooler, which caused groundwater levels to rise while accelerating peat build-up and growth. A similar methane peak may emerge in the future if precipitation in arctic areas increases as forecast.

New research shows that migraine and depression may share a strong genetic component. The research is published in the January 13, 2010, online issue of Neurology®, the medical journal of the American Academy of Neurology.

"Understanding the genetic factors that contribute to these disabling disorders could one day lead to better strategies to manage the course of these diseases when they occur together," said Andrew Ahn, MD, PhD, of the University of Florida in Gainesville, who wrote an editorial accompanying the study and is a member of the American Academy of Neurology. "In the meantime, people with migraine or depression should tell their doctors about any family history of either disease to help us better understand the link between the two."

The study involved 2,652 people who took part in the larger Erasmus Rucphen Family study. All of the participants are descendants of 22 couples who lived in Rucphen in the 1850s to 1900s.

"Genealogical information has shown them all to be part of a large extended family, which makes this type of genetic study possible," said study author Gisela M. Terwindt, MD, PhD, of Leiden University Medical Center in the Netherlands.

Of the participants, 360 had migraine. Of those, 151 had migraine with aura, which is when headaches are preceded by sensations that affect vision, such as seeing flashing lights, and 209 had migraine with no aura. A total of 977 people had depression, with 25 percent of those with migraine also having depression, compared to 13 percent of those without migraine.

The researchers then estimated the relative contribution of genetic factors for both of the disorders. They found that for both types of migraine, the heritability was estimated at 56 percent, i.e., 56 percent of the trait is explained by genetic effects. For migraine with aura, the estimate was 96 percent. "This finding shows that migraine with aura may be a promising avenue to search for migraine genes," Terwindt said.
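Heritability here means the proportion of trait variance attributable to genetic effects. As a rough illustration of the quantity being reported (the study itself used pedigree-based variance-component modelling, and the variance numbers below are invented for illustration):

```python
# Heritability as the share of trait variance explained by genetics:
#     h^2 = V_genetic / (V_genetic + V_environmental)
# The variance components here are assumed, purely to illustrate the
# definition behind the 56% and 96% estimates quoted in the text.

def heritability(v_genetic, v_environmental):
    """Narrow-sense heritability as a fraction of total phenotypic variance."""
    return v_genetic / (v_genetic + v_environmental)

# A trait where genes account for 56% of the variance, as reported for migraine:
print(round(heritability(0.56, 0.44), 2))  # 0.56
```

So "heritability of 56 percent" says nothing about any one individual; it describes how much of the variation across the population is genetic.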

Comparing the heritability scores for depression between those with migraine and those without showed a shared genetic component in the two disorders, particularly with migraine with aura. "This suggests that common genetic pathways may, at least partly, underlie both of these disorders, rather than that one is the consequence of the other," Terwindt said.

Deep underground aquifers in the American Southwest contain gases that tell of the region's ancient climate, and support a growing consensus that the jet stream over North America was once split in two.

The discoveries were made with a new paleohydrogeology tool, developed by Indiana University Bloomington geologist Chen Zhu and Swiss Federal Institute of Technology geologist Rolf Kipfer, that depends on the curious properties of noble gases as they seep through natural underground aquifers. Noble gases (neon and helium, for example) are elements that resist chemical reactions, and therefore have the potential to record information from Earth's past. In the January issue of Geology, the scientists report the results of their tool's first serious test amid the Navajo sandstone aquifers of northeast Arizona.

"Getting to the point where we understand the interaction between these aquifers and the atmosphere above is going to open up many new ways to ask questions about the relationship between climate changes and water resources," said Zhu, the report's lead author. "We have shown our approach can work extremely well. It confirms that the aquifer was recharged mostly during the Earth's most recent ice age, and particularly related to the fact that the jet stream was actually two jet streams many thousands of years ago, with the lower descending far to the south of North America."

An exhaustive and methodical sampling of ground water from the Arizona sandstone aquifers shows significant changes in noble gas infusion rates and concentration (particularly neon) at key times in Earth's Quaternary period (as far back as 40,000 years ago). The scientists saw both an excess of neon about 25,000 to 40,000 years ago, coinciding with the last Pleistocene ice age, and an extraordinary peak of neon associated with a flood of groundwater 14,000 to 17,000 years ago, at which time the southern jet stream hovered above northern Arizona.

"We admit, the conditions in Northeast Arizona were ideal," Zhu said. "Navajo sandstone is a rock formed from ancient sand dunes. The system there is free of other complications, and that is not an accident. We chose it for that reason. It is possible results won't be so clear if we employ our method in other, similar places around the world. But we are ready to try." Zhu has conducted substantial studies of the Navajo sandstone in Arizona in the past 15 years.

When water from rains and streams first enters the ground and seeps through sedimentary rock, it carries some gases with it. The water moves slowly. In sideways-oriented aquifers, water at points more distant from where it entered was thought to carry the indelible stamp of ancient atmospheric and hydrological conditions. Groundwater closer to the point of entry, by contrast, would be recognizably newer.

The amount of noble gases carried into the aquifers will depend on a variety of environmental factors, Zhu said. Lower temperatures, lower salinity, and higher pressure can all lead to an excess of noble gases in groundwater.

Much like the analyses of ancient air pockets deep inside glaciers and polar ice, the new tool Zhu and Kipfer have developed can't be used just anywhere. To make sense of the noble gas seepage, the aquifers must be protected by an impermeable layer of rock that stops newer water above from mixing with the older water below. As long as noble gases in the groundwater do not mix with large quantities of noble gases from other sources, the noble gases' initial mix will be left alone to change over time.

Despite restrictions on where Zhu and Kipfer's tool can be used, there are rock formations all over the world that are suitable for its deployment. Best candidates are the more arid regions of the northern hemisphere, such as the southwestern U.S., the northern Sahara, the Middle East, and parts of western China.

The availability of noble gases such as neon, xenon, and helium is believed to have remained relatively unchanged over the last 100,000 years. Zhu said their concentrations in groundwater would be controlled primarily by temperature at a given location. Zhu and Kipfer's noble gas data show that soil temperatures were 5 to 6 degrees Celsius cooler (9 to 11 degrees Fahrenheit) during the peak of Earth's most recent glacial activity, about 25,000 years ago.
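The Fahrenheit figures quoted for the cooling follow from the rule for converting temperature differences, which uses only the 9/5 scale factor and not the +32 offset (the offset applies to absolute temperatures, not to differences). A quick check:

```python
# Converting a temperature *difference* from Celsius to Fahrenheit:
# only the 9/5 scale factor applies; the +32 offset is for absolute
# temperatures, not differences.

def delta_c_to_f(delta_c):
    """Convert a temperature difference in Celsius to Fahrenheit."""
    return delta_c * 9 / 5

print(delta_c_to_f(5))  # 9.0
print(delta_c_to_f(6))  # 10.8 -- roughly 11, as quoted in the text
```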

Kipfer was the first to suggest neon might be employed as an indicator of ancient hydraulic effects. The present Geology paper with Zhu represents the approach's first empirical test.

Zhu says he hopes his work with Kipfer will inform policymakers as they grapple with climate change in areas of the world where populations depend on underground aquifers for their water supply. By understanding how these aquifers have waxed and waned in response to changing temperatures, Zhu says experts will get a better idea of how a warmer Earth may influence the availability of water to human populations in the future.

"The more we understand about the interaction between the atmosphere and the water table in the recent geological past, the easier it will be to develop policies that help us adapt to a global warming environment," Zhu said. "We need geologic ground truth for policies based on model forecast."

While governments around the world continue to explore strategies for reducing greenhouse gas emissions, a new study suggests policymakers should focus on what needs to be achieved in the next 40 years in order to keep long-term options viable for avoiding dangerous levels of warming.

The study is the first of its kind to use a detailed energy system model to analyze the relationship between mid-century targets and the likelihood of achieving long-term outcomes.

"Setting mid-century targets can help preserve long-term policy options while managing the risks and costs that come with long-term goals," says co-lead author Brian O'Neill, a scientist at the National Center for Atmospheric Research (NCAR).

The study, conducted with co-authors at the International Institute for Applied Systems Analysis (IIASA) in Austria and the Energy Research Centre of the Netherlands, is being published today in the Proceedings of the National Academy of Sciences. It was funded by IIASA, a European Young Investigator Award to O'Neill, and the National Science Foundation, NCAR's sponsor.

The researchers used a computer simulation known as an integrated assessment model to represent interactions between the energy sector and the climate system. They began with "business as usual" scenarios, developed for the Intergovernmental Panel on Climate Change's 2000 report, that project future greenhouse gas emissions in the absence of climate policy. They then analyzed the implications of restricting emissions in 2050, using a range of levels.

The team focused on how emissions levels in 2050 would affect the feasibility of meeting end-of-century temperature targets of either 2 or 3 degrees Celsius (about 3.6 or 5.4 degrees Fahrenheit, respectively) above the pre-industrial average.

The study identifies critical mid-century thresholds that, if surpassed, would make particular long-term goals unachievable with current energy technologies.

For example, the scientists examined what would need to be done by 2050 in order to preserve the possibility of better-than-even odds of meeting the end-of-century temperature target of 2 degrees Celsius of warming advocated by many governments.

One "business as usual" scenario showed that global emissions would need to be reduced by about 20 percent below 2000 levels by mid-century to preserve the option of hitting the target. In a second case, in which demand for energy and land grows more rapidly, the reductions by 2050 would need to be much steeper: 50 percent. The researchers concluded that achieving such reductions is barely feasible with known energy sources.

"Our simulations show that in some cases, even if we do everything possible to reduce emissions between now and 2050, we'd only have even odds of hitting the 2 degree target-and then only if we also did everything possible over the second half of the century too," says co-author and IIASA scientist Keywan Riahi.

The research team made a number of assumptions about the energy sector, such as how quickly the world could switch to low- or zero-carbon sources to achieve emission targets. Only current technologies that have proven themselves at least in the demonstration stage, such as nuclear fission, biomass, wind power, and carbon capture and storage, were considered. Geoengineering, nuclear fusion, and other technologies that have not been demonstrated as viable ways to produce energy or reduce emissions were excluded from the study.

Research shows that average global temperatures have warmed by close to 1 degree C (almost 1.8 degrees F) since the pre-industrial era. Much of the warming is due to increased emissions of greenhouse gases, predominantly carbon dioxide, due to human activities. Many governments have advocated limiting global temperature to no more than 1 additional degree Celsius in order to avoid more serious effects of climate change.

During the recent international negotiations in Copenhagen, many nations recognized the case for limiting long-term warming to 2 degrees Celsius above pre-industrial levels, but they did not agree to a mid-century emissions target.

"Even if you agree on a long-term goal, without limiting emissions sufficiently over the next several decades, you may find you're unable to achieve it. There's a risk that potentially desirable options will no longer be technologically feasible, or will be prohibitively expensive to achieve," O'Neill says.

On the other hand, "Our research suggests that, provided we adopt an effective long-term strategy, our emissions can be higher in 2050 than some proposals have advocated while still holding to 2 degrees Celsius in the long run," he adds.

The researchers caution that this is just one study looking at the technological feasibility of mid- and end-of-century emissions targets. O'Neill says that more feasibility studies should be undertaken to start "bounding the problem" of emissions mitigation.

"We need to know whether our current and planned actions for the coming decades will produce long-term climate change we can live with," he says. "Mid-century targets are a good way to do that."

A cancer epidemic under way in southwest China may have been initiated by a string of Siberian volcanoes that spewed ash across the Earth 250 million years ago, according to a study published in the journal Environmental Science and Technology.

Nonsmoking women in Xuan Wei County, Yunnan Province, China, suffer from the world’s highest known rate of lung cancer. Geosciences Research Professor Robert Finkelman, one of the study’s co-authors, said researchers believe the answer lies in the coal that women in the county use for heating and cooking.

“Peak lung cancer mortality in women in one specific area of China, Xuan Wei, has been reported at 400 deaths per 100,000 people, which is nearly 20 times the mortality levels in the rest of China,” Finkelman said.
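As a rough sanity check on these figures (illustrative only, not part of the study), the "nearly 20 times" comparison implies a baseline lung cancer mortality elsewhere in China of about 20 deaths per 100,000 women:

```python
# Back-of-envelope check of the quoted mortality figures.
xuan_wei_rate = 400     # reported peak deaths per 100,000 in Xuan Wei
ratio = 20              # "nearly 20 times" the rest of China
implied_baseline = xuan_wei_rate / ratio
print(implied_baseline) # 20.0 deaths per 100,000
```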

The extraordinarily high rate of lung cancer and the constant use of coal by women for heating and cooking led geoscientists to study the native coal from area mines.

“We discovered that the regional coal that formed after the Permo-Triassic times, about 250 million years ago, was very high in silicon dioxide, which has been linked to cancer in recent studies,” Finkelman said.

The team concluded that volcanoes in Siberia erupted for 5 million years, blasting acidic gases and particulates into the atmosphere, where they combined into a toxic brew of acid rain. The acid rain decimated life on Earth and eroded area rocks, freeing up silica, which washed into surrounding peat bogs. Over millions of years, the Xuan Wei peat bogs converted into coal fields, becoming the source of the tainted coal.

“We think the cancer risk comes from burning the coal, not from harvesting it,” Finkelman said. “There is probably a linkage between the gases being mobilized by the burning coal and the very fine-grained silica particulates that are rafted up by these gases.”

A new model for primate origins is presented in Zoologica Scripta, published by the Norwegian Academy of Science and Letters and The Royal Swedish Academy of Sciences. The paper argues that the distributions of the major primate groups are correlated with Mesozoic tectonic features and that their respective ranges are congruent with each evolving locally from a widespread ancestor on the supercontinent of Pangea about 185 million years ago.

Michael Heads, a Research Associate of the Buffalo Museum of Science, arrived at these conclusions by incorporating, for the first time, spatial patterns of primate diversity and distribution as historical evidence for primate evolution. Models had previously been limited to interpretations of the fossil record and molecular clocks.

"According to prevailing theories, primates are supposed to have originated in a geographically small area (center of origin) from where they dispersed to other regions and continents," said Heads, who also noted that widespread misrepresentation of fossil and molecular clock estimates as maximum or actual dates of origin has led to a popular theory that primates somehow crossed the globe and even rafted across oceans to reach America and Madagascar.

In this new approach to molecular phylogenetics, vicariance, and plate tectonics, Heads shows that the distribution ranges of primates and their nearest relatives, the tree shrews and the flying lemurs, conform to a pattern that would be expected from their having evolved from a widespread ancestor. This ancestor could have evolved into the extinct Plesiadapiformes in North America and Eurasia, the primates in Central and South America, Africa, India and Southeast Asia, and the tree shrews and flying lemurs in Southeast Asia.

Divergence between strepsirrhines (lemurs and lorises) and haplorhines (tarsiers and anthropoids) is correlated with intense volcanic activity on the Lebombo Monocline in Africa about 180 million years ago. The lemurs of Madagascar diverged from their African relatives with the opening of the Mozambique Channel (160 million years ago), while New and Old World monkeys diverged with the opening of the Atlantic about 120 million years ago.

"This model avoids the confusion created by the center of origin theories and the assumption of a recent origin for major primate groups due to a misrepresentation of the fossil record and molecular clock divergence estimates," said Heads from his New Zealand office. "These models have resulted in all sorts of contradictory centers of origin and imaginary migrations for primates that are biogeographically unnecessary and incompatible with ecological evidence."

The tectonic model also addresses an otherwise insoluble problem with dispersal theories: primates are said to have crossed the Atlantic to America and the Mozambique Channel to Madagascar, yet they have never managed the 25 km crossing from Sulawesi to the Moluccan islands, from which they could have reached New Guinea and Australia.

Heads acknowledged that the phylogenetic relationships of some groups, such as the tarsiers, are controversial, but the various alternatives do not obscure the patterns of diversity and distribution identified in this study.

Biogeographic evidence for a Jurassic origin of primates, and a pre-Cretaceous origin of the major primate groups, places their divergence well before the earliest fossils. Heads notes, however, that fossils provide only minimum dates for the existence of particular groups, and there are many examples of a group's fossil record being extended by tens of millions of years through new discoveries.

The article notes that increasing numbers of primatologists and paleontologists recognize that the fossil record cannot be used to impose strict limits on primate origins, and that some molecular clock estimates also predict divergence dates pre-dating the earliest fossils. These considerations indicate that there is no necessary objection to the biogeographic evidence for divergence of primates beginning in the Jurassic with the origin of all major groups being correlated with plate tectonics.

When consumers talk to each other about products, they generally respond more favorably to abstract language than to concrete descriptions, according to a new study in the Journal of Consumer Research.

"In a series of experiments, we explored when and why consumers use abstract language in word-of-mouth messages, and how these differences in language use affect the receiver," write authors Gaby A. C. Schellekens, Peeter W. J. Verlegh, and Ale Smidts (Erasmus University, The Netherlands).

In the course of their studies, the authors found that consumers who described a positive experience with a product (like a smooth shave with a new razor) used more abstract language when they had a positive opinion about the brand before they tried the product. "When consumers were told that the product was a brand they did not like, they used more concrete language to describe a positive experience. Thus, consumers use different ways of describing the exact same experience, depending on whether they use a liked or disliked brand," the authors write.

For a disliked brand, favorable experiences are seen as exceptions, and concrete language helps consumers to frame the experience as a one-time event, the authors explain.

On the receiver end, the studies showed that consumers responded differently to abstract and concrete language. "In our study of receivers, we gave consumers a description of a positive product experience, and asked them to estimate the sender's opinion about the products," the authors write. "We found that perceived opinion of the sender was more positive when the description was cast in more abstract terms." For descriptions of negative experiences, the perceived opinion of the sender was more negative when the description used abstract language.

"Our finding that abstract messages have a stronger impact on buying intentions can be translated straightforwardly into the recommendation to use abstract language if you try to convince someone of the (positive or negative) consequences of buying a product, or of following your advice," the authors conclude.

Plasma jets capable of obliterating tooth decay-causing bacteria could be an effective and less painful alternative to the dentist's drill, according to a new study published in the February issue of the Journal of Medical Microbiology.

Firing low-temperature plasma beams at dentin, the fibrous tooth structure underneath the enamel coating, was found to reduce the amount of dental bacteria by up to 10,000-fold. The findings suggest plasma technology could be used to remove infected tissue in tooth cavities, a practice that conventionally involves drilling into the tooth.
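Microbiologists often express such kill rates as log reductions: a 10,000-fold drop in bacterial counts corresponds to a 4-log reduction. A minimal sketch of the conversion (the fold figure is from the study; the function name is ours):

```python
import math

def log_reduction(fold: float) -> float:
    """Convert an x-fold reduction in bacterial counts to a log10 reduction."""
    return math.log10(fold)

print(log_reduction(10_000))  # 4.0
```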

Scientists at the Leibniz-Institute of Surface Modifications, Leipzig, and dentists from Saarland University, Homburg, Germany, tested the effectiveness of plasma against common oral pathogens including Streptococcus mutans and Lactobacillus casei. These bacteria form films on the surface of teeth and are capable of eroding tooth enamel and the dentin below it to cause cavities. Left untreated, decay can lead to pain, tooth loss and sometimes severe gum infections. In this study, the researchers infected dentin from extracted human molars with four strains of bacteria and then exposed it to plasma jets for 6, 12 or 18 seconds. The longer the dentin was exposed to the plasma, the greater the number of bacteria eliminated.

Plasma is known as the fourth state of matter, after solids, liquids and gases, and has a growing number of technical and medical applications. Plasmas are common throughout the cosmos and are produced when high-energy processes strip atoms of one or more of their electrons. Hot plasmas generate reactive oxygen species that are capable of destroying microbes, and such plasmas are already used to disinfect surgical instruments.

Dr Stefan Rupf from Saarland University who led the research said that the recent development of cold plasmas that have temperatures of around 40 degrees Celsius showed great promise for use in dentistry. "The low temperature means they can kill the microbes while preserving the tooth. The dental pulp at the centre of the tooth, underneath the dentin, is linked to the blood supply and nerves and heat damage to it must be avoided at all costs."

Dr Rupf said using plasma technology to disinfect tooth cavities would be welcomed by patients as well as dentists. "Drilling is a very uncomfortable and sometimes painful experience. Cold plasma, in contrast, is a completely contact-free method that is highly effective. Presently, there is huge progress being made in the field of plasma medicine and a clinical treatment for dental cavities can be expected within 3 to 5 years."