Tuesday, March 30, 2010

An interdisciplinary team of physicists and geologists led by the University of Pennsylvania has made a major step toward predicting where and how large floods occur on river deltas and alluvial fans.

In a laboratory, researchers created a miniature river delta that replicates flooding patterns seen in natural rivers, resulting in a mathematical model capable of aiding in the prediction of the next catastrophic flood.

The results appear in the current issue of Geophysical Research Letters.

Slow deposition of sediment within rivers eventually fills channels, forcing water to spill into surrounding areas and find a new, steeper path. The process is called avulsion. The result, with the proper conditions, is catastrophic flooding and permanent relocation of the river channel.

The goal of the Penn research was to improve prediction of why and where such flooding will occur and to determine how this avulsion process builds deltas and fans over geologic time.

Research was motivated by the Aug. 18, 2008, flooding of the Kosi River fan in northern India, where an artificial embankment was breached and the resulting floodwaters displaced more than a million people. Looking at satellite pictures, scientists from Penn and University of Minnesota Duluth noticed that floodwaters principally filled abandoned channel paths.

Meredith Reitz, lead author of the study and a graduate student in the Department of Physics and Astronomy in Penn’s School of Arts and Sciences, conducted a set of four laboratory experiments to study the avulsion process in detail. Reitz injected a mixture of water and sediment into a bathtub-sized tank and documented the formation and avulsion of river channels as they built a meter-sized delta.

“Reducing the scale of the system allows us to speed up time,” Reitz said. “We can observe processes in the lab that we could never see in nature.”

The laboratory experiments showed flooding patterns that were remarkably similar to the Kosi fan and revealed that flooding and channel relocation followed a repetitive cycle.

One major finding was that the formation of a river channel on a delta followed a random path; however, once a network of channels was formed, avulsion consistently returned flow to these same channels, rather than creating new ones. An additional important finding was that the average frequency of flooding was determined by how long it took to fill a channel with sediment. Researchers constructed a mathematical model incorporating these two ideas, which was able to reproduce the statistical behavior of flooding.

“Avulsions on river deltas and fans are like earthquakes,” said Douglas Jerolmack, director of the Sediment Dynamics Laboratory in the Department of Earth and Environmental Science at Penn and a co-author of the study. “It is impossible to predict exactly where and when they will occur, but we might be able to predict approximately how often they will occur and which areas are most vulnerable. Just as earthquakes occur along pre-existing faults, flooding occurs along pre-existing channel paths. If you want to know where floodwaters will go, find the old channels.”

The authors derived a simple method for estimating the recurrence interval of catastrophic flooding on real deltas. When used in conjunction with satellite images and topographic maps, this work will allow for enhanced flood hazard prediction. Such prediction is needed to protect the hundreds of millions of people who are threatened by flooding on river deltas and alluvial fans. The work could also help in exploration for oil reservoirs, because sandy river channels are an important source of hydrocarbons.
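The paper itself gives the authors' method; as a rough illustration of the underlying idea, that avulsion frequency is set by the time it takes sediment to fill a channel, the recurrence interval can be sketched as channel volume divided by sediment supply. All numbers and the function below are illustrative assumptions, not figures from the study.

```python
# Back-of-the-envelope sketch (not the authors' published formula): if avulsions
# happen once a channel has filled with sediment, the recurrence interval T is
# roughly the channel volume divided by the rate of sediment supply.

def avulsion_recurrence_interval(depth_m, width_m, length_m, sediment_flux_m3_per_yr):
    """Estimate years between avulsions as channel volume / sediment flux."""
    channel_volume = depth_m * width_m * length_m  # m^3 of channel to fill
    return channel_volume / sediment_flux_m3_per_yr

# Example: a 5 m deep, 200 m wide, 50 km long channel reach
# receiving 10 million m^3 of sediment per year.
T = avulsion_recurrence_interval(5.0, 200.0, 50_000.0, 1e7)
print(f"Estimated recurrence interval: {T:.0f} years")  # 5 years
```

Satellite images and topographic maps supply the channel geometry in practice; the sediment flux is the harder number to pin down.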

Astronomers have come across what appear to be two of the earliest and most primitive supermassive black holes known. The discovery, based largely on observations from NASA's Spitzer Space Telescope, will provide a better understanding of the roots of our universe, and how the very first black holes, galaxies and stars all came to be.

"We have found what are likely first-generation quasars, born in a dust-free medium and at the earliest stages of evolution," said Linhua Jiang, a research associate at the University of Arizona's Steward Observatory. Jiang is the lead author on a paper announcing the findings in the March 18 issue of Nature.

Black holes are beastly distortions of space and time. The most massive and active ones lurk at the cores of galaxies, and are usually surrounded by doughnut-shaped structures of dust and gas that feed and sustain the growing black holes. These hungry supermassive black holes are called quasars.

As grimy and unkempt as our universe is today, scientists believe the very early universe didn't have any dust – which tells them that the most primitive quasars should also be dust-free. But nobody had seen such pristine quasars – until now. Spitzer has identified two such immaculate quasars – the smallest quasars on record – about 13 billion light-years away from Earth.

The two quasars, called J0005-0006 and J0303-0019, were first identified by Xiaohui Fan, a UA professor of astronomy who co-authored the paper, along with Jiang and their colleagues, using visible-light data from the Sloan Digital Sky Survey. NASA's Chandra X-ray Observatory had also detected X-rays from one of the objects. X-rays, ultraviolet and optical light stream out from quasars as the gas surrounding them is swallowed.

"As surrounding gas is swallowed by the supermassive black hole, it emits an enormous amount of light, making those quasars detectable literally at the edge of the observable universe," said Fan.

When Jiang and his colleagues set out to observe J0005-0006 and J0303-0019 with Spitzer between 2006 and 2009, their targets didn't stand out much from the usual quasar bunch. Spitzer measured infrared light from the objects along with 18 others, all belonging to a class of the most distant quasars known. Each quasar is anchored by a supermassive black hole weighing more than 100 million suns.

The Spitzer data showed that, of the 20 quasars, J0005-0006 and J0303-0019 lacked characteristic signatures of hot dust. Spitzer's infrared sight makes the space telescope ideally suited to detect the warm glow of dust that has been heated by the feeding black holes.

This is the first observation project to combine data from all three of Spitzer's instruments, including the Multiband Imaging Photometer, or MIPS, a far-infrared camera built at UA's Steward Observatory that gives the Spitzer telescope its ability to see very cold dust.

"The most exciting discovery for us is what we don't see," said Fan, "the dust that typically surrounds all other quasars that have been found so far."

"We think these early black holes are forming around the time when the dust was first forming in the universe, less than one billion years after the Big Bang," Fan added. "The primordial universe did not contain any molecules that could coagulate to form dust. The elements necessary for this process were produced and pumped into the universe later by stars."

The astronomers also observed that the amount of hot dust in a quasar goes up with the mass of its black hole. As a black hole grows, dust has more time to materialize around it. The black holes at the cores of J0005-0006 and J0303-0019 have the smallest measured masses known in the early universe, indicating they are particularly young, and at a stage when dust has not yet formed around them.

Researchers at the University of Rochester's Institute of Optics have discovered a way to make liquid flow vertically upward along a silicon surface, overcoming the pull of gravity, without pumps or other mechanical devices.

In a paper in the journal Optics Express, professor Chunlei Guo and his assistant Anatoliy Vorobyev demonstrate that by carving intricate patterns in silicon with extremely short, high-powered laser bursts, they can get liquid to climb to the top of a silicon chip as if it were being sucked through a straw.

Unlike a straw, though, there is no outside pressure pushing the liquid up; it rises of its own accord. By creating nanometer-scale structures in silicon, Guo greatly increases the attraction that water molecules feel toward it. The hydrophilicity, or attraction to water, of the silicon becomes so great, in fact, that it overcomes the strong bond that water molecules feel for other water molecules.

Thus, instead of sticking to each other, the water molecules climb over one another for a chance to be next to the silicon. (This might seem like getting energy for free, but even though the water rises, thus gaining potential energy, the chemical bonds holding the water to the silicon require a lower energy than the ones holding the water molecules to other water molecules.) The water rushes up the surface at speeds of 3.5 cm per second.
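Taking the one number reported here, a rise speed of 3.5 cm per second, a quick sketch shows how fast the wetting front would cross a chip. The chip height is an illustrative assumption, not a figure from the study, and the calculation assumes the speed stays constant over the climb.

```python
# Sketch using the reported wicking speed on the laser-structured silicon.
WICKING_SPEED_CM_PER_S = 3.5  # rise speed reported in the article

def climb_time_seconds(height_cm):
    """Time for the wetting front to travel a given vertical distance,
    assuming the reported speed holds over the whole climb."""
    return height_cm / WICKING_SPEED_CM_PER_S

# Example: a hypothetical 1.4 cm tall chip edge.
print(f"{climb_time_seconds(1.4):.2f} s to climb 1.4 cm")  # 0.40 s
```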

Yet the laser incisions are so precise and nondestructive that the surface feels smooth and unaltered to the touch.

In a paper a few months ago in the journal Applied Physics Letters, the same researchers proved that the phenomenon was possible with metal, but extending it to silicon could have some important implications. For instance, Guo said, this work could pave the way for novel cooling systems for computers that operate much more effectively, elegantly, and efficiently than currently available options.

"Heat is definitely the number one problem deterring the design of faster conventional processors," said Michael Scott, a professor of computer science at the University, who is not involved in this research.

Computer chips are essentially wafers of silicon covered with billions of microscopic transistors that communicate by sending electrical signals through metal wires that connect them. As technological innovations make it possible to pack astounding numbers of transistors on small pieces of silicon, computer processing speeds could increase substantially; however, the electrical current constantly surging through the chips creates a lot of heat, Scott said. If left unchecked, the heat can melt or otherwise destroy the chip components.

Most computers these days are cooled with fans. Essentially, the air around the circuit components absorbs the heat that is generated and the fan blows that hot air away from the components. The disadvantages of this method are that cold air cannot absorb very much heat before becoming hot, making fans ineffective for faster processors, and fans are noisy.

For these reasons, many companies have been eager to investigate the possibility of using liquid as a coolant instead of air. Liquids can absorb far more heat, and transmit heat much more effectively than air. So far, designers have not created liquid cooling systems that are cost-effective and energy efficient enough to become widely used in economical personal computers. Although Guo's discovery has not yet been incorporated into a prototype, he thinks that silicon that can pump its own coolant has the potential to contribute greatly to the design of future cooling systems.

Global meat production has tripled in the past three decades and could double its present level by 2050, according to a new report on the livestock industry by an international team of scientists and policy experts. The impact of this "livestock revolution" is likely to have significant consequences for human health, the environment and the global economy, the authors conclude.

"The livestock industry is massive and growing," said Harold A. Mooney, co-editor of the two-volume report, Livestock in a Changing Landscape (Island Press). Mooney is a professor of biology and senior fellow at the Woods Institute for the Environment.

"This is the first time that we've looked at the social, economic, health and environmental impacts of livestock in an integrated way and presented solutions for reducing the detrimental effects of the industry and enhancing its positive attributes," he said.

Among the key findings in the report are:

- More than 1.7 billion animals are used in livestock production worldwide and occupy more than one-fourth of the Earth's land.
- Production of animal feed consumes about one-third of total arable land.
- Livestock production accounts for approximately 40 percent of the global agricultural gross domestic product.
- The livestock sector, including feed production and transport, is responsible for about 18 percent of all greenhouse gas emissions worldwide.

Although about 1 billion poor people worldwide derive at least some part of their livelihood from domesticated animals, the rapid growth of commercialized industrial livestock has reduced employment opportunities for many, according to the report. In developing countries, such as India and China, large-scale industrial production has displaced many small, rural producers, who are under additional pressure from health authorities to meet the food safety standards that a globalized marketplace requires.

Beef, poultry, pork and other meat products provide one-third of humanity's protein intake, but the impact on nutrition across the globe is highly variable, according to the report. "Too much animal-based protein is not good for human diets, while too little is a problem for those on a protein-starved diet, as happens in many developing countries," Mooney noted.

While overconsumption of animal-source foods – particularly meat, milk and eggs – has been linked to heart disease and other chronic conditions, these foods remain a vital source of protein and nutrients throughout the developing world, the report said. The authors cited a recent study of Kenyan children that found a positive association between meat intake and physical growth, cognitive function and school performance.

Human health also is affected by pathogens and harmful substances transmitted by livestock, the authors said. Emerging diseases, such as highly pathogenic avian influenza, are closely linked to changes in livestock production but are more difficult to trace and combat in the newly globalized marketplace, they said.

The livestock sector is a major environmental polluter, the authors said, noting that much of the world's pastureland has been degraded by grazing or feed production, and that many forests have been clear-cut to make way for additional farmland. Feed production also requires intensive use of water, fertilizer, pesticides and fossil fuels, added co-editor Henning Steinfeld of the United Nations Food and Agriculture Organization (FAO).

Animal waste is another serious concern. "Because only a third of the nutrients fed to animals are absorbed, animal waste is a leading factor in the pollution of land and water resources, as observed in case studies in China, India, the United States and Denmark," the authors wrote. Livestock's total phosphorus excretions are estimated to be seven to nine times greater than those of humans, with detrimental effects on the environment.
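The mass balance implied by the one-third absorption figure is simple: whatever is not absorbed is excreted. The feed quantity below is an illustrative assumption, not a number from the report.

```python
# Sketch of the nutrient mass balance the report describes: if only about a
# third of nutrients fed to animals are absorbed, the other two-thirds end
# up as waste in land and water.
ABSORBED_FRACTION = 1 / 3  # report: roughly one-third of nutrients absorbed

def excreted_nutrients(fed_tonnes):
    """Nutrients excreted = amount fed minus the fraction absorbed."""
    return fed_tonnes * (1 - ABSORBED_FRACTION)

# Example: of 90 tonnes of a nutrient fed, roughly 60 tonnes become waste.
print(f"{excreted_nutrients(90.0):.1f}")  # 60.0
```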

The beef, pork and poultry industries also emit large amounts of carbon dioxide, methane and other greenhouse gases, Steinfeld said, adding that climate-change issues related to livestock remain largely unaddressed. "Without a change in current practices, the intensive increases in projected livestock production systems will double the current environmental burden and will contribute to large-scale ecosystem degradation unless appropriate measures are taken," he said.

The report concludes with a review of various options for introducing more environmentally and socially sustainable practices to animal production systems.

"We want to protect those on the margins who are dependent on a handful of livestock for their livelihood," Mooney said. "On the other side, we want people engaged in the livestock industry to look closely at the report and determine what improvements they can make."

One solution is for countries to adopt policies that provide incentives for better management practices that focus on land conservation and more efficient water and fertilizer use, he said.

But calculating the true cost of meat production is a daunting task, Mooney added. Consider the piece of ham on your breakfast plate, and where it came from before landing on your grocery shelf. First, take into account the amount of land used to rear the pig. Then factor in all the land, water and fertilizer used to grow the grain to feed the pig and the associated pollution that results.

Finally, consider that while the ham may have come from Denmark, where there are twice as many pigs as people, the grain to feed the animal was likely grown in Brazil, where rainforests are constantly being cleared to grow more soybeans, a major source of pig feed.

"So much of the problem comes down to the individual consumer," said co-editor Fritz Schneider of the Swiss College of Agriculture (SHL). "People aren't going to stop eating meat, but I am always hopeful that as people learn more, they do change their behavior. If they are informed that they do have choices to help build a more sustainable and equitable world, they can make better choices."

It might sound like a mashup of monster movies, but palaeontologists have discovered evidence of how an extinct shark attacked its prey, reconstructing a killing that took place 4 million years ago.

Such fossil evidence of behaviour is incredibly rare, but by careful, forensic-style analysis of bite marks on an otherwise well-preserved dolphin skeleton, the research team, based in Pisa, Italy, have reconstructed the events that led to the death of the dolphin, and determined the probable identity of the killer: a 4 m shark by the name of Cosmopolitodus hastalis.

The evidence, published in the latest issue of the journal Palaeontology, comes from the fossilised skeleton of a 2.8 m long dolphin discovered in the Piedmont region of northern Italy.

According to Giovanni Bianucci, who led the study: "the skeleton lay unstudied in a museum in Torino for more than a century, but when I examined it, as part of a larger study of fossil dolphins, I noticed the bite marks on the ribs, vertebrae and jaws. Identifying the victim of the attack was the easy part - it's an extinct species of dolphin known as Astadelphis gastaldii – working out the identity of the killer called for some serious detective work, as the only evidence to go on was the bite marks."

The overall shape of the bite indicated a shark attack, and Bianucci called in fossil shark expert Walter Landini. "The smoothness of the bite marks on the ribs clearly shows that the teeth of whatever did the biting were not serrated, and that immediately ruled out some possibilities. We simulated bite marks of the potential culprits and, by comparing them with the shape and size of the marks on the fossils, we narrowed it down to Cosmopolitodus hastalis." Circumstantial evidence also supports this verdict: fossil teeth from Cosmopolitodus are common in the rock sequences that the dolphin was found in. "From the size of the bite, we reckon that this particular shark was about 4 m long" says Landini.

Detailed analysis of the bite pattern allowed the researchers to go even further. "The deepest and clearest incisions are on the ribs of the dolphin," says Bianucci, "indicating the shark attacked from below, biting into the abdomen. Caught in the powerful bite, the dolphin would have struggled, and the shark probably detached a large amount of flesh by shaking its body from side to side. The bite would have caused severe damage and intense blood loss, because of the dense network of nerves, blood vessels and vital organs in this area. Then, already dead or in a state of shock, the dolphin rolled onto its back, and the shark bit again, close to the fleshy dorsal fin."

The study is significant because of the rarity of such 'fossilized behaviour'. According to Dr Kenshu Shimada, fossil shark expert at DePaul University and the Sternberg Museum of Natural History in the US, "studies like this are important because they give us a glimpse of the ecological interactions between organisms in prehistoric seas. Shark teeth are among the most common vertebrate remains in the fossil record, yet interpreting the details of diet and feeding behaviour of extinct sharks is extremely difficult. Fossil remains of prey species with shark bite marks, like those described by Bianucci and his team, provide direct evidence of what each prehistoric shark ate and how it behaved."

Cryotherapy, an interventional radiology treatment to freeze cancer tumors, may become the treatment of the future for cancer that has metastasized in soft tissues (such as ovarian cancer) and in bone tumors. Such patients are often not candidates for surgery and would benefit from minimally invasive treatment, say researchers at the Society of Interventional Radiology's 35th Annual Scientific Meeting in Tampa, Fla.

"Improved treatment options are needed for individuals affected by metastases in bone and soft tissues since patients with multifocal metastatic disease are often not candidates for surgery," said Peter J. Littrup, M.D., an interventional radiologist and director of imaging research and image-guided therapy for the Barbara Ann Karmanos Cancer Institute in Detroit, Mich. "Percutaneous soft tissue cryotherapy is a well-tolerated treatment option, especially for patients with anesthesia risks, painful lesions or those seeking local control during chemotherapy. Tumor size and/or location do not preclude thorough treatment or pose greater risk with appropriate precautions," added Littrup, who is also a professor of radiology, urology and radiation oncology at Wayne State University in Detroit. In the 97-patient study, researchers used sufficiently lethal temperatures to kill tumor cells, resulting in an average of 77 percent tumor shrinkage in patients after 24 months. "Because of the variable placement of tumors within these soft tissue and bone locations, this study shows the versatility of this treatment option when using proper techniques to safeguard nearby structures. Aside from the successful tumor control, what makes this technique even more preferable is the excellent tumor shrinkage since the underlying fibrous or collagenous structures are preserved. The body can then better heal at the ablation (removal) site with minimal additional scar tissue formation," said Littrup.

Last year, it was estimated that 1.5 million new cases of cancer were diagnosed, and up to 85 percent of patients who have breast, prostate or lung cancer have bone metastases at the time of death. Additionally, 5 percent of all cancers also spread to the skin. Based on these numbers, conservative estimates suggest that up to 500,000 of these newly diagnosed cancer patients alone will suffer from metastatic disease in bone and soft tissue areas. Cryotherapy is a good option for a large – but perhaps under-recognized – problem, and could have an outsized impact. Namely, the original cancer tumor site (or even a few unresponsive tumor sites) can still cause cancer management problems even after a generally good response to chemotherapy and/or radiation therapy, said Littrup. "Metastasized tumors can occur nearly anywhere in the body and frequently cannot receive additional radiation therapy or would be difficult or very morbid to be controlled with surgery," said Littrup. "Cryotherapy was able to preserve quality of life by providing good local treatment with minimal side effects, especially with advanced stages of cancer where any additional treatment is unlikely to provide a systemic cure," he added. However, cryotherapy is not a first-line therapy for tumor treatment. Despite "superb" cryotherapy outcomes for many tumor types and locations, medical insurance may not cover the treatment, said Littrup.

Historically, cryoablation has been performed on the prostate and liver, but this technique has recently been found effective in other tumors including the breast, kidney and lung. "We simply translated this concept to retroperitoneal, intraperitoneal, superficial and bone locations in order to generate successful use of cryotherapy in different patient groups," said Littrup. The major benefits of cryotherapy are its superb visualization of the ice treatment zone during the procedure, its low pain profile in an outpatient setting and its excellent healing with minimal scarring, said Littrup. In this study's cryotherapy treatment, researchers used several needle-like cryoprobes that were inserted through the skin to deliver extremely cold gas directly to a tumor to freeze it. This technique has been used for many years by surgeons in the operating room; however, in the last few years, the needles have become small enough to be used by interventional radiologists through a small nick in the skin, without the need for an operation. The "ice ball" that is created around the needle grows in size and destroys the frozen tumor cells. Surgeons and radiation oncologists have long tried to provide at least a 1-centimeter margin of treatment with cancer tumors, and it was important to assure a similar "surgical margin" of lethal temperatures beyond all tumor margins by cryotherapy in this study, said Littrup.
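The margin rule described here has a simple geometric consequence: if the lethal ice zone must extend about 1 centimeter beyond every edge of the tumor, the minimum lethal-zone diameter is the tumor diameter plus twice the margin. The function and the example tumor size below are illustrative, not taken from the study.

```python
# Sketch of the "surgical margin" rule: lethal temperatures should reach
# at least a set margin beyond the tumor edge on all sides, so the minimum
# lethal-zone diameter is tumor diameter plus twice the margin.

def min_ice_ball_diameter_cm(tumor_diameter_cm, margin_cm=1.0):
    """Smallest lethal-zone diameter that gives the margin on every side."""
    return tumor_diameter_cm + 2 * margin_cm

# Example: a hypothetical 3 cm tumor needs a 5 cm zone of lethal ice.
print(min_ice_ball_diameter_cm(3.0))  # 5.0
```

In practice this is what drives the choice of probe number and spacing: multiple cryoprobes are placed so that their merged ice ball covers the tumor plus the margin.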

"One of our first soft tissue cryotherapy patients with recurrent ovarian cancer encouraged us to really begin this study," said Littrup. This patient has now undergone seven cryoablation procedures over the last five years in combination with only a few additional cycles of chemotherapy when more than one to two recurrences were noted, he said. "She called cryotherapy a major 'holiday' from chemotherapy and has been one of our big advocates, referring many other ovarian cancer patients with isolated recurrences," said Littrup.

Interventional radiologists are leaders in percutaneous cryotherapy because it requires interventional skills and a thorough understanding of cross-sectional imaging (US, CT, MRI), and IRs are the only physicians who have this rigorously trained skill set combination, said Littrup. Continued study is needed to determine the optimum probe number, spacing and freeze times needed to produce thorough ice coverage of all soft tissue tumors, he said. "With recent developments of powerful new cryotechnology, multiple directions for soft tissue cryotherapy can be pursued, including translating the current, somewhat challenging, procedure done with ultrasound and/or CT guidance to a more consistent and reproducible MR-guided approach," said Littrup. Cryotechnology promises to be more MR-compatible and would also allow accurate targeting of more difficult-to-see tumors. More importantly, larger studies in multiple centers need to be done, following these basic cryobiology principles of sufficient lethal temperatures generated by multiple cryoprobes spaced evenly throughout a cancer region, he added.