Thursday, October 8, 2009

Indiana University's Karl MacDorman has been to the valley -- the uncanny valley of virtual humans so lifelike they give us real humans the creeps. What he's found is that things don't look so bad after all.

That's because MacDorman's research into human photorealism in robots, androids and computer-generated characters is calling into question a longstanding premise put forth by the pioneering Japanese roboticist Masahiro Mori that people become unsettled by any slight nonhuman imperfection in very human-looking forms. MacDorman is an IU School of Informatics associate professor in the Human-Computer Interaction program and the director of the Indiana University-Purdue University Indianapolis Android Science Center.

Nearly 40 years ago Mori identified the uncanny valley as that place where zombies, corpses and dancing mannequins reside. Developing impeccably humanlike computer-generated characters that will not fall into the uncanny valley is the Holy Grail for animators in the multi-billion dollar animation and video game industries.

And what MacDorman has found is that it is no longer your parents' art school.

"What is surprising is that some of the rules you learn in art school do not apply to computer graphics characters," he said. "For example, artists will typically enlarge the eyes of someone they paint to make the person look more attractive. In fact, artists tend to make their own eyes larger in self-portraits. But our results indicate that eye enlargement can have a decidedly negative effect when photorealistic textures are applied to a human model."

The same holds true when distorting other facial proportions, such as eye separation and face height: the changes have a more negative impact on very realistic human character models. So caricaturing can be effective for abstract faces, like the ones in Pixar's movie The Incredibles, but fails for human-looking faces like those in The Polar Express, a film released the same year that also relied on computer-generated imagery.

"The Disney animation of Cinderella's stepmother can be horrid without being eerie, and Tom Hanks can be warm in Polar Express but still eerie," MacDorman said. "So our group is devising new scales that can deal with these nuances." MacDorman has developed the first "eeriness index" to help quantify how virtual characters influence human decision making.

"The index has been developed based on a pilot study, but we want to refine it further in a larger study on animated characters and robots," he said.

Developing design principles for human-looking characters, MacDorman believes, would not only have a huge economic impact on computer graphic animation and video games, but could also play an integral role in developing long-term beneficial relationships between robots and humans.

"Interventions could range from the treatment of autism to rehabilitation coaching to companionship for the elderly," he said. "One goal is to create robots and computer generated characters that are capable of interacting with people in humanlike ways to sustain long-term relationships that improve people's physical, cognitive and social well-being."

Research results to date reflect subjective measures gathered from human participants who viewed the characters, but MacDorman and his team plan to use results from objective sources like heart rate, galvanic skin response, electroencephalograms and functional magnetic resonance imaging of the brain to gauge how far designers can go before characters creep out viewers.

His latest published research, "Too real for comfort? Uncanny responses to computer generated faces," (appearing in Computers in Human Behavior, and co-written with Robert D. Green, Chin-Chang Ho and Clinton T. Koch, all of the IU School of Informatics), called into question Mori's predictions that a computer generated face looks the eeriest when it looks nearly human.

To view a visualization of two different, side-by-side computer graphic characters as they undergo slight changes in skin and eye realism, visit the Uncanny Valley.

MacDorman's upcoming research with his colleague Matthias Scheutz turns to how the virtual versus physical representation of a character influences ethical decisions made in online and face-to-face interactions. Scheutz is an associate professor of Cognitive Science, Computer Science, and Informatics and the director of the Human-Robot Interaction Laboratory at Indiana University Bloomington.

"The purpose of one of these upcoming proposals is to determine whether the physical embodiment or virtual representation of a robot can influence human decision-making of ethical consequence," MacDorman explained.

And a research paper now in review, "Gender Differences in the Impact of Presentational Factors in Human Character Animation on Ethical Decisions about Medical Dilemmas," indicates that under otherwise identical conditions men and women make different decisions.

MacDorman, Chin-Chang Ho, Joseph Coram and Himalaya Patel found that using a computer-generated character instead of a human character, or using jerky movements instead of fluid movements, to present participants with an ethical dilemma produced no significant effect on female participants. Male participants, however, were much more likely to rule against the computer-generated character with jerky movements.

"From the standpoint of studying human interaction, it is important to get into and out of the uncanny valley. If you build a robot or computer-generated character that looks human, but you do not animate it well -- if its motion looks jerky or the timing of its responses is off -- it will look uncanny, no matter how realistic and attractive it looks when still," MacDorman said. "But this is useful because it indicates when the cognitive model I am testing in an android, or the artificial intelligence used to control a character in a computer game, is not working. The fact that we are so sensitive to flaws in the behavior of human-looking characters means that we know when our models of human interaction are wrong, so we can correct them. This is the beauty of human photorealism. Because if a cartoon character or a less human-looking robot does something that isn't human, we might not even notice it."

In subzero sediments off the island of Spitsbergen, scientists from the German Max Planck Institute for Marine Microbiology have detected high numbers of thermophilic (heat-loving) bacteria that are adapted to live in much warmer habitats. These thermophiles exist in the Arctic as spores - dormant forms that withstand adverse conditions for long periods, waiting for better times. Experimental incubations at 40 to 60 degrees Celsius revive the Arctic spores, which appear to have been transported from distant hot spots. The discovery could shed new light on one of microbiology’s great hypotheses: "Everything is everywhere, but, the environment selects."

Research into temperature adaptations of psychrophilic (cold-loving) bacteria in Spitsbergen’s permanently cold fjords has made a new breakthrough. Biological activity was measured by incubating sediment samples with labelled substrate at increasing temperatures. The scientists were impressed to see the activity increase dramatically above 40 degrees Celsius. Some dormant spores had apparently come back to life.

The results presented a unique opportunity to study misplaced microbes in a quantitative way. Using metabolic rate measurements, the researchers estimated that a single gram of the Arctic sediment contains up to 100,000 thermophilic spores. This abundance combined with the unusual location is what Max Planck Director Prof. Bo Barker Jørgensen finds exciting: "What is novel here is not the discovery of thermophiles in the Arctic, but demonstrating their high numbers and constant rate of supply." By measuring the sediment accumulation rate, the team calculated an annual deposition of 100 million thermophiles per square metre of the seabed.
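The two reported numbers can be combined in a quick back-of-envelope consistency check. The sketch below uses only the article's figures (100,000 spores per gram; 100 million spores per square metre per year); the bulk density is a hypothetical round value added for illustration, not a number from the study:

```python
# Back-of-envelope check of the reported spore numbers.
spores_per_gram = 1.0e5   # up to 100,000 thermophilic spores per gram of sediment
annual_flux = 1.0e8       # 100 million spores deposited per m^2 of seabed per year

# Implied mass accumulation rate of sediment, in grams per m^2 per year
mass_accumulation = annual_flux / spores_per_gram
print(f"Implied sediment accumulation: {mass_accumulation:.0f} g/m^2/yr")

# With an assumed bulk density of ~1 g/cm^3 (illustrative only), that mass
# spread over one square metre (10,000 cm^2) is a layer about 1 mm thick.
bulk_density = 1.0  # g/cm^3, hypothetical
thickness_mm = (mass_accumulation / bulk_density) / 10_000 * 10
print(f"Roughly {thickness_mm:.1f} mm of new sediment per year")
```

At the assumed density, the figures imply on the order of a millimetre of sediment deposited per year, a plausible magnitude for fjord sediments.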

So, where are the Arctic thermophiles coming from? Lead author Casey Hubert narrows down the possibilities: "The large and steady flux of anaerobic bacteria indicates that they are coming from a huge anoxic (free of oxygen) source." Transport pathways connecting these hot spots to the cold ocean must also exist. One candidate the researchers suggest is fluid circulation through spreading ridges, where the ocean crust forms and "black smokers" and other hydrothermal vents occur, since bacteria from these systems are genetically similar to the Arctic thermophiles. Another source could be deep, hot sub-marine oil reservoirs where gas and oil leak upwards, eventually penetrating the sea floor. "The genetic similarities to bacteria from hot North Sea oil reservoirs are striking," adds Dr. Hubert. The scientists hope further experiments and genetic forensics will reveal the warm source. The spores might provide a unique opportunity to trace seepages from the hot subsurface, possibly pointing towards undiscovered offshore petroleum deposits.

In the meantime, the findings provide fresh insight for understanding marine biodiversity and the "hidden rare biosphere." Obscured by the major bacterial groups in a given environment are countless minorities that do not contribute to element cycling in any detectable way. Microbiologists continue to puzzle over how bacteria spread out to establish the vast microbial diversity that is measured in nature. The thermophilic spores appear to hold important clues about this riddle of biogeography, even as they sit dormant in the cold Arctic sediment, waiting in vain for better times.

Challenging conventional wisdom, new research finds that the number of sunspots provides an incomplete measure of changes in the Sun's impact on Earth over the course of the 11-year solar cycle. The study, led by scientists at the High Altitude Observatory of the National Center for Atmospheric Research (NCAR) and the University of Michigan, finds that Earth was bombarded last year with high levels of solar energy at a time when the Sun was in an unusually quiet phase and sunspots had virtually disappeared.

"The Sun continues to surprise us," says NCAR scientist Sarah Gibson, the lead author. "The solar wind can hit Earth like a fire hose even when there are virtually no sunspots."

The study, also written by scientists at NOAA and NASA, is being published today in the Journal of Geophysical Research - Space Physics. It was funded by NASA and by the National Science Foundation, NCAR's sponsor.

Scientists for centuries have used sunspots, which are areas of concentrated magnetic fields that appear as dark patches on the solar surface, to determine the approximately 11-year solar cycle. At solar maximum, the number of sunspots peaks. During this time, intense solar flares occur daily and geomagnetic storms frequently buffet Earth, knocking out satellites and disrupting communications networks.

Gibson and her colleagues focused instead on another process by which the Sun discharges energy. The team analyzed high-speed streams within the solar wind that carry turbulent magnetic fields out into the solar system.

When those streams blow by Earth, they intensify the energy of the planet's outer radiation belt. This can create serious hazards for weather, navigation, and communications satellites that travel at high altitudes within the outer radiation belts, while also threatening astronauts in the International Space Station. Auroral storms light up the night sky repeatedly at high latitudes as the streams move past, driving mega-ampere electrical currents about 75 miles above Earth's surface. All that energy heats and expands the upper atmosphere. This expansion pushes denser air higher, slowing down satellites and causing them to drop to lower altitudes.

Scientists previously thought that the streams largely disappeared as the solar cycle approached minimum. But when the study team compared measurements within the current solar minimum interval, taken in 2008, with measurements of the last solar minimum in 1996, they found that Earth in 2008 was continuing to resonate with the effects of the streams. Although the current solar minimum has fewer sunspots than any minimum in 75 years, the Sun's effect on Earth's outer radiation belt, as measured by electron fluxes, was more than three times greater last year than in 1996.

Gibson said that observations this year show that the winds have finally slowed, almost two years after sunspots reached the levels of last cycle's minimum.

The authors note that more research is needed to understand the impacts of these high-speed streams on the planet. The study raises questions about how the streams might have affected Earth in the past when the Sun went through extended periods of low sunspot activity, such as a period known as the Maunder minimum that lasted from about 1645 to 1715.

"The fact that Earth can continue to ring with solar energy has implications for satellites and sensitive technological systems," Gibson says. "This will keep scientists busy bringing all the pieces together."

For the new study, the scientists analyzed information gathered from an array of space- and ground-based instruments during two international scientific projects: the Whole Sun Month in the late summer of 1996 and the Whole Heliosphere Interval in the early spring of 2008. The solar cycle was at a minimal stage during both the study periods, with few sunspots in 1996 and even fewer in 2008.

The team found that strong, long, and recurring high-speed streams of charged particles buffeted Earth in 2008. In contrast, Earth encountered weaker and more sporadic streams in 1996. As a result, the planet was more affected by the Sun in 2008 than in 1996, as measured by such variables as the strength of electron fluxes in the outer radiation belt, the velocity of the solar wind in the vicinity of Earth, and the periodic behavior of auroras (the Northern and Southern Lights) as they responded to repeated high-speed streams.

The prevalence of high-speed streams during this solar minimum appears to be related to the current structure of the Sun. As sunspots became less common over the last few years, large coronal holes lingered on the surface of the Sun near its equator. The high-speed streams that blow out of those holes engulfed Earth during 55 percent of the study period in 2008, compared to 31 percent of the study period in 1996. A single stream of charged particles can last for as long as 7 to 10 days. At their peak, the accumulated impact of the streams during one year can inject as much energy into Earth's environment as massive eruptions from the Sun's surface can during a year at the peak of a solar cycle, says co-author Janet Kozyra of the University of Michigan.
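As a rough illustration of what those coverage fractions imply, the figures can be bracketed with the article's 7-to-10-day stream duration. The one-year window below is an assumption for illustration only; the study's actual observing intervals were shorter campaigns:

```python
# Illustrative estimate: if streams engulfed Earth 55% of the time in 2008
# versus 31% in 1996, and a single stream lasts 7-10 days, roughly how many
# separate streams would that imply over an assumed one-year window?
window_days = 365  # assumed, for illustration only
coverage = {1996: 0.31, 2008: 0.55}

for year, frac in sorted(coverage.items()):
    covered_days = frac * window_days
    n_low = covered_days / 10   # if every stream lasted the full 10 days
    n_high = covered_days / 7   # if every stream lasted only 7 days
    print(f"{year}: ~{covered_days:.0f} days under streams, "
          f"roughly {n_low:.0f}-{n_high:.0f} separate streams")
```

The point of the sketch is scale, not precision: the 2008 coverage corresponds to a few dozen distinct stream passages, nearly twice as many as the 1996 figure would allow.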

The streams strike Earth periodically, spraying out in full force like water from a fire hose as the Sun revolves. They have their strongest effect when the magnetic fields in the solar wind point in a direction opposite to the magnetic field lines in Earth's magnetosphere. The strength and speed of the magnetic fields in the high-speed streams can also affect Earth's response.

The authors speculate that the high number of low-latitude coronal holes during this solar minimum may be related to a weakness in the Sun's overall magnetic field. The Sun in 2008 had smaller polar coronal holes than in 1996, but high-speed streams that escape from the Sun's poles do not travel in the direction of Earth.

"The Sun-Earth interaction is complex, and we haven't yet discovered all the consequences for the Earth's environment of the unusual solar winds this cycle," Kozyra says. "The intensity of magnetic activity at Earth in this extremely quiet solar minimum surprised us all. The new observations from last year are changing our understanding of how solar quiet intervals affect the Earth and how and why this might change from cycle to cycle."

A person, usually a child, dies of rabies every 20 minutes. However, a single inoculation may be all it takes to vaccinate against rabies, according to new research published in the Journal of Infectious Diseases by researchers at the Jefferson Vaccine Center.

A replication-deficient rabies virus vaccine that lacks a key gene called the matrix (M) gene induced a rapid and efficient anti-rabies immune response in mice and non-human primates, according to James McGettigan, Ph.D., assistant professor of Microbiology and Immunology at Jefferson Medical College of Thomas Jefferson University.

“The M gene is one of the central genes of the rabies virus, and its absence inhibits the virus from completing its life cycle,” Dr. McGettigan said. “The virus in the vaccine infects cells and induces an immune response, but the virus is deficient in spreading.”

The immune response induced with this process is so substantial that a single inoculation may be sufficient, according to Dr. McGettigan. In addition, the vaccine appears to be efficient in both pre-exposure and post-exposure settings.

Currently, the World Health Organization standard for rabies infection is post-exposure prophylaxis. The complex regimen in the United States requires six different shots over 28 days: five of the rabies vaccine and one of rabies immunoglobulin.

The current standard vaccine is made from inactivated rabies virus, whereas the experimental vaccine is made from a live rabies virus. The virus is modified by removing the M gene, thus inhibiting its spread within the vaccine recipient.

Worldwide, the annual number of rabies-related deaths is estimated to be 40,000 to 70,000. The disease is endemic in developing areas, where the six-shot post-exposure regimen is not feasible for many people due to cost and availability. According to the World Health Organization, approximately 10 million people worldwide receive the post-exposure regimen, which presents a financial burden to both industrialized and developing countries.

“Developing countries do not have the resources to vaccinate people six times after exposure, so many of these 10 million do not receive the full regimen,” Dr. McGettigan said. “Therefore, simpler and less expensive vaccine regimens are needed. The alternative may also be to treat people pre-exposure, as they are with many of the current vaccines used. Although our vaccine was tested primarily to be a post-exposure vaccine, the data we collected show it would be effective as a pre-exposure vaccine as well.”

Anyone who has visited the ancient ruins of great civilizations can appreciate the difficulty of visualizing the buildings at their peak. Today's visitor to the British Museum can see structures of the Aztecs, thanks to one professor's research into the ancient architecture that served as the center stage of Aztec ceremonial life, combined with an ultra-modern electronic digital modeling process.

Antonio Serrato-Combe, professor of architecture at the University of Utah, has spent decades bringing the ancient structures of the Aztecs into focus. His work is now the basis for a new British Museum exhibition exploring the power and empire of the last elected Aztec Emperor, Moctezuma II. The exhibit, Moctezuma: Aztec Ruler, opens September 24 at the Round Reading Room, The British Museum, London WC1B 3DG, UK.

Moctezuma (reigned 1502-1520) was a ruler of semi-mythical status. He inherited and then consolidated Aztec control over a politically complex empire that by the early 16th century stretched from the shores of the Pacific to the Gulf of Mexico. His major accomplishment was the construction of the Templo Mayor Precinct in Tenochtitlan, Mexico (modern-day Mexico City). Destroyed by Hernando Cortes in 1521, the Templo Mayor was the epicenter of Aztec ceremonial life and served as the setting for colorful displays of highly energized rituals depicting the relationships between social groups, and between humans and their gods.

The question of what the Aztec Templo Mayor Precinct looked like has piqued the curiosity of many, including Serrato-Combe. For more than two decades, he has been trying to solve the mystery of how the capital of the Aztecs looked by using the technology and tools of architecture. His book, The Aztec Templo Mayor: A Visualization, was published in 2002 by the University of Utah Press.

"The Aztec capital was a thriving metropolis planned and built according to principles that not only understood and applied critical environmental issues, but added holistic concepts as well," explains Serrato-Combe. "The Aztecs did not compartmentalize the arts. The final result was a unique combination of architecture, sculpture, painting, costume, wall and sand painting, pottery, masks, amulets, all into one expression. I envy those individuals who had the opportunity to experience those environments."

Serrato-Combe's research and visualizations are grounded in historic and archaeological studies conducted on-site in Mexico City, in conjunction with extensive research on Mesoamerican manuscripts at the National Library and the Museo Nacional de Antropologia in Mexico City, Dumbarton Oaks, Harvard University, and the Harold B. Lee Library at Brigham Young University, among others. The research itself took more than two decades, due to the complexity and diverse nature of the historic and archeological record.

More involved than the research, however, was the question of how to visualize the discoveries. A self-proclaimed computer geek, Serrato-Combe combined his two passions of research and computer graphics into an illustrated book at the suggestion of a student. He said, "One day, after one of my history classes here at the University of Utah, one of my students remarked, 'Since you know so much about pre-Columbian architecture and you also seem to be a computer geek, why don't you combine both disciplines and come up with a book that uses digital tools to illustrate the past?'" The rest is history.

Through his project, Serrato-Combe has become the authority at the U on digital visualization techniques and now teaches architecture students the basics of an integral tool in architecture. "Digital tools in architecture are unique in that they provide a communication channel where a student does or proposes something and the computer responds," he says. "The conversation between student and machine triggers a variety of actions that eventually make the academic experience more exciting and fruitful."

The digital modeling process for the exhibit this month began by simulating structures based on historical accounts and current archaeological data including satellite imagery. Once a highly complex drawing-layer system was set, a solid model was constructed that determined the overall dimensions for the most important structures that archaeologists have been able to uncover to date. Some sections, including the base of the largest temple within the precinct, are still visible today.

A huge flesh-eating eagle that became extinct in New Zealand only 500 years ago was an efficient hunter that could attack prey 10 times its size, UNSW research has found, lending credibility to a Maori legend of a giant man-eating bird.

Research from UNSW’s School of Medical Sciences and NZ’s Canterbury Museum has confirmed that the Haast’s eagle – which had a wingspan of up to three metres and claws the size of a tiger’s – was indeed a predator and not a scavenger as previously thought.

Skeletal remains of the giant eagle (Harpagornis moorei) were first uncovered by Sir Julius von Haast in the 1870s. CAT scan re-examinations of the remains by Professor Ken Ashwell, from UNSW’s Department of Anatomy, and a colleague at Canterbury Museum in Christchurch, revealed that the bird had a strong enough pelvis to support a killing blow as it dived at speeds of up to 80kph.

A disproportionately small brain and limited olfactory and optic capacity in the Haast’s eagle also support the theory that the giant bird evolved from a much smaller ancestor, most likely a genus of raptors that includes the modern-day little eagle and the booted eagle.

The rapid growth in body size was likely due to the abundance of large prey, particularly the moa, a flightless bird that grew up to 250 kg and 2.5 metres tall.

Maori legend refers to a huge black-and-white predator – the Te Hokioi – that was capable of killing a man.

“That might be stretching things, but it was certainly capable of carrying off a child,” Professor Ashwell said.

A new study led by the University of Colorado at Boulder indicates most of the world's low-lying river deltas are sinking from human activity, making them increasingly vulnerable to flooding from rivers and ocean storms and putting tens of millions of people at risk.

While the 2007 Intergovernmental Panel on Climate Change report concluded many river deltas are at risk from sea level rise, the new study indicates other human factors are causing deltas to sink significantly. The researchers concluded the sinking of deltas from Asia and India to the Americas is exacerbated by the upstream trapping of sediments by reservoirs and dams, man-made channels and levees that whisk sediment into the oceans beyond coastal floodplains, and the accelerated compacting of floodplain sediment caused by the extraction of groundwater and natural gas.

The study concluded that 24 out of the world's 33 major deltas are sinking and that 85 percent experienced severe flooding in recent years, resulting in the temporary submergence of roughly 100,000 square miles of land. About 500 million people in the world live on river deltas.

Published in the Sept. 20 issue of Nature Geoscience, the study was led by CU-Boulder Professor James Syvitski, who is directing a $4.2 million effort funded by the National Science Foundation to model large-scale global processes on Earth like erosion and flooding. Known as the Community Surface Dynamics Modeling System, or CSDMS, the effort involves hundreds of scientists from dozens of federal labs and universities around the nation.

The Nature Geoscience authors predict that global delta flooding could increase by 50 percent under current projections of about 18 inches in sea level rise by the end of the century as forecast by the 2007 Intergovernmental Panel on Climate Change report. The flooding will increase even more if the capture of sediments upstream from deltas by reservoirs and other water diversion projects persists and prevents the growth and buffering of the deltas, according to the study.

"We argue that the world's low-lying deltas are increasingly vulnerable to flooding, either from their feeding rivers or from ocean storms," said CU-Boulder Research Associate Albert Kettner, a co-author on the study at CU-Boulder's Institute of Arctic and Alpine Research and member of the CSDMS team. "This study shows there are a host of human-induced factors that already cause deltas to sink much more rapidly than could be explained by sea level alone."

Other study co-authors include CU-Boulder's Irina Overeem, Eric Hutton and Mark Hannon, G. Robert Brakenridge of Dartmouth College, John Day of Louisiana State University, Charles Vorosmarty of City College of New York, Yoshiki Saito of the Geological Survey of Japan, Liviu Giosan of the Woods Hole Oceanographic Institution and Robert Nichols of the University of Southampton in England.

The team used satellite data from NASA's Shuttle Radar Topography Mission, which carried a bevy of radar instruments that swept more than 80 percent of Earth's surface during a 12-day mission of the space shuttle Endeavour in 2000. The researchers compared the SRTM data with historical maps published between 1760 and 1922.

"Every year, about 10 million people are being affected by storm surges," said CU-Boulder's Overeem, also an INSTAAR researcher and CSDMS scientist. "Hurricane Katrina may be the best example that stands out in the United States, but flooding in the Asian deltas of the Irrawaddy in Myanmar and the Ganges-Brahmaputra in India and Bangladesh has recently claimed thousands of lives as well."

The researchers predict that similar disasters could potentially occur in the Pearl River delta in China and the Mekong River delta in Vietnam, where thousands of square miles are below sea level and the regions are hit by periodic typhoons.

"Although humans have largely mastered the everyday behaviour of lowland rivers, they seem less able to deal with the fury of storm surges that can temporarily raise sea level by three to 10 meters (10 to 33 feet)," wrote the study authors. "It remains alarming how often deltas flood, whether from land or from sea, and the trend seems to be worsening."

"We are interested in how landscapes and seascapes change over time, and how materials like water, sediments and nutrients are transported from one place to another," said Syvitski, a geological sciences professor at CU-Boulder. "The CSDMS effort will give us a better understanding of Earth and allow us to make better predictions about areas at risk from phenomena like deforestation, forest fires, land-use changes and the impacts of climate change."

If you find yourself more concerned about highly publicized dangers that grab your immediate attention, such as terrorist attacks, while forgetting about more mundane threats, such as global warming, you're not alone.

And you can't help it because it's human nature, according to a new study led by University of Colorado at Boulder psychology Professor Leaf Van Boven. That's because people tend to view their immediate emotions, such as their perceptions of threats or risks, as more intense and important than their previous emotions.

In one part of the study focusing on terrorist threats, using materials adapted from the U.S. Department of Homeland Security, Van Boven and his research colleagues presented two scenarios to people in a college laboratory depicting warnings about traveling abroad to two countries.

Participants were then asked to report which country seemed to have greater terrorist threats. Many of them reported that the country they last read about was more dangerous.

"What our study has shown is that when people learn about risks, even in very rapid succession where the information is presented to them in a very clear and vivid way, they still respond more strongly to what is right in front of them," Van Boven said.

With that in mind, Van Boven says one of the take-home messages from the study is that when communicating to the public, people must be mindful of how and when they publicize threats, which is a tall task in the around-the-clock news cycle of today.

"Whatever the threat of the season is can 'crowd out' concern about other threats even if those other threats are actually more dangerous," Van Boven said. "Because we are so emotionally influenced when it comes to assessing and reacting to threats, we may ignore very dangerous threats that happen not to be very emotionally arousing."

Human emotions stem from a very old system in the brain, Van Boven says. When it comes to reacting to threats, real or exaggerated, it goes against the grain of thousands of years of evolution to just turn off that emotional reaction. It's not something most people can do, he said.

"And that's a problem, because people's emotions are fundamental to their judgments and decisions in everyday life," Van Boven said. "When people are constantly being bombarded by new threats or things to be fearful of, they can forget about the genuinely big problems, like global warming, which really need to be dealt with on a large scale with public support."

In today's 24-hour society, talk radio, the Internet and extensive media coverage of the "threat of the day" only exacerbate the trait of focusing on our immediate emotions, he said.

"One of the things we know about how emotional reactions work is they are not very objective, so people can get outraged or become fearful of what might actually be a relatively minor threat," Van Boven said. "One worry is some people are aware of these kinds of effects and can use them to manipulate our actions in ways that we may prefer to avoid."

The study, which involved undergraduate students as subjects, was published in the August issue of the Journal of Experimental Psychology: General. Michaela Huber, a doctoral student of psychology and neuroscience at CU-Boulder, and Assistant Professor Katherine White of the University of Calgary co-authored the study.

Van Boven said the study would be of particular interest to policymakers.

"If you're interested in having an informed citizenry, you tell people about all the relevant risks, but what our research shows is that that is not sufficient, because those things still happen in sequence and people will still respond immediately to whatever happens to be in front of them," he said. "In order to make good decisions and craft good policies we need to know how people are going to respond."

Researchers at The Ohio State University are the first to publish a mathematical model of an ischemic wound – a chronic wound that heals slowly, or is in danger of never healing, because it is fed by an inadequate blood supply. Ischemic wounds are a common complication of diabetes, high blood pressure, obesity and other conditions characterized by poor vascular health.

An estimated 6.5 million people in the United States are affected by chronic wounds, and many are at risk of losing limbs or even dying as a result of the most severe of these wounds.

Modeling by mathematicians with expertise in biomedical processes has become increasingly important in the health sciences. The modeling reduces the need for guesswork and time-consuming animal testing traditionally required as researchers pursue prevention, diagnosis and treatment of complex diseases.

“Before you treat any problem successfully, you have to understand it,” said Chandan Sen, professor and vice chair for research in Ohio State’s Department of Surgery and a senior author of the study. “Now that we have this model, we can take the next step to find what factors in the equations can be fine-tuned to the point where the net result is improvement in the ischemic wound outcome.”

The modeling research appears in the online early edition of the Proceedings of the National Academy of Sciences.

The mathematical model simulates both non-ischemic wounds – those typical of healthy people with good circulation – and ischemic wounds. Its results generally match pre-clinical expectations: a normal wound will close in about 13 days, while an ischemic wound will be only 25 percent healed 20 days after it develops.
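Those two closure rates can be mimicked with a deliberately crude exponential-closure sketch. This is not the authors' model – theirs is a system of partial differential equations – and the rate constants below are simply back-solved from the two figures quoted above:

```python
import math

def healed_fraction(t_days, k):
    """Fraction of the wound closed after t days, assuming the open
    area shrinks at a rate proportional to itself: A(t) = A0 * exp(-k*t)."""
    return 1.0 - math.exp(-k * t_days)

# Rate constants chosen only to reproduce the two numbers quoted above;
# they are illustrative, not parameters from the published model.
K_NORMAL = math.log(20) / 13        # ~95% closed by day 13
K_ISCHEMIC = -math.log(0.75) / 20   # only 25% healed at day 20

print(round(healed_fraction(13, K_NORMAL), 2))    # 0.95
print(round(healed_fraction(20, K_ISCHEMIC), 2))  # 0.25
```

Even this toy version makes the gap vivid: at the ischemic rate, the wound would still be less than half healed after two months.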

The model also showed that normal wounds have higher concentrations of proteins and cells expected to be present during the healing process, while ischemic wounds lack oxygen and remain in a prolonged inflammatory phase that interferes with the subsequent cascade of events required to begin wound closure.

Sen, also executive director of the Comprehensive Wound Center at Ohio State, recently published a report about a biological pre-clinical model of an ischemic wound that his lab designed using the skin on a pig’s back. The new mathematical model, a system of partial differential equations, borrowed some data from the animal model, but also includes numerous calculations assigning values to the various cells and chemicals involved in the wound-healing process.

“Wound geometry is complicated because it is three-dimensional,” said Avner Friedman, a senior author of the paper and a Distinguished University Professor at Ohio State. “It would be infeasible to perform our computations within the framework of this geometry. However, we used some mathematical ideas to reduce the problem to a simpler geometry without giving up any of the important aspects of the process.”

It is not just the wound that is three-dimensional, the researchers noted. The complexity of this process is compounded by the fact that the wound-healing model must take into account both the total space occupied by the wound and the time required for the healing process.

Wound healing under normal conditions occurs in four overlapping stages: hemostasis, when platelets make clots to stop bleeding and release chemicals that attract cells to the wound; transient inflammation, when a variety of white blood cells go to work to kill infectious agents and generate growth factors needed for repair; proliferation, when new blood vessels form and when cells produce a bed, called the extracellular matrix, on which the repair occurs; and remodeling, which can take years, as the repaired wound site gains strength.

Sen and colleagues have spent years studying the characteristics of wounds and the intricate details of the healing process. Oxygen is known to be essential to healing, and high-pressure oxygen chambers are used to treat some wounds. But for ischemic wounds, oxygen alone isn't enough.

Scientists know that reduced blood flow to a wound site means that oxygen, important nutrients and circulating cells are not finding their way to the wound to initiate healing. Researchers hope that manipulating mathematical models of these conditions could offer guidance on how to approach this problem without the time and trial-and-error required in biological studies on animals.

“We’re not just considering what type of therapy should be used for these wounds. It is the specifics of when and how you apply it – those are the details that matter,” Sen said. “Mathematical algorithms provide more pointed data that biologists can use to develop hypotheses.”

Developing the biological model was an important start, Sen and Friedman noted. To create an animal model of an ischemic wound, researchers had to strike a careful balance so they reduced blood flow to the wound site without killing all the surrounding tissue by cutting off too much blood. Sen said the 8-millimeter-wide cylindrical puncture wounds rest on what the researchers consider an “island” of skin receiving too little blood to effectively deliver healing cells and chemicals to the wound. Details about the animal model are published in the May issue of the journal Physiological Genomics, a publication of the American Physiological Society.

In developing the mathematical model, Friedman worked with first author Chuan Xue, a postdoctoral researcher in Ohio State’s Mathematical Biosciences Institute, to assign values to variables in the first two stages of wound healing. These included oxygen concentration, concentration of growth factors, density of white blood cells that fight pathogens, density of fibroblasts that perform part of the repair, and density of tips and sprouts of tiny new blood vessels.
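As a hypothetical illustration of the kind of calculation such a model performs – not the authors' actual equations, which couple all of the variables above across two healing stages – consider a single radially symmetric, steady-state equation for oxygen, diffusing in from the wound edge while being consumed by cells. Every parameter value here is invented for illustration:

```python
import numpy as np

def steady_oxygen(supply, R=1.0, D=1.0, lam=5.0, n=50):
    """Steady-state oxygen profile c(r) in a radially symmetric wound:
    D*(c'' + c'/r) - lam*c = 0 on [0, R], with symmetry c'(0) = 0 and a
    fixed oxygen level c(R) = supply at the wound edge.  The finite-
    difference system is solved directly as a linear equation."""
    r = np.linspace(0.0, R, n)
    dr = r[1] - r[0]
    A = np.zeros((n, n))
    b = np.zeros(n)
    # r = 0: ghost-point symmetry c(-dr) = c(dr) gives Laplacian ~ 4*(c1 - c0)/dr^2
    A[0, 0] = -4.0 * D / dr**2 - lam
    A[0, 1] = 4.0 * D / dr**2
    for i in range(1, n - 1):
        A[i, i - 1] = D / dr**2 - D / (2.0 * dr * r[i])
        A[i, i] = -2.0 * D / dr**2 - lam
        A[i, i + 1] = D / dr**2 + D / (2.0 * dr * r[i])
    A[n - 1, n - 1] = 1.0   # Dirichlet condition: the edge sets the oxygen level
    b[n - 1] = supply
    return r, np.linalg.solve(A, b)

r, c_normal = steady_oxygen(supply=1.0)    # healthy perfusion at the edge
r, c_ischemic = steady_oxygen(supply=0.3)  # reduced supply on the "island"
```

The point of the sketch is only that consumption makes oxygen scarcest at the wound center, so a reduced edge supply hits the center hardest – the qualitative reason an ischemic wound stalls in inflammation.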

The two also modeled the extracellular matrix – the bed on which cells work to close the wound – in a way that allows for the matrix to change the way it functions over time. This part of the model also allowed for simulation of the exertion of pressure – a characteristic of certain types of ulcers that people with diabetes are prone to develop.

Xue noted that the equations drew on the mathematical theory of homogenization, with a single parameter – called parameter alpha – manipulated to draw the distinction between ischemic and non-ischemic wounds in the model. This is one example, Friedman noted, of simplifying the model without leaving out important biological details.