On Simulating Growth and Form

By Regina Nuzzo, PhD

For better or for worse, and on many levels, our tissues never stop growing and changing. While developing from childhood to old age, we grow not only bone, cartilage, fat, muscle and skin, but also toughened arteries, scars for our wounds, and, sometimes, deadly tumors.

As researchers in various fields create computational tools to visualize and simulate growth in all its incarnations, it’s clear there’s much to be gained. Simulations can teach us how young bodies and faces develop; how an artery compensates for decades of fatty plaque deposits by growing and thickening its walls; how tissue engineers can best coax endothelial cells to develop into organized sheets of skin for burn patients; and how cancerous tumors invade neighboring tissue. Ultimately, computational models of growth may help clinicians and surgeons plan appropriate and patient-specific treatments and interventions for a number of diseases in which growth plays a role.

“Computation is a great tool to study growth,” says Ellen Kuhl, PhD, assistant professor of mechanical engineering and bioengineering at Stanford University, “because it lets us understand all those fascinating biological processes we can’t otherwise see and predict.”

Growing Up: A Change of Face

As we progress from childhood to old age, our faces change. In some ways, these alterations are fairly predictable. As skeletal structures grow and mature, muscle and fatty tissue in our faces increases. Cartilage continues to grow even after facial bone structures have stopped (especially in men), so our noses change shape. And in later years we lose muscle tone and skin elasticity, which can dramatically alter the facial shape.

Though all faces follow this rough trajectory, certain genetic syndromes cause other distinctive patterns. Children with Williams syndrome, for instance, often have particularly full cheeks and lips with a wide jaw. As adults, on the other hand, their faces grow thinner and narrower, while their mouths grow even wider. In fact, experienced clinical geneticists often use their observations of characteristic facial gestalts to make early diagnoses. Yet these observations lack objective, quantifiable evaluation.

Peter Hammond, PhD, professor of computational biology at the University College London (UCL) Institute of Child Health, is working to change that. Hammond and his colleagues are developing computational methods to analyze and visualize variations in three-dimensional face images. The hope, he says, is that pattern recognition tools will support clinicians in their diagnoses of these rare syndromes.

In work published in the American Journal of Human Genetics in 2006, Hammond’s team investigated facial patterns of four genetic diseases (Williams, Smith-Magenis, Noonan, and 22q11 deletion syndromes), each of which has a characteristic facial gestalt. First, the researchers captured three-dimensional images of facial surfaces from 696 volunteers (roughly half of whom were affected by one of the genetic syndromes) with remote-sensing scanners that use natural light to capture the facial surface.

Then they processed the images—each containing between 4,000 and 20,000 points in three dimensions—to produce a dense correspondence of tens of thousands of points across all of the faces. Next, they analyzed the variability among all the faces using principal component analysis to find the modes of variation that best characterize all the differences. It turns out that about 100 modes could capture 99 percent of the faces’ total shape variation, Hammond says.

This reduced the complexity of the dataset: instead of requiring tens of thousands of points in three dimensions, the essential characteristics of a face can instead be described by a simple linear combination of vectors. The beauty of this, Hammond says, is that each face can now be represented by a single point lying in a high-dimensional “face-space.” Important differences between two faces can now be captured by a simple quantitative measure: the distance between two points in face-space.
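The essence of this reduction can be sketched in a few lines of Python. The data here are synthetic stand-ins for real scans, and all the sizes (50 faces, 1,000 corresponded points, five hidden modes of variation, the noise level) are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in data: 50 "faces", each a flattened cloud of
# 1,000 corresponded 3-D points (3,000 coordinates per face), built
# from a shared mean shape plus five hidden modes of variation.
n_faces, n_coords = 50, 3000
base = rng.normal(size=n_coords)
weights = rng.normal(size=(n_faces, 5))
modes_true = rng.normal(size=(5, n_coords))
faces = base + weights @ modes_true + 0.01 * rng.normal(size=(n_faces, n_coords))

# Principal component analysis via the SVD of the centered data.
mean_face = faces.mean(axis=0)
centered = faces - mean_face
U, s, Vt = np.linalg.svd(centered, full_matrices=False)

# Keep just enough modes to explain 99 percent of the shape variation.
var = s**2 / (s**2).sum()
k = int(np.searchsorted(np.cumsum(var), 0.99)) + 1

# Each face is now a single point in a k-dimensional "face-space".
coords = centered @ Vt[:k].T

# Differences between two faces reduce to a distance between two points.
d = np.linalg.norm(coords[0] - coords[1])
print(k, coords.shape, round(float(d), 2))
```

Any face can also be reconstructed (approximately) as the mean face plus its k coefficients times the modes, which is what makes morphing along a single face-space dimension possible.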

This simplification allows for more detailed facial analysis. In work published in 2003 in IEEE Transactions on Medical Imaging, Hammond’s team found that changes in a face over time can be nicely expressed as a trajectory through face-space. The methods could also capture subtle features in the faces, even those relating to gender.

Furthermore, in the 2006 study, the researchers found that a single face-space dimension could essentially capture most of the age-related facial growth. So by taking an “average” face and morphing it along this dimension, Hammond says, they were able to construct typical facial growth sequences—one for each of the syndromes as well as the controls.

Hammond’s team also used several different pattern recognition algorithms to see if they could discriminate between dense surface models for syndrome and control groups. Each model was developed from a training dataset and then used to classify new test faces. In each case, the algorithms were able to achieve at least a 76 percent success rate—and reached as high as 100 percent for recognizing adults with and without Williams syndrome.
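A minimal stand-in for this classification step, on synthetic data rather than real face-space coordinates (the study’s actual algorithms were more sophisticated), is a nearest-centroid rule scored on held-out test faces:

```python
import math
import random

random.seed(3)

# Each "face" is a point in a hypothetical 20-dimensional face-space;
# the two synthetic classes are separated by construction.
K = 20

def fake_face(center):
    return [random.gauss(center, 1.0) for _ in range(K)]

syndrome = [fake_face(1.0) for _ in range(60)]
control = [fake_face(-1.0) for _ in range(60)]

def centroid(faces):
    return [sum(f[i] for f in faces) / len(faces) for i in range(K)]

# Train on the first 40 faces of each group, test on the remaining 20.
cent_s, cent_c = centroid(syndrome[:40]), centroid(control[:40])

def classify(face):
    return ("syndrome" if math.dist(face, cent_s) < math.dist(face, cent_c)
            else "control")

hits = sum(classify(f) == "syndrome" for f in syndrome[40:])
hits += sum(classify(f) == "control" for f in control[40:])
print(f"{hits}/40 held-out faces classified correctly")
```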

It also turns out that for Williams syndrome, just looking at the areas around the nose and eyes distinguishes affected children from unaffected ones nearly as well as analyzing the entire face. Over time, however, the mouth becomes the most distinctive characteristic: In adults the pattern-recognition algorithms could distinguish affected and unaffected adults very well by focusing only on the areas around the mouth. These discoveries should help researchers develop streamlined clinical applications.

Currently, Hammond and his team are expanding their work to 30 different genetic conditions. But the ultimate goal, Hammond says, is to make these visualization and pattern recognition tools available to clinicians. Before that can happen, however, more data needs to be collected to ensure that the models are trained on a broad range of faces, Hammond says—including both children and adults, affected and unaffected, male and female, and from various races.

Growing Tough: Arteries Harden

As we age, our blood vessels slowly grow and change in response to the stresses of life. Ultimately, this can lead to hardening of the arteries—atherosclerosis—a disease in which arterial walls grow and change in response to the accumulation of waxy plaque. Eventually, drastic alterations in the arteries not only reduce blood flow but may also lead to aneurysms or blood clots from sudden plaque ruptures.

Treatments for atherosclerosis, while potentially life-saving, sometimes also backfire and induce fast, dangerous growth. Surgically-implanted stents widen the diameter of arteries and increase blood flow, but they can also trigger in-stent restenosis in which the artery narrows once again, sometimes in the span of only a few weeks.

Ellen Kuhl has been looking at how this atherosclerotic growth occurs. She uses computational models to simulate artery walls that respond to changes in mechanical loading from both plaque build-up and stent insertion.

“The golden rule is that the wall of the artery tries to keep the stress at a physiological base level,” Kuhl says. “If you increase the base load on the wall [as happens with plaque build-up or stent insertion], the wall will thicken. If you decrease the load, the wall will thin.”
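This feedback rule can be illustrated with a toy model (not Kuhl’s actual constitutive equations): a thin-walled tube whose hoop stress is pressure times radius over thickness, with the thickness slowly adapting toward a baseline stress. The numbers here are assumed for illustration only:

```python
# A minimal sketch of stress-driven wall thickening: hoop stress in a
# thin-walled tube is sigma = p * r / t, and the thickness t adapts
# until the stress returns to its physiological set point.
p, r = 13.0, 10.0          # hypothetical pressure (kPa) and radius (mm)
t = 1.0                    # initial wall thickness (mm)
sigma_0 = p * r / t        # take the starting stress as the set point
k = 0.001                  # adaptation rate (assumed)

p *= 1.5                   # raise the load, e.g. after plaque build-up
for _ in range(5000):
    sigma = p * r / t
    t += k * (sigma - sigma_0)   # thicken above set point, thin below

# The wall has thickened by half, restoring the baseline stress.
print(round(t, 3), round(p * r / t, 1))   # → 1.5 130.0
```

Lowering the load instead (p *= 0.5) drives the same loop to thin the wall, matching the second half of Kuhl’s rule.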

To study this effect, Kuhl and Ramona Maas, then a master’s student under Kuhl at the University of Kaiserslautern, first modeled three stages of atherosclerosis in an idealized artery: initial plaque build-up, adiposis (in which plaque is still soft and fatty), and calcification (in which plaque becomes hard and brittle).

In the earliest stage, dramatic tissue growth developed around soft plaque in the simulated artery. This caused a general thickening of artery walls. As the plaque calcified, however, the wall stresses became focused at the boundaries of the plaque, producing a different growth pattern: increased tissue in very small areas—spots that would be prime targets for sudden plaque ruptures. These results were published in 2006 in Biomechanics and Modeling in Mechanobiology.

Kuhl and Maas also applied their methods to the more complicated geometry of an actual aorta undergoing stent insertion. They obtained computed tomography images that captured cross-sections of a human abdomen—from beneath the heart down to the legs—in slices 10 millimeters apart. By applying the finite element method to these data, they then generated a solid model of the patient’s aorta. A uniform pressure applied at a certain spot in the artery simulated the stent’s insertion. The team then followed the simulated aorta as it healed. Each time step in their simulation corresponded to 30 minutes (approximately the amount of time required for a stent-implant surgery).

Within 200 time steps—about four days after surgery—stress-induced growth had started to appear in the tissue. The artery walls thickened dramatically, especially in a few extra-vulnerable spots. After one simulated month, the aorta walls had stopped thickening. But in a real patient, restenosis damage would have already been done. Due to forces in the body acting on the outside of the artery (forces not present in the simulation), the walls’ growth would have been forced inward, once again narrowing the artery and reducing blood flow.

Eventually, Kuhl hopes simulations like these will help in the design of better stents and will also allow clinicians to simulate the effect of various stent locations and materials on a specific patient’s anatomy before doing the surgery.

“Advances in this field have largely been driven by trial and error and have not yet been driven by patient-specific modeling,” Kuhl says. “Now we might have the means to eventually say what works best for each patient—and to say why it works.”

Next, Kuhl is working to apply these methods to a multiscale simulation of the heart. Working with Oscar Abilez, MD, a postdoctoral fellow in the Department of Surgery at Stanford University and member of the Cardiovascular Tissue Engineering Group, she is exploring the use of computational tools to model how heart tissue changes after a heart attack, and how an experimental treatment that would inject stem cells into the heart might quickly restore normal form and functioning of the heart after the attack.

Growing Together: Social Cells

Other computational researchers are focusing on the cellular components of growth—looking at what happens, for example, when growth is desirable (as in wound healing) or undesirable (as in cancer).

In a sense, cells self-organize into tissue in much the way that individuals form a society. The analogy is more than a cute anthropomorphism, however; it also makes surprising sense from a systems-level perspective. Cells form communities and exchange information with their neighbors. They multiply, and they will continue to do so until their neighbors send signals that discourage excessive behavior. Once rebuked, cells will lie quiet until changes in the neighborhood remove these inhibitions and allow growth once again.

Rod Smallwood, PhD, a professor of computational systems biology in the Department of Computer Science at the University of Sheffield, refers to this paradigm as “the social life of cells.” To study this self-organization, he is developing computational simulations in what he has dubbed the Epitheliome Project. “I wanted to explore how the machinery of individual cells could produce tissue at the next hierarchical level without a blueprint,” he says. “That is, how can the interactions of cells produce something greater than individual cells?”

To answer these questions, Smallwood turned to agent-based modeling, a method in which every cell is represented by an individual chunk of software with logical rules that abstract out the details of cells’ biochemical lives. In his simulations, these cellular agents each possess individual memory and a physical location in the simulated tissue as well as other physical properties. They can communicate with each other and with their environment—for instance by sensing hormones in extracellular space—and most importantly, make behavioral decisions based on a set of rules.
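The flavor of this approach can be conveyed with a toy agent-based model (not the Epitheliome framework itself): each cell is an agent on a small grid that divides into a free neighboring site and falls quiescent once surrounded, a crude stand-in for contact inhibition. The grid size and rules are invented for illustration:

```python
import random

random.seed(1)

W = 20                                   # grid width (assumed)
occupied = {(W // 2, W // 2)}            # start from a single cell

def free_neighbors(site):
    x, y = site
    candidates = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
    return [(nx, ny) for nx, ny in candidates
            if 0 <= nx < W and 0 <= ny < W and (nx, ny) not in occupied]

def step():
    for cell in list(occupied):          # snapshot: newcomers act next step
        gaps = free_neighbors(cell)
        if gaps:                         # room to grow: inhibition lifted
            occupied.add(random.choice(gaps))

for _ in range(200):
    step()
print(len(occupied))                     # the sheet has grown confluent

# "Wounding" the sheet removes inhibition at the edge; growth resumes.
occupied -= {(x, y) for (x, y) in occupied if x < 5}
for _ in range(50):
    step()
print(len(occupied) == W * W)            # the wound has closed
```

No agent has a blueprint of the tissue, yet a confluent, self-healing sheet emerges from purely local rules, which is the point of the paradigm.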

“This modeling paradigm is so general that we’ve also used it for modeling individual proteins, ant colonies, and the European financial market,” Smallwood says. “But the level I’m particularly interested in is that of cells.”

Smallwood and colleagues, including Dawn Walker, PhD, a post-doctoral academic fellow in the Department of Computer Science, have focused specifically on epithelial cells. Not only are these cells relatively simple and backed up by good in vitro models, Smallwood says, but they also are associated with important clinical applications, such as wound healing, skin grafts and tissue engineering.

In one study, published in 2004 in IEEE Transactions on Nanobioscience, Smallwood and Walker explored how bladder cell monolayers heal differently in different environments. Their simulation found that wounds in low-calcium environments healed about twice as fast as those surrounded by physiologically-normal levels of calcium. What’s more, the healing mechanism was different for the two cases, Walker says. Under the right conditions in the low-calcium environment, cells at the wound’s edge first quickly migrated into the bare area, while a second united front of cells then slowly inched forward into the wound, pushed by physical forces created by cells spreading and proliferating behind them. But in the physiologically-normal environment, individual cells did not usually migrate into the wound, Walker says. Healing occurred only as a function of the much slower united-front process. Experiments with wound healing in vitro confirmed these simulation results.

Walker has been working on more detailed simulations to better understand growth and healing in epithelial tissue. In work presented at the Foundations of Systems Biology in Engineering conference in Stuttgart, Germany in September of 2007, Walker and colleagues investigated how different patterns of direct cell-to-cell contact can change whether a cell is likely to undergo growth. This required a set of complicated behavior rules that incorporated mathematical modeling paradigms to capture molecular mechanisms involved in cell-to-cell contact. The results are difficult to verify experimentally, but they suggest that cell-to-cell signaling through epidermal growth factor receptors could explain how calcium can affect cell growth, resulting in the kinds of wound-healing patterns observed previously.

For their simulations, Smallwood’s group developed their own freely-available modeling framework, called Flame (www.flame.ac.uk). Their simulations, which typically include about 50,000 cells on a single processor, are run at HPCx, the largest academic computer center in the United Kingdom. Smallwood’s group is now working to extend the framework for use on supercomputers as well.

Growing Out of Control: Cancer

Cancer’s hallmark is uncontrolled growth, but particular kinds of tumors have especially complex growth patterns. For example, some malignant brain tumors appear to alternate between two single-minded pursuits: their cells may invade nearby tissue, or they may divide, but they cannot do both at the same time. The growth process for tumors like these remains largely hidden, in part because imaging still can’t capture the advance of single cells.

Tumor simulations might help researchers better understand the mechanisms behind these growth patterns as well as develop new hypotheses for further testing, says Thomas S. Deisboeck, MD, assistant professor of radiology at Massachusetts General Hospital and Harvard Medical School and principal investigator of the Center for the Development of a Virtual Tumor (CViT).

Deisboeck and his colleagues Le Zhang, PhD, and Zhihui Wang, PhD, both postdoctoral fellows at Massachusetts General Hospital and Harvard Medical School, simulate cancer growth using agent-based modeling with a hybrid approach of both continuous and discrete techniques.

Most importantly, Deisboeck says, their model spans scales from the molecular (using proteomic data, for example) to the micro- and macroscopic levels (using imaging data, for example). “The insight that we’re after is how perturbations move throughout and across the scales,” he says.

In Deisboeck’s models, each tumor cell senses cues from its neighbors and from its microenvironment, processed through a gene-protein interaction network with epidermal growth factor receptors (EGFR). Each cell uses this signaling information to choose its behavior—divide, invade nearby tissue, stay quiet, or die—at every point in time in the simulation.
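A schematic version of such a decision rule (an illustration only, not Deisboeck’s gene-protein network; the cue names and thresholds are invented) shows the key constraint: each cell commits to exactly one behavior per time step, so it may divide or migrate but never both at once:

```python
# Each simulated tumor cell reads local cues (scaled 0-1 here) and
# returns a single behavior for this time step.
def choose_behavior(nutrient, egfr_signal, crowding):
    if nutrient < 0.1:
        return "die"                 # starved
    if crowding > 0.9:
        return "quiescent"           # no room and no strong cue: stay quiet
    if egfr_signal > 0.6:
        return "migrate"             # strong signaling favors invasion
    return "divide"                  # otherwise proliferate in place

print(choose_behavior(0.8, 0.7, 0.3))   # migrate
print(choose_behavior(0.8, 0.2, 0.3))   # divide
print(choose_behavior(0.05, 0.7, 0.3))  # die
```

In a full multiscale model, the cues themselves would come from the microenvironment and from a molecular signaling network inside each cell, rather than being passed in as plain numbers.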

In work published in 2007 in the Journal of Theoretical Biology, simulations by Deisboeck showed that the numbers of cells dedicated to proliferation and to migration do not increase steadily. Rather, they tend to oscillate over time. These patterns in turn affect how quickly the entire tumor spreads through the virtual brain.

The group has also extended this model to include brain tumor cells with different characteristics, including a range of EGFR densities. The simulation results, discussed in a paper posted on arXiv.org in 2007 [http://arxiv.org/abs/q-bio/0612037], show that heterogeneity in the tumor on a microscopic level can indeed affect the tumor system’s growth patterns. The results also suggest that higher EGFR density levels will lead to some groups of cells switching from proliferation to migration behavior even earlier.

Deisboeck and his team are also using multiscale agent-based modeling to investigate another cancer where EGFR-related signaling pathways seem to play an important role: non-small cell lung cancer. Their early results, published in 2007 in Theoretical Biology and Medical Modelling, suggest that tumors will spread more aggressively when more epidermal growth factor is available to them. This work holds potential for use in drug discovery, Deisboeck says.

His group expects to soon move from “biologically-inspired lattices” to “patient-specific lattices” that will incorporate a patient’s imaging data on which to train the simulations, Deisboeck says. In time, he hopes that such cross-scale simulations of tumor expansion and related biomarker evaluations will help clinicians treat cancer patients. For instance, Deisboeck says, in the case of brain tumors, the simulations might help physicians plan for the impact of various treatments, or continuously monitor patients’ response to treatment, especially by reducing the number of costly brain scans needed.

Growing Challenges: Data, Methods, Platforms and Patients

If computational models of growth are to find a place in the physician’s tool kit, Deisboeck says, they will need to handle not only a huge volume of data but a wide spectrum of different data types, incorporating scales from the molecular to the macroscopic. “Multiscale and multiresolution modeling are one promising way to address that,” he says.

In addition, according to Kuhl, models that incorporate a hybrid of approaches will likely fare the best. “Traditional continuum models assume everything is homogeneous and continuous,” she says. “But when you look at it at the cell level, you see that’s not quite true. We are now able to build these models much better.”

Common software platforms will be another challenge for the growth modeling community, Smallwood says. He believes when researchers develop new modeling software, they have an obligation to make it freely available. “Otherwise no one else can replicate your results,” he says. “And if other researchers can’t replicate them, it isn’t true science.”

Ultimately, the holy grail of growth modeling will be applying simulations to individual patients. “Simulations of growth are finally becoming powerful enough that we can start to do patient-specific modeling,” Kuhl says. “That’s where the real pay-offs are, because we can virtually play with possible medical scenarios and find the best ways to treat a patient’s personal disease.”