In this seven-part series, directors of neuroscience-related institutes at the National Institutes of Health examine how brain research has progressed since 2000—the decade after The Decade of the Brain. Here in part one, we hear from Dr. Nora D. Volkow of the National Institute on Drug Abuse.

In 1990, Congress designated the 1990s the “Decade of the Brain.” President George H. W. Bush proclaimed, “A new era of discovery is dawning in brain research.” During the ensuing decade, scientists greatly advanced our understanding of the brain.

The editors of Cerebrum asked the directors of seven brain-related institutes at the National Institutes of Health (NIH) to identify the biggest advances, greatest disappointments, and missed opportunities of brain research in the past decade—the decade after the “Decade of the Brain.” We also asked them what looks most promising for the coming decade, the 2010s. Our experts focused on research that might change how doctors diagnose and treat human brain disorders.

We hear from Nora D. Volkow, director of the National Institute on Drug Abuse; Thomas R. Insel, director of the National Institute of Mental Health; Story Landis, director of the National Institute of Neurological Disorders and Stroke; Kenneth R. Warren, acting director of the National Institute on Alcohol Abuse and Alcoholism; Paul A. Sieving, director of the National Eye Institute; James F. Battey, director of the National Institute on Deafness and Other Communication Disorders; and Richard J. Hodes, director of the National Institute on Aging.

Challenges and Opportunities in Drug Addiction Research

By Nora D. Volkow, M.D., National Institute on Drug Abuse

Neuroscience is at a historic turning point. Today, a full decade after the “Decade of the Brain,” a continuous stream of advances is shattering long-held notions about how the human brain works and what happens when it doesn’t. These advances are also reshaping the landscapes of other fields, from psychology to economics, education and the law.

Until the Decade of the Brain, scientists believed that, once development was over, the adult brain underwent very few changes. This perception contributed to polarizing perspectives on whether genetics or environment determines a person’s temperament and personality, aptitudes, and vulnerability to mental disorders. But during the past two decades, neuroscientists have steadily built the case that the human brain, even when fully mature, is far more plastic—changing and malleable—than we originally thought.1 It turns out that the brain (at all ages) is highly responsive to environmental stimuli and that connections between neurons are dynamic and can rapidly change within minutes of stimulation.

Neuroplasticity is modulated in part by genetic factors and in part by dynamic, epigenetic changes that influence the expression of genes without changing the DNA sequence. Epigenetic processes are of particular clinical interest because their external triggers (such as early parental care, diet, drug abuse and stress) can affect a person’s vulnerability to many diseases, including psychiatric disorders. In addition, in contrast to genetic sequence differences, epigenetic alterations are potentially reversible, and thus amenable to public health policy interventions.

It also has become increasingly clear that the human brain is particularly sensitive to social stimuli, which likely has accelerated the rate of human brain evolution. Humans have evolved complex neuronal circuitry across large areas of the brain to process complex social information (such as predicting others’ reactions and emotions) and to respond appropriately. New research has revealed that social stimuli (such as parenting style and early-life stress) can epigenetically modify the expression of genes that influence brain morphology and function, including an individual’s sensitivity to stressful stimuli.2 In the future, this knowledge will enable us to tailor personalized prevention interventions based on how genetics and epigenetics affect brain function and behavior. For example, a recent study showed that a prevention intervention based on improving parenting style reduced the risk for substance use disorders only in adolescents who carried a particular variant of a gene that recycles the chemical serotonin back into neurons—a variant that results in greater sensitivity to social adversity.3

In the coming decade, insights about what underlies neuroplasticity, combined with technological advances that allow us to “see” with greater precision the human brain in action, are bound to revolutionize the way we view learning and the methods we use to educate young people. New research will also show us how to help people overcome or compensate for many of the deficits associated with drug abuse, addiction and other mental disorders.4

For example, scientists are using imaging technologies in neurofeedback programs that train people to voluntarily recalibrate their neural activity in specific areas of the brain, allowing them to gain unprecedented control over, for example, pain perception5 or emotional processing.6 During drug addiction treatment, this approach could greatly reduce the risk of relapse by enabling a patient to control the powerful cravings triggered by a host of cues (e.g., people, things, places) that have become tightly linked, in the brain of the user, to the drug experience.

Other promising advances stem from ongoing research and development of direct communication pathways between the brain and external computer devices, so-called brain-computer interfaces (BCIs). In a recent study, one version of a BCI appeared to help paralyzed stroke victims regain some movement control.7 In the next decade, forms of BCI might help people with a variety of neuropsychiatric conditions that have proved resistant to traditional treatments. For example, early evidence suggests that BCI training could benefit patients with epilepsy or attention-deficit/hyperactivity disorder (ADHD) that is unresponsive to drugs.8

As we build on these rapid advances in neuroscience research, we must keep a watchful eye on their vast social and political implications. For example, neurologists have started to uncover the molecular components and neural circuitry that underlie the learning process.9 We also are learning how to use transcranial magnetic stimulation (TMS), a noninvasive method to modulate the activity within a neural circuit, more effectively.10 Should we use this knowledge to better educate young people and teach new skills to seniors, or should we use these tools only to treat people with neuropsychiatric disorders? As we begin to understand how parenting styles affect the development and function of the brain, how far should we go to protect children from the long-term and deleterious effects of bad parenting?

Recent progress in brain research and associated fields has been impressive, and we are sure to witness further acceleration in the pace of neuroscientific discovery in the next couple of decades. Indeed, we are entering a new era in which our technologies are beginning to affect our lives in profound ways. We are bound to recast our relationship with our brains and, in the process, to redraw the boundaries of human evolution.

Understanding Mental Disorders as Circuit Disorders

Thomas R. Insel, M.D., National Institute of Mental Health

When the Decade of the Brain began in 1990, scientists had developed both drug and behavioral treatments for most mental disorders, but their understanding of these disorders was primitive. Two decades later, neuroscientists are finally uncovering the brain processes involved in mental disorders. There is great promise for development of more effective treatments in the upcoming decade.

In 1990, most theories of the causes of mental disorders were based on investigations of treatments, rather than on scientific insight about how diseases arise. By 2000, we had developed more treatments—including best-selling second-generation antipsychotics and antidepressants—but we were no further along in our understanding of the causes. During the so-called Decade of the Brain, there was neither a marked increase in the rate of recovery from mental illness, nor a detectable decrease in suicide or homelessness—each of which is associated with a failure to recover from mental illness. To reduce the occurrence and death toll of mental disorders, we will need a more thorough understanding of why these mysterious illnesses occur.

People frequently cite the 1990s as the era for redefining mental disorders as brain disorders. While this conceptual shift was important, we now realize the greater importance of developing new tools: imaging techniques for quantitative studies of brain structure, function and chemistry, as well as other comprehensive tools for mapping DNA and RNA. What do we mean by comprehensive? Rather than focusing on four or five neurotransmitters, researchers at the turn of the 21st century were able to investigate thousands of genes to yield an unbiased survey of the biology of mental disorders. These advances ushered in a decade of discovery that brings us to 2010.

If scientists introduced mental disorders as brain disorders in the Decade of the Brain, researchers in the past ten years have demonstrated the importance of specific brain circuits. Unlike neurological disorders, which often involve areas of tissue damage or cell loss, mental disorders have begun to appear more like circuit disorders, with abnormal conduction between brain areas rather than loss of cells.

Neuroimaging technology has revealed that specific brain pathways, mostly located in the prefrontal cortex, are involved in major mental disorders. Deep brain stimulation, a procedure in which neurologists manipulate certain pathways via electric current, has shown promise as a treatment for depression and obsessive-compulsive disorder, on the heels of its successful use as a treatment for neurological motor disorders such as Parkinson’s. In the past couple of years, via a new technology called optogenetics, neuroscientists have used light to manipulate circuits in experimental animals with millisecond precision and cellular resolution. Thus, for the first time, researchers can conduct specific tests of theories about brain circuits and behavior.

What causes a circuit disorder? Although this will be a major question for the next decade, we already have some intriguing ideas. Mental disorders such as schizophrenia and mood and anxiety disorders are mostly diseases of early life; their onset tends to occur during adolescence or early adulthood, when the brain is still developing. For example, a person with schizophrenia usually experiences a psychotic break in early adulthood, which is a time when the number of cortical synapses is being pruned. The disorder might result from the excessive loss of synapses in a critical cortical pathway when the normal process overshoots.

Since 2005, scientists studying our genes, the proteins they produce and their functions have started to identify some of the key factors that increase the risk of mental disorders, from autism to schizophrenia. The candidates include a long list of previously unknown proteins that have one thing in common: They are important for healthy brain development. Indeed, if the Decade of the Brain redefined mental disorders as brain disorders, recent research suggests that mental disorders are really developmental brain disorders, caused by disruptions in the circuitry map of the developing brain.

During this next decade, expect to see the full roster of candidates as scientists begin to describe the key variations in sequences of genes that produce altered proteins and dysfunctional circuitry. Neuroscientists already have powerful tools to move from the study of molecules to circuits and, ultimately, to behavior. How will we translate this emerging knowledge into better treatments? The answer for psychiatry will likely be the same as the answer in the rest of medicine: Basic discoveries regarding genes and proteins will point the way to molecular and cellular mechanisms, which in turn will yield new targets for treatment and prevention.

In some ways, psychiatry has been the victim of its early success, as medications found by accident in the 1960s delayed the search for fundamental mechanisms of disease that could yield new targets and new treatments. After two decades of progress, clinical neuroscientists are finally beginning to understand what underlies a few mental disorders. In the upcoming decade, which we can perhaps call the Decade of Translation, we can look forward to seeing this new understanding translate to improved treatments that will finally reduce the occurrence and death rates of these disabling illnesses.

Basic Science and Gene Findings Drive Research

By Story Landis, Ph.D., National Institute of Neurological Disorders and Stroke

Remarkable advances during the Decade of the Brain set the stage for the decade that just ended, and recent findings make us optimistic that progress will accelerate.

Basic neuroscience research in the 1990s was an important part of this momentum. The following is just a sample of the many important findings during the Decade of the Brain: In 1991, researchers discovered the molecular basis of olfaction—our sense of smell—which made the olfactory system as attractive as the visual system for exploring neural development and sensory processing. The identification of molecules in multiple systems that guide axons—fibers involved in communication between neurons—led to a new understanding of how connections form during development. Finally, molecular and biochemical studies showed that synapses—the junctions at which nerves communicate—are complex molecular machines rather than simple structures, as earlier images from electron microscopes suggested.

Although neuroscientists believe that advances in basic science ultimately will improve our understanding of neurological disease and thereby will guide treatments, genetics has had the most immediate impact. The discovery in 1991 that Kennedy’s disease, a motor neuron disorder, is caused by a specific gene mutation was the first of a stream of significant findings, including gene mutations involved in Huntington’s disease, amyotrophic lateral sclerosis (ALS, or Lou Gehrig’s disease) and Rett syndrome. In addition to identifying mutations responsible for these undeniably familial disorders, investigators discovered a mutation in the alpha-synuclein gene in a large European family with Parkinson’s disease. This finding was particularly noteworthy since the consensus in the field had been that Parkinson’s had environmental causes. Many Parkinson’s researchers have since expanded their search to include genetics, which accounts for the disease in some patients and may contribute to it in others.

In the past decade, the list of single gene defects that contribute to neurological disorders grew at an extraordinary pace, leading to almost an embarrassment of riches. For example, at least 15 identified genes cause spinocerebellar ataxias, and almost as many additional genes are suspected culprits. Classification of ataxias by genetic profile has replaced the clinical classification based on time of onset, rate of progression and subtleties of the clinical exam. In addition, researchers have identified at least six additional Parkinson’s-related genes; together, these genes appear to underlie the disease in up to 35 percent of patients with Parkinson’s disease.

Gene identification has immediately benefited patients. For many of the rare neurological disorders, patients and their families might have spent many years and thousands of dollars in their search for a diagnosis. For some diseases, an inexpensive genetic test can now bring that odyssey to a rapid and conclusive end.

We hope for more than the ability to diagnose, however. Implicit in the discovery of a causative gene is the belief that this knowledge will quickly lead to a better understanding of disease processes, and this in turn will yield better treatments. But to date this translation has proved to be much more difficult than we had imagined. For example, in 1987, researchers discovered that mutations in the dystrophin gene cause Duchenne muscular dystrophy, a disorder that results in the death of affected boys, usually before age 20. Despite two decades of research and the availability of both mouse and dog models, the only treatment currently in use is corticosteroids.

Similarly, scientists identified the genetic defect in Huntington’s disease in 1993; today, no treatments slow the disease’s progression, and those that address symptoms in the middle to late stages are not particularly effective. Scientists still debate whether aggregates formed in the brain from the Huntington’s gene’s mutant protein are toxic or protective. We have, however, become much more sophisticated about defining and testing targets for therapeutics development, and we now have at our disposal exciting new technologies, such as small interfering RNA (siRNA), to turn our knowledge of gene mutations into treatments.

Other basic science developments offer significant promise for the current decade. One such advance is human pluripotent stem cell technology. Using recipes for specific classes of neural cells, scientists can use human embryonic stem cells to generate thousands of dopamine neurons, motor neurons or myelin-producing cells suitable for studying and potentially treating Parkinson’s disease, ALS or multiple sclerosis, respectively. In 2006 scientists learned how to turn back the developmental clock of mouse skin cells to make them into induced pluripotent stem (iPS) cells that closely resemble embryonic stem cells. A year later researchers extended this remarkable technology to human skin cells. In proof-of-principle studies, investigators have turned skin cells from patients with ALS or Parkinson’s disease into induced pluripotent stem cells. Then they have differentiated the cells into motor neurons or dopamine neurons. Scientists are already using such cells in the search for disease-modifying drugs. Many investigators believe that well down the road, human embryonic stem and iPS cells may be useful in replacing nervous system cells lost or damaged by neurological disorders.

Tackling the Mysteries of Alcohol Dependence

By Kenneth R. Warren, Ph.D., National Institute on Alcohol Abuse and Alcoholism

Why does drinking alcohol have such profound effects on people’s behavior? Why does alcohol dependence develop and persist in some people but not in others? Scientists attempt to answer these questions by studying the brain, where alcohol intoxication and dependence begin. During the past decade, advances in technology have helped us better understand how alcohol changes the brain and how those changes influence alcohol-related behaviors. In the coming decade, this knowledge will help researchers develop drug and other interventions that can reduce the high social, personal and economic costs of alcohol-related problems.

Research supported by NIAAA in the early 1990s demonstrated that people who abuse alcohol for a long time experience lasting changes within the brain’s limbic system, which supports emotion and motivation. These changes, which we call neuroadaptation, involve multiple neurotransmitters and other brain chemicals. Neuroadaptation can result in heightened anxiety and distress during abstinence; the drinker can alleviate this discomfort for a short time by drinking more. This may help explain why people with alcohol dependence steadily increase the amount they drink.

As a person’s dependence on alcohol grows, the affected neurotransmitter systems change from those that are involved in the brain’s reward system to those that cause negative effects such as anxiety, sweating and tremors. It appears that people with alcohol dependence continue to drink despite recurring health and social problems because of a vicious cycle: They are drinking in an attempt to avoid the unpleasant effects of drinking. In the future, alcohol scientists hope to use their understanding of how neuroadaptation occurs to develop targeted medications for treating alcohol dependence.

Researchers have identified stress as a probable trigger for relapse into alcohol dependence. Alcohol neuroscientists have identified several brain-cell receptors that influence resilience to stress and may be involved in susceptibility to alcohol dependence. For example, researchers found that mice lacking a receptor that mediates stress responses voluntarily drank much less alcohol and were more sensitive to its sedative effects than normal mice. In one study, people who had recently gone through alcohol detoxification took a drug that targets this same receptor. They reported fewer alcohol cravings and improved overall well-being.1 This finding might lead to new treatments for some types of alcohol dependence; developing such treatments is a central part of the NIAAA’s mission.

Scientists also are seeking ways to combat underage drinking, a major public health challenge worldwide. For example, researchers conducting studies on animals have found that adolescents are less sensitive than adults to the negative effects of intoxication, including sleepiness, hangover and impaired coordination. That means it takes more alcohol for teens to begin to experience the negative effects that adults recognize as signs that they have had too much to drink. On the other hand, researchers conducting studies on humans have found that adolescents are more sensitive than adults to alcohol’s impairment of memory and social inhibition.2 These findings suggest that adolescents are particularly prone to alcohol-related consequences, such as teenage drinking-and-driving accidents and lasting cognitive deficits.3 In addition, the earlier drinking begins in adolescence, the greater the risk of alcohol use disorders in adulthood.4 Our next challenge, therefore, is to learn how drinking may interfere with normal adolescent brain development at the cellular and molecular level, as well as how this interference may lead to cognitive impairment and alcohol use disorders. Then we can investigate interventions that will protect people of all ages.

During the Decade of the Brain, scientists developed imaging and electrical recording techniques that allow today’s researchers to study how alcohol affects different brain systems and structures. We can also see, in real time, how both the motivation to drink and alcohol itself change the human brain. For example, using functional magnetic resonance imaging (fMRI), scientists can track how the desire to use alcohol changes specific brain regions. Scientists using magnetic resonance spectroscopy (MRS) can monitor chemical and metabolic changes that may cause alcohol’s short-term pleasurable effects (intoxication) and long-term detrimental effects (dependence).

Furthermore, about half of a person’s risk of developing alcoholism is based on his or her genetic makeup,5 and real-time recording techniques also are helping scientists to identify genetic risk factors. For instance, using event-related potentials (ERPs), researchers have identified unusual brainwaves that appear in the brains of children of alcoholics before they have taken their first drink.6 Researchers also have found that certain genetic markers linked to alcohol dependence are associated with psychiatric disorders such as antisocial personality disorder and attention-deficit/hyperactivity disorder, which suggests that these illnesses share genetic underpinnings.7 To investigate the interface of genetics and neuroimaging, the NIAAA has promoted imaging research that may clarify how genes associated with alcohol dependence affect the brain. This new field of imaging genetics offers a powerful research tool to help us understand the genes that underlie alcohol-related disorders.

Understanding the effect of alcohol on the brain through discoveries in neuroscience is integral to understanding why people get into trouble from alcohol use and figuring out how to prevent and reduce alcohol-related problems. During the next decade, animal and human studies using increasingly sophisticated technology will provide information that may help bring us closer to these important goals.

Using Breakthroughs in Visual Neuroscience to Treat Diseases

By Paul A. Sieving, M.D., Ph.D., National Eye Institute

Advances in visual neuroscience during the past 10 years are generating a lot of excitement. The ability to simultaneously record the activity of different clusters of neurons in the eye has greatly improved our understanding of how our neural circuits process and integrate visual signals. For example, recording the impulses from clusters of retinal ganglion cells, which transmit visual input from the eye to the brain, allows researchers to fully characterize the information presented to the visual parts of the brain.

The next research front will involve investigating how neurons interconnect into circuits that control visually guided behavior, such as when we alter our path to avoid an obstacle we see. In the next decade, three recent technical advances will help us learn more about this neural circuitry. First, scientists can now see complex, interconnected brain structures using the Brainbow technique, which employs genetically coded fluorescent proteins that can mark hundreds of neurons with unique colors. Second, two-photon imaging technology can display the dynamic interactions between neurons in real time. Finally, scientists can implant into the brain a grid containing 100 electrodes that deliver signals to a computer, which measures the activity of individual neurons within a larger group.

Scientists can use information about the activity of individual neurons therapeutically. Because the organization of the primate visual system closely resembles that of humans, what we learn through ethical studies of nonhuman primates brings us closer to human medical applications. Studies using an array of electrodes implanted in the brain show that monkeys can use their visual system to control an artificial limb remotely, by mental control alone.1 If this ability holds true in humans, it could dramatically improve sensory substitution treatments used for a range of human injuries—for example, better devices for people who have lost limbs due to war or disease. In addition, learning about faulty nerve circuits in the visual system will provide insight into other types of circuit disorders, such as chronic pain and epilepsy.

We have tremendous opportunities to translate what we have learned about visual circuits in the past decade into treatments for neurodegenerative diseases affecting vision, such as retinitis pigmentosa and macular degeneration. These diseases target photoreceptor cells in the retina, which normally convert light into electrochemical signals that are transmitted through the retina and optic nerve to the brain, where they become an image. The degeneration or death of photoreceptor cells causes loss of vision and blindness, but the other parts of the transmission pathway—the second-stage retinal neurons—remain intact. Researchers are testing several methods to activate these retinal cells by bypassing the nonfunctional photoreceptor cells.

For example, through funding from the National Eye Institute and the U.S. Department of Energy, scientists have developed an artificial retina chip—an electrode array that receives signals from a camera and converts them into electrical impulses. When the array is implanted into the eye, it stimulates the remaining retinal circuits and transmits the impulses to the brain, enabling it to visualize what the camera sees. An alternative strategy involves a microchip with tiny solar cells that convert light energy into electrochemical impulses. And optogenetics, which draws on advances in nanotechnology, uses pulses of light to specifically activate genetically engineered ion channels in retinal cells to initiate the visual pathway.

Researchers are also investigating how to restore vision via cell-based therapies such as stem cell technology and gene therapy. For example, scientists can now induce stem cells to develop into retinal cells, and these cells function correctly when transplanted into an animal with retinal degeneration. The most promising results come from recent clinical trials using gene therapy to treat people with Leber’s congenital amaurosis. People with this condition lack an enzyme required for vitamin A metabolism, and the resultant degeneration of photoreceptor cells causes vision loss. When researchers delivered the missing enzyme to the remaining intact photoreceptors, the patients’ visual sensitivity increased. This was one of the first examples of safe and effective gene therapy.

These advances are beginning to help people with limited vision see better. They are also shedding light on treatments for other neurological disorders. New technologies and scientific breakthroughs afford significant opportunities to expand our knowledge about the visual system and develop applications with therapeutic potential.

Advances in Genetics and Devices Are Helping People with Communication Disorders

By James F. Battey Jr., M.D., Ph.D., National Institute on Deafness and Other Communication Disorders

During the past decade, scientists have made astonishing advances in the NIDCD’s mission areas of hearing, balance, smell, taste, voice, speech and language. Numerous discoveries have expanded our knowledge base amid one of the most exciting periods in the history of communication research.

Genetics ranks high on the list of areas in which we’ve made significant progress. Before the Decade of the Brain, we knew that deafness could be inherited, but we knew little about the genes involved. Twenty years later, we’ve identified hundreds of genetic mutations linked to inherited hearing loss, with more than 80 genes mapped just in the past 10 years. Further study has shown us the functions of many of the proteins that these genes encode and has revealed molecular pathways essential for normal hearing.

Similar explorations in speech and language have turned up genetic mutations that are responsible for delayed language development in young children and that also play a supporting role in dyslexia and some cases of autism. This kind of discovery, which reveals common neural pathways in speech, reading and language development, could be the key to freeing thousands of children now locked inside their own worlds.

Another exciting advance, the result of a collaboration across the National Institutes of Health and internationally, is the recent identification of the first genetic mutations responsible for stuttering—a discovery that places this speech disorder squarely in the medical world. Researchers are currently working with animal models to understand how these genes influence the neural circuits that control expressive language.

Combating hearing loss by regenerating hair cells, small sensory cells in the inner ear, also is showing promise. Our ability to hear relies on these hair cells, and defects in them or damage to them cause hearing loss. Although fish, amphibians, and birds are able to grow new hair cells, humans and other mammals can’t. Scientists are trying to understand the molecules and genes involved in hair cell regeneration in animals, with hopes of learning how to mimic the process in humans. Research in hair cell regeneration could one day offer a powerful treatment option, if not a cure, for hearing loss.

Beyond genetic discoveries, we continue to focus on the development of devices that bring sound into the worlds of people who are profoundly deaf or hard of hearing. The cochlear implant, one of the most groundbreaking biomedical achievements of the past 30 years, uses direct electrical stimulation of the auditory nerve via implanted electrodes to bypass inner ear damage and provide a sense of sound. Although cochlear implants have helped close to 200,000 people worldwide, most still have problems clearly hearing conversations in noisy environments. Scientists are currently looking at how to better localize sound by using advanced signal processing techniques and improved electrode design.

Hearing aid users have similar problems in noisy environments. An ingenious solution has emerged from the study of the ears of a parasitic fly, Ormia ochracea, which is extraordinarily successful at localizing sound. Using the lessons from this research, scientists are developing a miniature directional microphone that can zero in on a single voice and make communication in noisy places more effective.
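
The core trick behind a directional microphone can be illustrated with a toy delay-and-sum beamformer: delaying and summing the signals from two closely spaced sensors reinforces sound arriving from one direction and attenuates the rest. The sketch below is a minimal illustration of that principle only; the sample values are invented and none of it is drawn from the fly research itself.

```python
def delay_and_sum(a, b, shift):
    """Average a[i] with b[i + shift], treating out-of-range samples as 0.

    Choosing `shift` to undo the arrival-time difference between the two
    sensors "steers" the pair toward the sound source.
    """
    out = []
    for i in range(len(a)):
        j = i + shift
        bj = b[j] if 0 <= j < len(b) else 0.0
        out.append(0.5 * (a[i] + bj))
    return out

# A target sound reaches the left sensor 2 samples before the right one.
target = [0, 1, 2, 1, 0, -1, -2, -1, 0, 0]
left = target
right = [0, 0] + target[:-2]          # same waveform, delayed 2 samples

steered = delay_and_sum(left, right, 2)    # aligned: peak is preserved
unsteered = delay_and_sum(left, right, 0)  # misaligned: peak is smeared
```

When the shift matches the true delay, the two copies add coherently and the peak survives at full height; any other steering direction partially cancels, which is what lets the device favor a single voice.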

As this new decade begins, we’re applying the technology of cochlear implants to the development of other potential neural prostheses for hearing, balance and speech. These include auditory brainstem implants, which reconnect the ear to the brain in people whose auditory nerves have been surgically removed; vestibular implants to normalize balance by electrically stimulating the vestibular nerve; and brain-computer interfaces to help patients with locked-in syndrome translate thought into synthesized speech.

In smell and taste research, we’ll focus less on the nose and the tongue and more on the brain, tackling questions about how the brain interprets sensory data and mapping the functional organization of the neural circuits that mediate these senses. We are just beginning to understand the complicated neural networks that turn objects and words into speech. Newer imaging techniques, such as voxel-based morphometry, will allow us to localize brain function at a much finer spatial resolution than fMRI and will become a powerful tool for researchers to see which areas of the brain are active during speech and word retrieval.

I am certain that we will end this new decade with a far better understanding of how language and speech are processed in the brain. We’ll also have more sensitive, individually tailored and effective technologies for people with hearing loss. Finally, our continued studies in genetics, and the rapid accumulation of knowledge about genes and their functions, mean that the era of precise genotype-based diagnosis may be at hand for many of the communication disorders we study.

It Takes a Village: Large-Scale Studies Prove Vital to Alzheimer’s Disease Research

By Richard J. Hodes, M.D., National Institute on Aging

During the next 25 years, the number of Americans age 65 and older is expected to double to about 72 million. Many people thrive as they age, but others experience cognitive decline wrought by Alzheimer’s disease and other dementias. Today, as many as 5.1 million Americans may have Alzheimer’s disease, the most common form of dementia. Unless we can cure or prevent it, Alzheimer’s prevalence may triple by 2050.

These dire projections lend urgency to research into this devastating disease. In the past decade, researchers and clinicians working across diverse disciplines have made important discoveries about the molecular changes that take place in the brains of people with Alzheimer’s disease, identified genetic risk factors, and pointed to lifestyle and environmental factors—such as diet and exercise—that may contribute to the onset and progression of the disorder.

Along with these advances, however, came a humbling appreciation of the complexity of Alzheimer’s. More than a century after Alois Alzheimer first described abnormal deposits of beta-amyloid and tau proteins in the brain of a woman with dementia, researchers are still asking whether these hallmark plaques and tangles are the causes or the results of the disease process. While it is difficult to predict when we will have the answers, we may gain great insight from large-scale, collaborative studies.

Researchers in government, academia and private industry are joining forces to discover the genetic and environmental risk factors involved in Alzheimer’s. One such success story is the Alzheimer’s Disease Neuroimaging Initiative (ADNI), a partnership launched in 2004 and primarily supported by the National Institute on Aging (NIA), joined by other NIH institutes and private partners. ADNI scientists are developing imaging and biomarker profiles of the changes that signal the onset of Alzheimer’s, sometimes long before symptoms appear.

These new biomarker tools—from brain scans to blood and cerebrospinal fluid tests—will enable us to detect and follow the progression of Alzheimer’s during clinical trials. And the scientists’ efforts are beginning to show results. In spring 2009, ADNI reported that certain cerebrospinal fluid biomarkers may help us both predict who is at risk of developing the disease and learn how the disease responds to various therapies. These data, involving hundreds of volunteers, are available to qualified researchers worldwide, thus further driving the collaborative nature of the research and strengthening our chances of finding answers quickly. Ultimately, we hope that these technologies will prove useful in everyday clinical practice so that we can implement therapies or preventative measures as soon as possible.

We anticipate that a similar collaborative approach will tell us more about Alzheimer’s risk factor genes. Researchers have shown that three genes cause the rare, early-onset form of Alzheimer’s that occurs in some families. However, of the roughly 30,000 other genes in our DNA, only one has so far been linked to increased risk for the more widespread, late-onset form that commonly occurs after age 65. Scientists are eager to identify additional risk factor genes.

Genome-wide association studies, which use methods that can rapidly test up to a million variable sites across a person’s genome, will help scientists find those elusive genetic variations. Since 2007, several international research groups conducting association studies have identified variants of the SORL1, CLU, PICALM and CR1 genes that may play a role in the risk of late-onset Alzheimer’s.
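
At the heart of such a study is a simple per-variant test, repeated at each site: do people with the disease carry one allele more often than people without it? A minimal sketch of that comparison, using a chi-square test on a 2x2 table of allele counts, is below. The counts are invented for illustration and do not correspond to any of the genes named above.

```python
def chi_square_2x2(table):
    """Chi-square statistic for a 2x2 contingency table ((a, b), (c, d)).

    A large statistic means the allele frequencies in the two rows
    (cases vs. controls) differ more than chance alone would explain.
    """
    (a, b), (c, d) = table
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    stat = 0.0
    for obs, (row, col) in zip(
        (a, b, c, d),
        ((row1, col1), (row1, col2), (row2, col1), (row2, col2)),
    ):
        expected = row * col / n
        stat += (obs - expected) ** 2 / expected
    return stat

# Hypothetical counts at one variant:
# rows = case / control chromosomes, cols = risk allele / other allele
cases = (620, 380)
controls = (500, 500)
stat = chi_square_2x2((cases, controls))
```

In a real genome-wide study this test (or a regression-based equivalent) is run at hundreds of thousands of sites, so the threshold for calling a result significant must be set far more stringently than for a single test.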

To build the large bank of DNA samples needed for future association studies, the NIA supports the Alzheimer’s Disease Genetics Consortium, which collects and analyzes biological samples from tens of thousands of people with and without the disease. The consortium freely shares its data and analyses with others in the research community to help spur advances in our understanding of the genetic mechanisms at work and to help scientists identify new pathways to prevention or treatment.

The past decade is also marked by advances in translational research—applying knowledge gained in the laboratory as quickly as possible to new tests or therapies in a clinical setting. The NIA currently supports 60 grants aimed at identifying and developing effective therapies for the treatment of Alzheimer’s. The work is varied, from finding new compounds that will modify beta-amyloid production or clear it from the brain to reformulating existing drugs and naturally occurring compounds used to treat other diseases. These studies allow the NIA to capture new and creative therapeutic approaches and to “seed” promising drug discovery and preclinical development programs.

The success of these and many other efforts relies on another vital partner in Alzheimer’s research: the many volunteer research participants, including patients in clinical trials. Both our recent progress and our growing confidence in future advances rely heavily on this generosity of spirit. Collaboration is key to translating discoveries into safe and effective therapies that will benefit us all.

About Cerebrum

Bill Glovin, Editor
Carolyn Asbury, Ph.D., Consultant

Scientific Advisory Board
Joseph T. Coyle, M.D., Harvard Medical School
Kay Redfield Jamison, Ph.D., The Johns Hopkins University School of Medicine
Pierre J. Magistretti, M.D., Ph.D., University of Lausanne Medical School and Hospital
Robert Malenka, M.D., Ph.D., Stanford University School of Medicine
Bruce S. McEwen, Ph.D., The Rockefeller University
Donald Price, M.D., The Johns Hopkins University School of Medicine