Proposed research criteria for AD employ biomarkers to identify several stages of preclinical disease, and since their publication in 2011 the field has been pushing hard to test how well they work. At the Alzheimer's Association International Conference (AAIC), held July 13-18 in Boston, Massachusetts, researchers presented data that support the criteria’s usefulness and reinforce current models of biomarker progression. In many cases, different study populations revealed remarkable agreement on the prevalence of each proposed stage of preclinical AD, and on how quickly people advance to the next. At the same time, the data also confirm that substantial numbers of people present puzzling biomarker profiles, with neurodegeneration and clinical AD but no detectable amyloid deposits (see Part 2 of this series). Speakers agreed that larger studies will be needed to fully elucidate the relationship between biomarkers and prognosis, as most studies to date have been small.

Research criteria commissioned by the National Institute on Aging and the Alzheimer’s Association distinguish three advancing stages of preclinical AD (see ARF related news story on Sperling et al., 2011). Stage 1 applies to people who have only amyloid accumulation; stage 2 to people with amyloid plus a marker of neuronal injury; and stage 3 to people who in addition have signs of subtle cognitive change. These divisions reflect current models of biomarker staging (see ARF webinar; ARF related news story). More recently, Cliff Jack and David Knopman at the Mayo Clinic in Rochester, Minnesota, proposed adding two stages to the model: stage 0, in which all biomarkers are normal, and “suspected non-AD pathophysiology” (SNAP), marked by neurodegeneration in the absence of amyloid. These five stages encompass 97 percent of cognitively healthy older adults, they reported (see image below and Jack et al., 2012; Knopman et al., 2012).

Biomarkers Predict Impairment in Cognitively Normal Seniors
How well does the model work in practice? At AAIC, Stephanie Vos from Maastricht University in the Netherlands described her work testing the model in a long-running study of 311 cognitively normal elderly people seen at the Knight Alzheimer’s Disease Research Center at Washington University in St. Louis, Missouri. Participants were older than 65 and had a clinical dementia rating (CDR) of 0 at baseline, indicating normal cognition. Vos assessed amyloid pathology and neurodegeneration as measured by cerebrospinal fluid (CSF) Aβ and tau, respectively. At the beginning of the study, 42 percent of the population fit stage 0 criteria, 15 percent stage 1, 12 percent stage 2, and 4 percent stage 3. Almost a quarter of the population fell into the SNAP group, and about 5 percent fit none of the categories, Vos reported. These percentages are almost identical to those the Mayo Clinic reported in a separate population of cognitively healthy older adults using imaging measures (see Knopman et al., 2012), as Knopman noted during the discussion.

Importantly, Vos and colleagues next examined whether this initial staging helped predict who would develop cognitive problems. They found that over an average of four years, people at advanced stages were more likely to progress to symptomatic AD, defined by a CDR of 0.5 or more that clinicians believed was due to Alzheimer’s. More than half of those at stage 3 progressed, but only a quarter of the stage 2 group, and 13 percent of stage 1. This suggests that the stages have prognostic value, Vos told Alzforum. Cognition remained stable in the stage 0 cohort.

Cognition declined in only six percent of people who were classified as SNAP. The relative stability of this group in this study cohort contrasts sharply with what other researchers report in cognitively impaired populations, where people classified as SNAP often progress rapidly (see Part 2 of this series). Vos suggested that cognitively normal people with stable SNAP may represent a different group from cognitively impaired patients who are negative for amyloid and positive for neurodegeneration. She noted that in the St. Louis cohort, the few SNAP individuals who progressed to AD did have some amyloid pathology at baseline, but it fell below predetermined cutoffs.

Other researchers presented similar data supporting the idea that brain amyloid heralds future AD in people with normal cognition. William Klunk from the University of Pittsburgh, Pennsylvania, noted that although the presence of amyloid β deposits does not always correlate with low cognition at baseline, so far all studies have found a connection between brain amyloid and future cognitive decline. In a study that followed 63 cognitively normal subjects, average age of 75, for about four years, one third of those with brain amyloid developed mild cognitive impairment (MCI) due to Alzheimer's disease, a rate of seven percent per year, he reported. By comparison, of those without detectable brain amyloid, less than three percent per year became impaired. The data support the hypothesis that amyloid burden carries a strong risk of future cognitive decline, Klunk said. In contrast, age, cognitive test scores, and brain hypometabolism did not predict progression in this study.

The progression rate Klunk and Vos found for Aβ-positive participants matches that reported by other groups. For example, Christopher Rowe from Austin Hospital, Melbourne, Australia, said that in the Australian Imaging, Biomarkers, and Lifestyle (AIBL) Flagship Study of Ageing, cognitively normal seniors with brain amyloid developed MCI at a rate of about eight percent per year. This is consistent with data from longitudinal studies showing that people deposit amyloid for 10-15 years before becoming symptomatic, Rowe noted (see ARF related news story).

Klunk also looked at how quickly cognitively normal older adults go from amyloid negative to amyloid positive, i.e., from stage 0 to stage 1. He reported a rate of 10 percent per year in the 75-year-old cohort. During the discussion, Jack noted that he sees a similar figure of about 13 percent per year in the Mayo Clinic population. The AIBL study previously found that it takes about 12 years to go from a negative amyloid scan to the threshold for a positive scan, which agrees well with these figures (see ARF related news story).

Researchers at the meeting asked whether this means that all older people will eventually become amyloid positive. Klunk speculated that they might if everyone lived to be over 120, yet there seems to be large variation in when amyloid deposition starts in different individuals. Klunk added that this does not necessarily mean that everyone would develop clinical AD, since other brain changes must occur before cognition starts to fail.

Memory Impairments and Brain Atrophy Foreshadow Dementia
What about people who already have cognitive problems? People diagnosed with MCI represent a mixed group, researchers agreed. Not everyone will go on to develop Alzheimer’s dementia. Some have underlying pathologies that are not AD, and their cognition may remain stable or even revert to normal. Which biomarkers are most prognostic? Here too, amyloid burden is central but not the whole story, as brain atrophy and other markers of neurodegeneration may be even more predictive, according to research presented at AAIC.

Rowe reported that in the AIBL study, 86 percent of people with amnestic MCI and brain amyloid progressed to Alzheimer’s dementia over a period of three years. The data support the concept that MCI with a positive amyloid scan represents prodromal AD, Rowe told Alzforum. Other speakers described similar findings indicating that memory loss plus amyloid strongly predicts an Alzheimer’s dementia diagnosis within three to five years, contributing to the emerging consensus in the field.

Both clinicians and patients tend to be dissatisfied with the “three to five year” time window often invoked in research studies. Is there a way to time progression more precisely? In current staging models, amyloid accumulates for two to three decades before diagnosis, whereas markers of neurodegeneration, such as rising tau and a shrinking brain, appear only a few years before symptoms. David Wolk at the University of Pennsylvania, Philadelphia, hypothesized that these neurodegenerative markers might predict short-term progression better than the presence of amyloid does. Wolk and Brad Dickerson at Massachusetts General Hospital previously defined an “AD signature,” a pattern of cortical thinning in nine AD-vulnerable brain regions that discriminates people with mild AD from controls (see ARF related news story; Dickerson et al., 2009). Since brain volume also shrinks with age (see Bakkour et al., 2013), Wolk and Dickerson have now adjusted this signature to remove the effects of aging, Wolk said at AAIC.

At AAIC, Wolk presented data from a cohort of 156 MCI patients enrolled in the Alzheimer’s Disease Neuroimaging Initiative (ADNI). As expected, people who progressed to AD within one year had significantly worse baseline-adjusted AD signatures compared to people who progressed in three years. Both groups had more cortical thinning than people who remained stable over three years. Supporting Wolk’s hypothesis, one- and three-year converters displayed similar levels of Aβ in the CSF.