CONFERENCE COVERAGE SERIES
Alzheimer's Association International Conference 2010
Honolulu, Hawaii, U.S.A.
10 – 15 July 2010
Regulatory approval could be on the horizon for an amyloid tracer that is widely considered the frontrunner among several 18F-labeled compounds being developed for brain imaging using positron emission tomography (PET). Even the skies of Honolulu, host of the Alzheimer’s Association’s International Conference on Alzheimer’s Disease (ICAD) from 10-15 July 2010, could not have been sunnier than the Phase 3 histopathology data reported Sunday by Christopher Clark of Avid Radiopharmaceuticals, Philadelphia, Pennsylvania. He said he and colleagues found near-perfect correlation between PET imaging using the new tracer and amyloid load measured postmortem in the same patients. Given these encouraging results, the company plans to submit an application by late summer to the U.S. Food and Drug Administration (FDA), which could, so the company hopes, approve the compound as soon as six months thereafter under expedited review, Clark told ARF. A validated 18F PET tracer would expand the commercial availability of amyloid imaging, which has thus far been restricted largely to some 60 or so research centers worldwide.
In this study, Avid tested florbetapir (formerly 18F AV-45) in 35 people who were expected to die within six months. The idea was that imaging people near the end of their lives would minimize the time interval between PET scan and histopathological evaluation, allowing researchers to more effectively compare the two measures of brain amyloid load. To test their PET reagent’s specificity, the team also imaged 47 people who were highly unlikely to have detectable brain amyloid—namely, young, cognitively normal subjects without an ApoE4 allele. On average, less than three months elapsed between PET imaging and death, and about 11 hours between death and autopsy, Clark said.
Of the 19 subjects who met NIA-Reagan criteria for AD pathology, all but one were amyloid-positive on PET as judged by visual reads, and all 19 came out positive on standardized uptake value ratio (SUVR) quantification of the PET data. For both PET analysis methods, all 16 who lacked postmortem AD pathology were also amyloid-negative by live brain imaging, giving the tracer 100 percent specificity in this small sample.
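For readers who want the arithmetic behind those figures, the counts above translate into conventional sensitivity and specificity values as follows. This is a back-of-the-envelope sketch in Python, not Avid’s statistical analysis; the function and variable names are our own.

```python
# Back-of-the-envelope sensitivity/specificity from the counts reported above.
# Illustrative only; not Avid's analysis code.

def sensitivity(true_positives, total_with_pathology):
    """Fraction of autopsy-confirmed amyloid-positive cases called positive on PET."""
    return true_positives / total_with_pathology

def specificity(true_negatives, total_without_pathology):
    """Fraction of pathology-negative cases called negative on PET."""
    return true_negatives / total_without_pathology

# Visual reads: 18 of 19 pathology-confirmed cases were read as amyloid-positive.
print(f"Visual-read sensitivity: {sensitivity(18, 19):.1%}")  # ~94.7%
# SUVR quantification: all 19 pathology-confirmed cases came out positive.
print(f"SUVR sensitivity:        {sensitivity(19, 19):.1%}")  # 100.0%
# Both methods: all 16 pathology-negative cases were read as amyloid-negative.
print(f"Specificity:             {specificity(16, 16):.1%}")  # 100.0%
```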
The data drew praise from the neuroimaging community, which might have expected as much given that florbetapir already looked promising in Avid’s analysis of six autopsy cases presented earlier this spring at the Human Amyloid Imaging meeting and in more detail at the American Academy of Neurology conference, both in Toronto (see ARF related news story).
“These data are excellent,” said Chris Rowe of the University of Melbourne. Other scientists’ informal hallway comments ranged from enthusiasm that the availability of an approved agent to image a key AD pathology would transform the clinical trial landscape, to more guarded optimism. Some scientists cautioned that even if the data withstand FDA review, it remains to be seen whether insurers will pay for amyloid imaging, and noted that this measure may add more value to detecting early cases for research purposes than to routine diagnosis in the community.
Florbetapir also appears useful for predicting whether seniors with mild cognitive impairment are likely to decline further, as reported in a separate talk at ICAD by Reisa Sperling of Brigham and Women’s Hospital in Boston. She and colleagues reported follow-up data on a florbetapir Phase 2 study. The scientists have data thus far on 138 of 184 cognitively normal, newly diagnosed MCI and AD participants in the study. After getting their brains imaged with florbetapir at baseline, the participants had their symptoms assessed every six months through phone interviews, and returned for clinical and neuropsychological evaluation at 18 months. None of the healthy seniors worsened noticeably within that timeframe, regardless of brain amyloid status. However, in the MCI group, a greater proportion of amyloid-positive subjects progressed to AD than did amyloid-negative MCI patients, suggesting that amyloid imaging using florbetapir may help identify those at risk for progressive cognitive decline. In his own talk, Rowe broadly reviewed ongoing longitudinal assessments of various MCI cohorts imaged with several of the amyloid imaging agents currently in development at his center and elsewhere. The overall trend there, too, was that MCI patients who have amyloid in their brain go on to meet an AD diagnosis over the next two to four years, while amyloid-free MCI patients generally do not.—Esther Landhuis.
Gift of Water, bronze sculpture by Shige Yamada, 1997, at the main entrance of the Hawai'i Convention Center. Representing a spring offering water, it celebrates the Hawaiian people's generosity and sense of goodwill to newcomers. Image credit: Chris Hass
Tomm40 has created a stir in Alzheimer disease research with the recent proposal that variable-length polymorphisms of this gene, which lives near ApoE on chromosome 19, can help predict at what age a person may develop late-onset AD (LOAD) (Roses, 2010; Lutz et al., 2010). This finding came from analysis of several small cohorts in which ages of LOAD onset were determined retrospectively. Now, new work on Tomm40 (aka translocase of the outer mitochondrial membrane 40) confirms the age-of-onset connection in a small prospective study of cognitively normal adults who went on to develop mild cognitive impairment (MCI) or AD. Scientists reported the preliminary findings at the International Conference on Alzheimer’s Disease (ICAD) held 10-15 July in Honolulu, Hawaii, along with other data showing that the Tomm40 length variants also correlate with brain atrophy and cognition in asymptomatic middle-aged people. If the results hold up, they could explain why some ApoE3/E3 homozygotes, a supposedly risk-neutral group, have LOAD risk that parallels that of E4 carriers, and may improve stratification of participants in future clinical trials.
Plowing through phylogenetic analyses, researchers led by Allen Roses and Michael Lutz of Duke University in Durham, North Carolina, and Eric Reiman of Banner Alzheimer’s Institute in Phoenix, Arizona, came upon a poly-T variant within intron 6 of Tomm40 that greatly improved predictions for when ApoE3 carriers might develop AD. In particular, among autopsy-confirmed ApoE3/4 patients, those with two copies of the long Tomm40 variant (more than 20 poly-T repeats)—aka the “long/longs”—developed AD about eight years earlier than the “short/longs,” who had a copy of the short Tomm40 variant (20 or fewer poly-T repeats) along with the long version (Roses et al., 2009; see also ARF related news story). In this earlier study, the researchers analyzed several independent cohorts of patients whose LOAD onset ages were documented in medical records.
Because retrospective data can be unreliable, the scientists sought to reproduce those findings in prospective studies of people with known ApoE and Tomm40 status who are being followed with neuropsychological testing for future development of MCI or AD. On an ICAD poster, Richard Caselli of Mayo Clinic, Scottsdale, Arizona, and colleagues including Reiman and Roses, reported preliminary data from 30 participants in the first of several prospective studies in progress for five to 19 years. In short, the results came out as predicted: the “long/long” group developed incident MCI or AD about nine years earlier than the “short/longs” (onset age 73 versus 82). The cohort was too small to correct for ApoE genotype, Caselli noted, but the earlier age of onset in the long/longs did hold for both ApoE3/4 (n = 10) and ApoE3/3 (n = 11) subgroups.
The Tomm40 length variants also seem to track with other defining measures of AD—namely, brain atrophy and cognition. These preliminary studies involved participants of a longitudinal cohort study called WRAP (Wisconsin Registry for Alzheimer’s Prevention) that started in 2001 under the leadership of Mark Sager at the University of Wisconsin in Madison. Participants, around a mean age of 54, enter the study asymptomatic and get cognitive testing every few years. Some also receive brain imaging through ancillary studies led by Sterling Johnson, also at the University of Wisconsin. Forty-six percent of the subjects are ApoE4-positive. Mining the data on 1,400 study participants, the researchers uncovered differences in white matter (measured by diffusion tensor imaging), brain activity (measured by functional magnetic resonance imaging of AD-relevant areas such as hippocampus), and certain measures of learning. Somewhat surprisingly, “the differences were based on whether or not their parents had AD,” Johnson said in his ICAD talk. “ApoE wasn’t really giving us all the explanatory power we needed. So we looked for other genetic and lifestyle factors that might predict [the parental history connection].”
Puzzling over these findings, which were reported last fall at the Clinical Trials on Alzheimer’s Disease meeting in Las Vegas (see ARF conference story), the researchers recalled the recent buzz over Tomm40 and wondered whether Tomm40 length variants might help tease out the differences they had seen related to family history. Johnson focused on E3 homozygotes because of their curious bimodal distribution on AD risk charts. Though E3 has historically been regarded as the risk-neutral ApoE variant, in reality, there is a subgroup of E3 carriers who seem just as prone to AD as people with the high-risk E4 allele. Johnson’s team analyzed 120 healthy E3/3 WRAP participants (mean age 57), assessing their Tomm40 status and measuring gray matter volume in the ventral posterior cingulate and precuneus (brain regions affected early in AD) by structural MRI. Comparing participants with two “short” Tomm40 alleles to those with two “long” Tomm40 alleles, the researchers found that the latter had lower gray matter volume in the analyzed brain areas.
Sager and colleagues analyzed more than 700 asymptomatic WRAP participants (mean age 54) with a family history of AD, and similarly compared short/short and long/long subgroups for cognitive differences. Consistent with their greater brain atrophy, the long/long subjects, regardless of ApoE genotype, did worse on several measures of the Auditory Verbal Learning Test.
If confirmed in larger samples, the findings may be “very important to explain why some E3/3s develop AD at earlier ages,” said Yadong Huang of Gladstone Institute of Neurological Disease at the University of California, San Francisco. Among E3 homozygotes, about a quarter have the long/long Tomm40 genotype that confers greater AD risk.
The new data may also hint at possible synergistic effects between ApoE and Tomm40 at mitochondria, which help maintain synapses and falter in early AD. Huang and colleagues have shown that proteolytic fragments of ApoE, which form more commonly from E4 than E3, interact with neuronal mitochondria, throwing off membrane potential and contributing to cytoskeletal structures that contain phosphorylated tau (Chang et al., 2005). Tomm40 is a mitochondrial membrane protein needed for shuttling proteins into the organelle. “If Tomm40 causes problems, then when the ApoE fragment comes in, that might make it even worse,” Huang speculated.
Whether and how the Tomm40 poly-T variants influence mitochondrial function to begin with remain unclear. Because they are intronic, the polymorphisms do not affect Tomm40’s protein sequence and have yet to demonstrate effects on expression, leaving in question their biological effect in neurons, suggested John Hardy of University College London, U.K., in an e-mail to ARF. In his view, it seems more likely, for now, that LOAD risk variability derives from ApoE promoter polymorphisms that govern expression of ApoE (Lambert et al., 2002; Lambert et al., 1997). Hardy noted that this mechanism plays out in another disease, where missense variants in complement factor H have been shown to influence gene expression and predisposition to macular degeneration (Li et al., 2006).
Still, the ICAD data suggest the Tomm40 length variants “clearly have some effect—especially on E3/3s, where there are no confounding effects due to E4,” Huang told ARF. Whether those effects involve synergism with ApoE remains to be seen. On the one hand, studies with transgenic mice that express human E4 have shown that E4 alone can drive cognitive decline. That suggests to Huang that E4 messes with learning and memory independent of Tomm40, since mice are unlikely to have the same Tomm40 length variants that have been studied in people. In collaboration with Roses, Huang hopes to do the converse experiment—that is, put the human Tomm40 “long” allele into transgenic mice with or without ApoE4—to see whether Tomm40 effects require E4.
In the meantime, Roses has submitted an application to the U.S. Food and Drug Administration for a prevention trial in which the Tomm40 genotype would serve as a key criterion for selecting high-risk patients to test an investigational AD drug. The trial, called Opportunity for the Prevention of Alzheimer’s (OPAL), would enroll cognitively normal seniors from the ages of 60 to 87 and judge, based on age and Tomm40, whether they are at high or low risk for developing AD in the next five years. Most low-risk participants would go into the placebo arm while the high-risk group is randomized to receive drug or placebo. In this manner, the five-year study would, as Roses hopes, serve a dual purpose: validate Tomm40 as a genetic marker, and test the ability of an investigational AD drug to delay LOAD onset. The trial could start next year, Roses told ARF.—Esther Landhuis.
Disorientation. Memory loss. Postmortem plaques and tangles. These clinical and pathological features define the conventional notion of Alzheimer disease. However, in recent decades, the meaning of AD has undergone a transformation. AD is moving from a disease diagnosed with tests initiated once patients complain of memory loss, to one where risk of future disability is predicted in asymptomatic people by neuroimaging, fluid biomarkers, and data modeling done on networked computers. In short, AD diagnosis is moving from the bedside to the desktop, suggested Jason Karlawish, a bioethicist at the University of Pennsylvania, Philadelphia, in a plenary talk that offered a humanistic perspective to set the stage for the detailed basic and clinical science presentations at the International Conference on Alzheimer’s Disease (ICAD) held 10-15 July in Honolulu, Hawaii.
Karlawish laid out clinical and policy implications of this changing view of AD, concluding with a plea to clinicians not to let stacks of brain images and test scores overshadow the patients and caregivers. One area where rapport with patients is particularly critical—and often lacking—is clinical testing of potential therapeutics. To help speed up trials that are becoming increasingly technology-intensive, the Alzheimer’s Association, which sponsors the annual ICAD meetings, has launched TrialMatch™. This is a free and confidential online and phone-based system that helps match interested participants with local research studies. In this newest age of AD, this is how patients get invested and become part of a shared effort to steer the age of desktop medicine toward better treatments.
In his plenary, Karlawish cited a 1999 paper by researchers at Mayo Clinic, Rochester, Minnesota (Petersen et al., 1999) that, in his view, ushered in the most recent era of AD. Previously, AD was a model of bedside medicine—“a careful clinical history initiated with a chief complaint and concluding with anatomic examination of the brain,” Karlawish said. Nowadays, AD is no longer characterized by images of static pathology, but instead by charts depicting how a person’s risk of future disease changes with time.
Karlawish explored implications of the evolving meaning of AD by looking at other risk-factor diseases. Consider dyslipidemia. People with this condition have high levels of the so-called “bad” (aka low-density lipoprotein, or LDL) cholesterol, which puts them at elevated risk for future heart attack. Diagnosis is easy. Physicians sitting at a desktop computer can simply enter patient data (age, gender, total cholesterol, and so forth) into a Web-based risk calculator, estimate the person’s 10-year risk of having a heart attack and, importantly, prescribe medications that can reliably lower this risk. This latter criterion has not yet been met for AD. Another key feature of risk-factor conditions that AD lacks is a “discrete clinical event” that is part and parcel of how one defines the disease. The metrics that define AD are “not as clear as a bone fracture or heart attack or stroke,” Karlawish said. Given AD’s insidious onset, it can be hard to define endpoints for clinical trials that test potential AD treatments.
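For readers curious what such a “desktop” calculation looks like under the hood, here is a minimal sketch of a logistic risk model of the general kind Karlawish described. The coefficients are made-up placeholders for illustration only; they are not the Framingham equation or any validated calculator.

```python
import math

# Generic shape of a 10-year risk calculator of the kind described above.
# All coefficients are illustrative placeholders, NOT a validated risk equation.
ILLUSTRATIVE_COEFFS = {
    "intercept": -7.5,
    "age": 0.06,                 # per year of age
    "male": 0.5,                 # indicator (1 = male, 0 = female)
    "total_cholesterol": 0.004,  # per mg/dL
    "hdl": -0.02,                # per mg/dL
    "smoker": 0.6,               # indicator (1 = current smoker)
}

def ten_year_risk(age, male, total_cholesterol, hdl, smoker, coeffs=ILLUSTRATIVE_COEFFS):
    """Return an illustrative 10-year event probability from a logistic model."""
    z = (coeffs["intercept"]
         + coeffs["age"] * age
         + coeffs["male"] * male
         + coeffs["total_cholesterol"] * total_cholesterol
         + coeffs["hdl"] * hdl
         + coeffs["smoker"] * smoker)
    return 1.0 / (1.0 + math.exp(-z))

# Example: a 60-year-old male smoker with total cholesterol 240 mg/dL and HDL 35 mg/dL
print(f"Illustrative 10-year risk: {ten_year_risk(60, 1, 240, 35, 1):.1%}")
```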
Nevertheless, as AD heads toward a risk model, decisions about who should and should not receive treatment may get prickly. “Why should someone who is just below the threshold of risk of persons treated in the trial, not have access to the treatment?” Karlawish said. At a more personal level, the risk factor approach presents challenges because people tend to be “very present-biased in our decision making,” Karlawish said. “We prefer to enjoy a tasty, high-fat snack and consequent weight gain and risk of diabetes, to a less pleasurable, future-oriented activity such as exercise to reduce the risk of diabetes.”
Karlawish closed by describing how economic debates over costs of risk assessments and treatment over many years could come to “replace the heart-wrenching stories of suffering patients.” He appealed to attendees not to let this happen—to be “unified in our resolve that we do not forget the patients who started us on this journey into this (most recent) age of AD.” (For more, see Karlawish plenary text).
Indeed, patients are indispensable for advancing AD research, perhaps nowhere as poignantly as in the realm of clinical trials. This is where TrialMatch™ comes in. “If we have the treatment out there, sitting in a test tube, but we can’t get it into people to test its efficacy, then the FDA will not approve it,” said Ron Petersen of the Mayo Clinic in Rochester, Minnesota, in an ICAD press briefing. At least 50,000 volunteers (with and without AD) are needed for more than 100 trials currently underway to test some 90-100 experimental AD compounds, Petersen said. Reisa Sperling sees patients and directs AD clinical trials at Brigham and Women’s Hospital in Boston. In her experience, “we have a huge number of studies available for every stage of AD, and yet all of them are slow to enroll.” She blames lack of awareness on the part of both physicians and families for this problem. “We need to get trial information to [primary care physicians] in a way they can use when they have only seven minutes with a patient in their offices,” Sperling said.
As for patients and caregivers, even those keen on participating in research often have no idea how to start looking for such opportunities. TrialMatch™ at present contains about 150 trials, including all ongoing AD trials in the U.S. government’s ClinicalTrials.gov repository, as well as a few others, said William Thies of the Alzheimer’s Association, which created the confidential website and phone system. By providing basic information including their age, gender, where they live, education level, clinical diagnosis, and medication use, patients can get connected with trials that may be a good fit for them. In addition to the online format, the service offers matching over the phone (1-800-272-3900) from 7 a.m. to 7 p.m. U.S. Central Time, Monday through Friday. Based on patient profiles and trial eligibility criteria, the association will contact potential participants about trial options and connect them with experts at those sites. Beyond the fringe benefits of research participation—i.e., access to cutting-edge research, close monitoring of health—participants gain from simply knowing they are being proactive. “Being a part of trials is really a heroic act,” Thies said. “It’s contributing your essential self.”—Esther Landhuis.
Amyloid plaques and neurofibrillary tangles of tau protein are the two classic hallmarks of Alzheimer disease, but the connection between their two respective proteins—Aβ and tau—has remained mysterious. Now for the first time, a paper appearing July 22 in Cell details a molecular mechanism that links tau to Aβ toxicity at the synapse. Researchers led by Jürgen Götz and Lars Ittner at the University of Sydney, Australia, show that tau has a previously unknown role in the dendrite. Tau targets the Src kinase Fyn to the N-methyl-D-aspartate (NMDA) receptor, these authors report. This allows tau to mediate Aβ-induced excitotoxicity at the synapse. When tau is deleted or mistargeted in an AD model mouse, survival and memory improve to wild-type levels, although plaque burden and Aβ levels do not change. The same group shows, in a July 19 PNAS paper, that hyperphosphorylation of tau in a tau mouse model can be successfully treated with sodium selenate, leading to rescue of memory and motor performance and reduced neurodegeneration. Both findings suggest promising new tau-based strategies for the treatment of dementias. Lars Ittner presented these data on July 15 in the very last session of the International Conference on Alzheimer’s Disease in Honolulu, Hawaii, where a diminished crowd of diehards gave it a favorable reception.
“I am very enthusiastic about th[is] paper for several reasons,” Lennart Mucke of the University of California San Francisco, wrote to ARF (see full comment below).
During the last decade, researchers led by Mike Hutton, then at the Mayo Clinic in Jacksonville, Florida, and Jürgen Götz, who at the time worked with Roger Nitsch at the University of Zurich, showed that Aβ can worsen tau pathology, suggesting that Aβ acts upstream of tau (see ARF related news story on Götz et al., 2001 and Lewis et al., 2001). Aβ is known to have excitotoxic effects in people and in animal models (see Amatniek et al., 2006; Palop et al., 2007 and ARF related news story; and Minkeviciene et al., 2009 and ARF related news story). The story took another leap forward when researchers led by Erik Roberson and Lennart Mucke at the University of California in San Francisco tied tau to excitotoxicity by showing that the removal of tau protein in an AD mouse model protected neurons from Aβ and other excitotoxic insults (see ARF related news story on Roberson et al., 2007). Nonetheless, it was not clear how tau mediated excitotoxicity.
One clue came from the fact that tau protein contains a binding site for Fyn kinase. Fyn mills around at the post-synaptic density in wild-type mice, where it phosphorylates the 2b subunit of the NMDA receptor (NR2b). This strengthens the interaction of the NMDA receptor with the post-synaptic density protein 95 (PSD-95), and leads to excitotoxic downstream signaling. Overexpression of Fyn increases Aβ toxicity (see Chin et al., 2004 and Chin et al., 2005).
First authors Ittner and Yazi Ke looked for Fyn in a tau knockout (KO) mouse, and found it to be reduced by two-thirds at the synapse. This indicates that tau plays an important role in targeting Fyn to the synapse, although some Fyn arrives at the synapse independently of tau. Ittner and colleagues then generated a transgenic mouse that expresses a truncated version of the tau protein (Δtau74) under a neuronal promoter. The truncated version lacks microtubule-binding domains and cannot form aggregates, but includes the amino-terminal projection domain with its binding site for Fyn kinase. Truncated tau localizes to the membrane of the cell body, but is not present in dendrites, and so is incapable of targeting Fyn to the synapses themselves.
Ittner and colleagues found that in the Δtau74 transgenic mouse, Fyn was down by three-quarters at the synapse, despite the presence of endogenous tau. It turns out that truncated tau acts as a dominant-negative mutation by competing with endogenous tau to bind Fyn and mistarget it. As evidence of this, in the Δtau74 mouse, co-immunoprecipitation with Fyn mostly pulls down truncated tau, not endogenous tau. As might be expected with less Fyn at the synapse, in both Δtau74 and tau-null mice there was less phosphorylation of NR2b, and fewer NR subunits co-immunoprecipitated with PSD-95, indicating a weaker interaction. Importantly, Δtau74 and tau-null mice were less susceptible to seizures, which result from overstimulation. Despite these changes, synaptic currents were normal in both tau mutant strains.
The authors then looked at what effect these changes in tau might have on AD by crossing the two tau strains with an AD mouse model (APP23), both independently and in combination. Both tau deletion and the Δtau74 transgene independently improved the memory of APP23 mice to wild-type levels. Both double-crosses also survived longer than the APP23 mice, which have a premature mortality phenotype; in fact, the combination of the Δtau74 transgene with endogenous tau deletion fully rescued survival. The tau crosses also reduced excitotoxicity in the APP23 mice, decreasing the severity of seizures. Significantly, Aβ levels and plaque load were unchanged in these double-crosses, indicating that tau acts downstream of Aβ. Importantly, Ittner and colleagues used a different tau KO strain and a different AD mouse strain than the 2007 study by Roberson et al., and yet they saw the same effect, demonstrating that this finding is robust and not dependent on a particular mouse strain.
These results implied that the NMDA receptor-PSD-95 interaction is a crucial feature of Aβ-induced excitotoxicity. To test this, the authors made use of a small peptide, Tat-NR2B9c, which has been shown to interfere with the NMDA receptor-PSD-95 interaction and is already known to reduce excitotoxicity in a mouse model of ischemia (see Aarts et al., 2002). When primary neuronal cultures from wild-type mice were treated with this peptide, the neurons became more resistant to cell death induced by Aβ treatment. The authors then used osmotic mini-pumps to infuse the peptide into APP23 mice for eight weeks. Treated mice had fewer seizures, their memory improved, and their survival returned to near wild-type levels, even several months after treatment.
The results suggest several therapeutic possibilities, Götz said. Reducing tau levels can improve symptoms in AD mouse models, and therefore might be beneficial in people. Another exciting avenue might be to treat with a peptide or, better yet, small molecule that disrupts the NR2b-PSD-95 interaction, or with the tau projection domain, since these interventions weaken excitotoxicity without interfering with normal synaptic transmission. It is especially intriguing that a narrow therapeutic window of peptide treatment led to long-term protection, Götz said, and one of the more fascinating questions he intends to pursue is what might be the biological basis of that window. Other questions include discovering how Aβ acts to exert toxicity. Does it act extracellularly or intracellularly? The authors would also like to investigate whether the interaction between tau and PSD-95 is direct or indirect, Götz said.
Previous research had shown that Aβ can accelerate an existing tau pathology, but “these new findings show that Aβ toxicity is dependent on the presence of tau, and provide a molecular mechanism for that,” Götz said. The authors have also demonstrated a critical role for tau in dendrites, in contrast to the traditional conception of tau as an axonal protein. The findings are not in disagreement with previous work on tau, Götz said. “I believe there are different cellular compartments of tau, because tau most likely is not free; it exists always bound to something.” The majority of tau is bound to microtubules, and when it becomes hyperphosphorylated, it detaches from microtubules and forms tangles in cell cytoplasm (see Geschwind, 2003). By contrast, the dendritic pool of tau is small, Götz said.
In the second paper, the authors focused on hyperphosphorylated tau, rather than dendritic tau. First author Janet van Eersel used two tau mutant mouse models: pR5 mice that develop neurofibrillary tangles (NFTs) at six months, and K3 mice that develop parkinsonism and memory impairment. The authors showed that treatment with the small compound sodium selenate reduced tau phosphorylation and eliminated NFTs in both tau mouse models, in vitro and in vivo. Selenium is a crucial trace element in brain. Some forms, such as sodium selenite, are associated with toxicity; however, the authors found no toxic effects from sodium selenate, a more oxidized form of selenium, after four months of treatment.
Protein phosphatase 2A (PP2A) is a major phosphatase responsible for tau dephosphorylation, and both the level and activity of PP2A are down in the AD brain. The authors found that selenate treatment greatly increased the amount of PP2A that co-immunoprecipitated with tau, implying that selenate stabilizes the tau-PP2A complexes, allowing the phosphatase to more readily dephosphorylate tau. To test this idea, van Eersel and colleagues crossed the pR5 mouse with the Dom5 transgenic mouse, which expresses a dominant-negative form of PP2A, and demonstrated that selenate treatment was no longer able to reduce tau phosphorylation and NFTs.
Since sodium selenate mitigates tau pathologies in several tau model mouse strains, it is a promising compound for drug development, Götz said. “It amazes me how this compound works, and you see it works on several levels,” he said, explaining that it not only reduces tau phosphorylation and tangle formation, but also ameliorates motor deficits in the K3 mice and memory impairment in the pR5 mice, and prevents neurodegeneration of cerebellar basket cells.—Madolyn Bowman Rogers.
If β amyloid is indeed an early driver of Alzheimer disease pathogenesis, AD patients presumably churn out the peptide too quickly, clear it too slowly, or both. These possibilities have remained speculative, though, without in-vivo human data on Aβ’s comings and goings in the central nervous system (CNS). Now, scientists have measured real-time Aβ turnover in the cerebrospinal fluid (CSF) of AD patients and age-matched controls in the same study. As reported at the International Conference on Alzheimer’s Disease (ICAD) held 10-15 July in Honolulu, the AD patients kept pace with control subjects in generating Aβ, but didn’t get rid of it fast enough. Meanwhile, emerging evidence has highlighted an intriguing connection among Aβ, dementia, and sleep, which is hypothesized to provide a nightly period of Aβ reduction, and several other ICAD studies seem to bear this out.
Several years ago, Randall Bateman, David Holtzman, and coworkers at Washington University School of Medicine, St. Louis, Missouri, developed a method using in-vivo labeling of proteins to quantify production and clearance of Aβ in people. The scientists collected CSF samples from each subject every hour for 36 hours, immunoprecipitated out the Aβ, and used liquid chromatography and mass spectrometry to measure amounts of freshly made Aβ relative to the person’s total CNS Aβ pool. Using this approach to quantify Aβ dynamics in healthy volunteers, the St. Louis team found that the peptide gets turned over at a steady clip, with 7 to 8 percent of the CNS Aβ pool made and cleared every hour (Bateman et al., 2006 and ARF related news story). More recently, Bateman and colleagues applied the technique to determine that a single dose of a candidate AD drug could curb Aβ production 50 to 90 percent over 12 hours (Bateman et al., 2009 and ARF related news story).
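For readers unfamiliar with how a clearance rate falls out of such labeling data, the sketch below shows one simple way to estimate a fractional clearance rate from the decline of the labeled-Aβ fraction over time, assuming first-order (exponential) clearance. The sample values are invented for illustration; this is not the Bateman lab’s analysis pipeline.

```python
import math

# Hypothetical hourly measurements of the labeled fraction of CSF Abeta during the
# clearance phase of a SILK-style labeling study. Values are invented for illustration.
hours = [24, 26, 28, 30, 32, 34, 36]
labeled_fraction = [0.120, 0.103, 0.089, 0.077, 0.066, 0.057, 0.049]

# Under first-order kinetics, f(t) = f0 * exp(-k*t), so ln(f) is linear in t
# and the slope of that line is -k, the fractional clearance rate (per hour).
x = hours
y = [math.log(f) for f in labeled_fraction]
n = len(x)
x_mean = sum(x) / n
y_mean = sum(y) / n
slope = (sum((xi - x_mean) * (yi - y_mean) for xi, yi in zip(x, y))
         / sum((xi - x_mean) ** 2 for xi in x))

fractional_clearance_rate = -slope
print(f"Estimated fractional clearance rate: {fractional_clearance_rate:.1%} per hour")
# With these invented values, the estimate comes out near the 7-8 percent per hour
# reported for healthy adults.
```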
At ICAD, Bateman presented preliminary data from a study in which he measured Aβ production and clearance rates in 12 mild AD patients (CDR 0.5 or 1), comparing them with 12 age-matched healthy controls. Aβ production rates (Aβ42 and Aβ40) were essentially the same for both groups, hovering just under 7 percent per hour. Aβ42 and Aβ40 clearance rates in healthy volunteers fell just above 7 percent per hour, consistent with measurements from prior publications using the in vivo labeling technique. However, the AD patients cleared Aβ42 about 30 percent more slowly, and their Aβ40 clearance was down 25 percent, relative to controls.
The data suggest that in AD, “the primary processing impairment is one of clearance,” Bateman said. Another takeaway is “the magnitude of the change—30 percent,” he added, noting that the field has long wrestled with how much one needs to modulate Aβ production or clearance to make a difference. On a broader level, the new findings should guide future mechanistic studies. Identifying aging-related changes that stymie Aβ clearance would help “get at how we might prevent or mitigate effects of those changes to delay amyloidosis,” Bateman told ARF.
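As a rough illustration of why a clearance change of that size matters, consider a simple one-compartment picture (an assumption for intuition only, not a model presented in the talk): Aβ is produced at a constant rate P and cleared in proportion to the pool size C. At steady state the pool equals P/k, so a 30 percent drop in the clearance rate constant k, with production unchanged, raises the steady-state pool by roughly 43 percent:

```latex
% One-compartment steady-state illustration (assumed model, for intuition only)
\begin{align}
\frac{dC}{dt} = P - kC \quad\Rightarrow\quad C_{\mathrm{ss}} = \frac{P}{k},
\qquad
\frac{C_{\mathrm{ss}}^{\,\mathrm{AD}}}{C_{\mathrm{ss}}^{\,\mathrm{ctrl}}}
= \frac{P/(0.7\,k)}{P/k} = \frac{1}{0.7} \approx 1.43 .
\end{align}
```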
A recent study by Holtzman and colleagues suggests that sleep may have something to do with CNS Aβ levels. Using microdialysis to track extracellular Aβ in the brains of living mice, the scientists found that Aβ levels fall during sleep, and that chronic sleep deprivation drives up plaque formation in Tg2576 AD transgenic mice (Kang et al., 2009 and ARF related news story).
A poster presented at ICAD suggests this could also hold true in people. First author Ricardo Osorio, Centro Alzheimer de la Fundación Reina Sofía, Madrid, Spain, and colleagues led by senior investigator Blas Frangione, New York University, analyzed 86 cognitively normal adults and found that those reporting poor sleep quality had reduced CSF Aβ42 or Aβ42/Aβ40 (biomarkers for brain amyloid burden) relative to the sound sleepers. In addition, a study by Kristine Yaffe of the University of California, San Francisco, found that dementia-free older women with sleep-disordered breathing were more likely to develop mild cognitive impairment or dementia within five years than those without the sleep disorder. This backs large epidemiological studies suggesting that sleep complaints drive up a person’s risk of developing AD at follow-up (Lobo et al., 2008).—Esther Landhuis.
For those who had all but turned their noses up at the dearth of big clinical trial news at the International Conference on Alzheimer’s Disease, held 10-15 July 2010 in Honolulu, Hawaii, a Hot Topics presentation offered a whiff of fresh air. Suzanne Craft of the University of Washington, Seattle, presented data from a four-month Phase 2 study of intranasal insulin, which seemed to improve certain measures of cognition and daily function, as well as biomarker profiles, in patients with mild cognitive impairment (MCI) or early Alzheimer disease.
Known for its role in managing diabetes, insulin also appears important for aging brains, Craft reminded attendees. For starters, the hippocampus, entorhinal cortex, and frontal cortex—brain areas important for cognition and hit hard in AD—all have a fair share of insulin receptors. Furthermore, insulin readily reaches the brain, where it helps neurons manage their glucose usage and maintain synapses. A few studies (e.g., van der Heide et al., 2005) suggest that insulin can modulate long-term potentiation, a change in synaptic strength that underlies learning and memory. More directly relevant to AD, insulin promotes Aβ trafficking out of cells and regulates levels of insulin-degrading enzyme, which also breaks down Aβ, Craft said. And in AD patients, Craft and others have reported reduced levels of cerebrospinal fluid (CSF) insulin and less brain insulin at early stages of disease, i.e., between Braak stages zero-I and II-III (Rivera et al., 2005).
These findings fueled an early pilot study in which insulin, delivered intranasally with a nebulizer, boosted CSF insulin levels within 30 minutes and improved memory in young, healthy adults. Importantly, the treatment did not affect plasma glucose or insulin levels, suggesting that the nasal delivery targeted the brain while averting systemic side effects. Craft and colleagues then took the treatment into disease populations. In a double-blind, placebo-controlled study of 24 amnestic MCI and early AD patients, three weeks of daily treatment with 20 IU (international units) intranasal insulin improved verbal recall without affecting peripheral glucose or insulin in treated participants (Reger et al., 2008 and ARF related news story).
In Honolulu, Craft presented data from a four-month trial of 104 patients with amnestic MCI or early AD. Aptly named SNIFF-120 (Study of Nasal Insulin to Fight Forgetfulness), this double-blind Phase 2 trial was funded by the National Institute on Aging. Participants were randomized into three roughly equal groups that received saline placebo or insulin (20 or 40 IU) through the nose each day, and underwent cognitive testing at baseline, two months, and four months. A small subset got brain scans using fluorodeoxyglucose positron emission tomography (FDG-PET) to measure glucose utilization, and spinal taps to check CSF biomarkers. Primary outcome measures were four-month change on cognitive (ADAS-Cog, delayed story recall) and functional (ADCS-ADL, Dementia Severity Rating Scale) tests. CSF and FDG-PET profiles served as secondary measures.
The placebo group declined by about 1.5 points on the ADAS-Cog, which was within expectation for a four-month timeframe, Craft said. For comparison, a recent meta-analysis of 87 double-blind, placebo-controlled AD trials showed ADAS-Cog changes of 1.44 at six months and 4.13 at one year (see ARF ICAD 2008 story). The insulin-treated participants did not decline on the ADAS-Cog and showed net improvement on the functional tests. The low-dose (20 IU) insulin group also improved their scores in the delayed recall test. As for the eight placebo and 18 insulin-treated participants who underwent spinal taps, the pooled low- and high-dose insulin group had lower CSF Aβ40/Aβ42 ratios, which correlate with lower AD risk, compared to participants on placebo. Daily insulin also seemed to protect treated patients (n = 24) from the four-month reduction in glucose utilization seen on FDG-PET in the placebo group (n = 16).
The team is in the process of applying for NIA funding for a longer, larger Phase 3 trial of intranasal insulin, Craft told ARF.
On the basic science front, other studies presented at ICAD added to the emerging connection between insulin and regulation of both Aβ and tau. Ewan McNay, University at Albany, State University of New York, in collaboration with Craft, reported that rat models of type 1 and type 2 diabetes show impaired production, accumulation, and clearance of Aβ. In other studies with wild-type rats, McNay and colleagues reported that small Aβ oligomers known as ADDLs (Aβ-derived diffusible ligands) may mediate these effects. Hippocampal injection of synthetic ADDLs into wild-type rats impaired performance on a spatial memory task and disrupted glucose transporter translocation and insulin signaling. Evidence for a potential role for Aβ oligomers also appeared on a poster by Fernanda De Felice, Federal University of Rio de Janeiro, Brazil, and colleagues. In cultures of rat hippocampal neurons, the researchers found that—similar to what occurs in type 2 diabetes—ADDLs trigger abnormal phosphorylation of insulin receptor substrate-1 and that c-Jun N-terminal kinase might be responsible. Cheng-Xin Gong and colleagues at the New York State Institute for Basic Research on Staten Island found depressed insulin-PI3K-AKT signaling in the brains of patients with AD and type 2 diabetes. In quantitative Western analyses of postmortem brain tissue, these researchers correlated the downregulation of insulin-PI3K-AKT signaling components with calpain-1 overactivation and abnormal tau phosphorylation.
All told, the recent intranasal insulin trial data, along with emerging evidence from in vivo and in vitro studies, show that a growing number of scientists are ferreting out ties between insulin signaling and Alzheimer disease.—Esther Landhuis.
TDP-43 was an unknown in the neurodegeneration field until four years ago, but since then has received plenty of attention. Linked to both amyotrophic lateral sclerosis and frontotemporal dementia, the misbehaving protein took its share of the limelight with more than a dozen talks and posters at the International Conference on Alzheimer’s Disease, held 10-15 July 2010 in Honolulu, Hawaii.
Perhaps the biggest take-home message from the meeting: The TDP-43 field is slowly accumulating a handy collection of animal models (see ARF related news story). But those models vary widely in their design—some overexpression, some knockouts, some knockdowns—and yield varying results. Meanwhile, stress—or rather, an affected cell’s poor ability to handle it—was another theme that bubbled up amid the wave of disparate TDP-43 data shown at ICAD.
Jada Lewis and colleagues from the Mayo Clinic College of Medicine in Jacksonville, Florida, surfed in with multiple presentations on new mice expressing human TDP-43. Bettina Schmid of the Ludwig Maximilians University in Munich, Germany, made a splash with her zebrafish loss-of-function model. Worms, too, swam in with TDP-43 data from Brian Kraemer of the University of Washington in Seattle and Peter Ash from the Mayo Clinic in Jacksonville, Florida. Blair Leavitt, from the University of British Columbia in Vancouver, Canada, sailed through the TDP-43 marina with a talk about progranulin knockout mice. These may provide a useful model for TDP-43 study since progranulin mutations are linked with frontotemporal dementia (FTD). And Ben Wolozin of Boston University visited the TDP-43 beach with a discussion of an upcoming publication on that protein’s association with stress granules. Kraemer and others cited Wolozin’s presentation as one of the most interesting of the meeting.
Scientists first identified a connection between TDP-43 and neurodegeneration in 2006 (see ARF related news story on Neumann et al., 2006) with news of its presence in inclusion bodies. The link received another boost two years later with the discovery of TDP-43 mutations in some people with amyotrophic lateral sclerosis (ALS; see ARF related news story on Sreedharan et al., 2008 and Gitcho et al., 2008). In disease, the normally nuclear protein, which is involved in RNA processing, moves into the cytoplasm and forms inclusions. Some evidence indicates that cleavage by caspases produces a toxic carboxyl-terminal fragment (see ARF related news story on Zhang et al., 2009). However, the mechanism for TDP-43 pathology remains uncertain. “The question is still open if TDP-43 inclusions and mutations are loss or gain of function,” observed Christian Czech of F. Hoffman-La Roche in Basel, Switzerland, in an e-mail to ARF after taking in TDP-43 presentations in Honolulu. Here, ARF rounds up TDP-43 news from ICAD.
TDP-43 Proteinopathy Spreads
TDP-43 inclusions are common in ALS and FTD, but researchers are finding that the protein’s reach extends even further than that. Frederic Calon of Université Laval in Québec City, Canada, examined insoluble TDP-43 in brain tissue from people who had mild cognitive impairment or Alzheimer disease. A few of the MCI samples, and the majority of AD tissues, evinced increased insoluble TDP-43 in the parietal cortex. Accumulated amyloid-β and phosphorylated tau tended to accompany the increase in insoluble TDP-43. Calon found similar pathology in triple-transgenic AD model mice, though only in aged ones (18 months old). “It’s possible that TDP-43 is also a neuropathological marker of AD,” Calon told ARF.
Lisa Taylor-Reinwald, working with Nigel Cairns at Washington University in St. Louis, Missouri, took that thought further with a study of brain tissue from people who had been cognitively normal at their time of death. In seven out of 50 samples, she observed abnormal TDP-43 staining coincident with amyloid-β and tau pathology. TDP-43 pathology might precede symptoms but comes after amyloid-β accumulation, Taylor-Reinwald suggested to ARF. However, she cautioned that a cross-sectional pathology study such as hers cannot, by itself, clearly delineate what happens first.
TDP-43’s Next Top Model: Candidates Strut Their Stuff
To understand TDP-43’s function from the start of disease to its end, many researchers are turning to animal models. “Everybody is trying to get that [TDP-43] mouse,” Taylor-Reinwald noted, and Honolulu featured several presentations related to such mouse models—not to mention worms, fish, and even monkeys. Mayo’s Lewis, along with Ashley Cannon and Yafei Xu, presented soon-to-be-published mouse models, made in collaboration with Leonard Petrucelli, also at the Jacksonville Mayo Clinic. The mice express wild-type human TDP-43. The extra protein, they found, caused TDP-43 truncation, a cellular increase in ubiquitin, abnormal mitochondrial aggregates surrounding the nucleus, and reactive gliosis. These mice suffered neurodegeneration and had trouble walking. They died young, and the more excess TDP-43 they expressed, the shorter they survived. However, they did not exhibit TDP-43 inclusions.
Lewis’s mice “look interesting,” Kraemer said. At ICAD, Kraemer discussed his own TDP-43 knockout mice (see ARF related news story on Kraemer et al., 2010), as well as new data from the study of nematodes. His C. elegans expressing wild-type TDP-43 suffer mild neural dysfunction, but no neurodegeneration. With the mutant gene, Kraemer observed neurodegeneration and toxicity. However, he found no strong evidence that this toxicity arose from TDP-43 aggregation or truncation by caspases. Either the process in worms is different from that in mammals, which is possible, or caspases are not a player in it, Kraemer told ARF. Ash presented recently published data from C. elegans, suggesting that TDP-43 causes neurotoxicity via two mechanisms: action of the full-length, nuclear protein and aggregation of cytoplasmic fragments (Ash et al., 2010).
Takanori Yokota, of Tokyo Medical and Dental University in Japan, brought news of a potential non-human primate model. He used a viral vector to express the wild-type human TDP-43 gene in the spinal cord of macaques, and found that this diverted TDP-43 to the cytoplasm, as it does in human disease.
However, most of those models share a flaw, Schmid told ARF in an e-mail: they rely on TDP-43 overexpression. “Too much TDP-43 is toxic in every species analyzed thus far, and might not have anything to do with the disease,” she wrote. Yet researchers have struggled to develop models with reduced TDP-43, in part because the protein is essential for embryonic development. Other models, such as animals heterozygous for a TDP-43 knockout, showed no phenotype (Sephton et al., 2010; Wu et al., 2010).
At the meeting, Schmid presented the first vertebrate loss-of-function model to show a neuronal phenotype. Zebrafish carry two TDP-43 orthologs; knockdown of both—via a new technique based on zinc finger nucleases—caused motor neuron dysfunction in embryos that survived for only six days. The fish embryos also had circulatory problems—the heart was beating, but no blood was moving through it. Schmid hopes to rescue this phenotype by introducing human TDP-43 mRNA and then test whether various TDP-43 mutants can do the same. In addition, this model provides the opportunity to search, in vivo, for RNAs that are TDP-43 targets and relevant to disease, Schmid told ARF. The knockout approach was clever, and the analysis careful, Czech commented.
Other researchers are working with animals that have normal TDP-43 genes but might still offer information on how this protein goes rogue. Leavitt’s mice are one example. They are conditional progranulin knockouts that suffer subtle abnormalities in social behavior, long-term potentiation, and synaptic spine density, Leavitt told ARF. TDP-43 fans were interested because progranulin mutations can give rise to FTD with TDP-43 pathology. “My guess is these mice will be very important for studying progranulin-TDP-43 interaction,” said Salvatore Oddo of the University of Texas Health Science Center in San Antonio. However, the mice have not shown any TDP-43 pathology up to nine months of age, Leavitt said. Leavitt also provided a technical tip for researchers interested in progranulin: The Santa Cruz N-19 antibody is not specific for progranulin immunocytochemistry in mouse brain, he wrote in an e-mail to ARF.
Another TDP-43-related model comes from the flies of J. Paul Taylor and colleagues at St. Jude Children’s Research Hospital in Memphis, Tennessee (Ritson et al., 2010). In this study, the researchers generated Drosophila carrying mutations in valosin-containing protein (VCP). VCP mutations are variously associated with FTD, Paget’s disease of bone, and inclusion body myopathy. In a genetic screen, the Memphis group found that VCP interacts with three RNA-binding proteins, including TDP-43. If either TDP-43 or VCP was mutated, TDP-43 redistributed to the cytoplasm, causing cytotoxicity. “The toxicity initiated by mutations in VCP is mediated, at least in part, by TDP-43,” Taylor told ARF. He suggested that TDP-43 may be part of a common mechanism for neurodegeneration, not simply a secondary pathology, in diverse diseases.
TDP-43: Misplaced, Miscleaved, Misaggregated
Besides the numerous animal models discussed at ICAD, several scientists presented in-vitro data. For example, Janet van Eersel, working with Jürgen Götz of the University of Sydney, Australia, is developing a cell culture model for TDP-43 proteinopathy by treating a variety of cell types with proteasome inhibitors. With this route of disposal blocked, TDP-43 aggregation and fragmentation increased, as they do in disease, suggesting that proteasome dysfunction may play a role in the proteinopathy. This simple treatment could be useful for studying TDP-43 activity and dysfunction in culture. Yongjie Zhang of the Mayo Clinic in Jacksonville is also working on a cell culture model for ALS and FTD. He created a stable human neuroblastoma line with inducible expression of the carboxyl-terminus of TDP-43. That model exhibits TDP-43 phosphorylation and aggregation in ubiquitin-positive inclusions.
Stressed-Out Cells
Wolozin presented data from an upcoming paper linking TDP-43 and stress granules. An association between TDP-43 and stress granules in cell culture has been shown before, but Wolozin has now also discovered TDP-43 in stress granules in human brain tissue. Other researchers were unable to find the association in human tissue (Colombrita et al., 2009), he suggested, because it is a weak signal easily buried by background autofluorescence. Graduate student Liqun Liu-Yesucevitz used the dye Sudan black to quench background fluorescence, and was then able to detect TDP-43 in stress granules in brain samples from people who had had ALS or FTD.
The Boston researchers determined that TDP-43 connects with stress granules both directly and indirectly. The protein binds stress granule proteins such as TIA-1, and it also interacts with the mRNA in the complexes. Liu-Yesucevitz explored the location of four TDP-43 mutants in cells under stress. “The mutations all showed dramatic two- to threefold increases in the number of inclusions that formed, and the number of cells that actually formed inclusions,” Wolozin told ARF. “Perhaps most dramatic was the extent to which the mutant TDP-43 left the nucleus and went to the cytoplasm.” It is rare to find consistently abnormal activity among many TDP-43 mutants, but this study did, Wolozin said. Hence, the work suggests that an altered response to stress may be a key part of mutant TDP-43 pathology. Notably, Christian Haass, of Munich’s Ludwig Maximilian University, was also at ICAD presenting data on stress granules and FUS, another protein associated with ALS (see ARF related news story on Dormann et al., 2010).
For his part, Anthony White of the University of Melbourne in Australia also discussed the role of stress in TDP-43 proteinopathy. His group has found that changes in cellular metal levels, as well as oxidative stress, can alter the processing of TDP-43 (Caragounis et al., 2010). Cellular stress factors like these are common in neurodegenerative disease, and they somehow lead to TDP-43 fragmentation and relocalization to cytoplasmic aggregates. “These findings indicate that in sporadic cases of ALS, changes to TDP-43 may occur through neuronal stress,” White wrote in an e-mail to ARF.
Antonella Caccamo presented her work on TDP-43 and autophagy, conducted with Oddo at the University of Texas in San Antonio (Caccamo et al., 2009). These researchers found that blocking autophagy increased levels of the TDP-43 carboxyl-terminal fragment. When transfected into cells, this fragment then recruited extra TDP-43 to the cytoplasm. However, treatment with rapamycin—which bolsters autophagy—rescued the effects of the fragmented protein. The experiments were all in vitro, Oddo cautioned. If they hold up in vivo, the work would suggest that the carboxyl-terminus of TDP-43 is important for pathology, and that improving autophagy could counter its effects. On the last morning of ICAD, Oddo presented more data on rapamycin treatment in a different neurodegenerative paradigm, where driving autophagy in this way improved Aβ- and tau-related endpoints in mouse models (Caccamo et al., 2010).
The conference gave TDP-43 researchers plenty to think about, but little to be certain of. Major questions remain, Schmid noted. First and foremost among them, she wondered: “What is the physiological function of TDP-43, especially in neurons? Is the disease a gain of toxic function of the aggregates or oligomers, or a loss of function in the nucleus?”—Amber Dance.
Over the years, hopes of using neural stem cells to patch wide swaths of failing brain circuitry in Alzheimer disease have met mostly with skepticism. However, new data from several independent studies suggest these approaches may have a fighting chance, and that their heroics rest less on making new neurons than on boosting glial cells and neurotrophins. At the International Conference on Alzheimer’s Disease (ICAD) held 10-15 July 2010 in Honolulu, Hawaii, Maria Grazia Spillantini, University of Cambridge, U.K., reported that neural precursor cells, which become astrocytes and oligodendrocytes after transplantation, prevent cortical neurodegeneration in a mouse model of human tauopathy. These findings, just published in the July 28 Journal of Neuroscience, struck a chord with recent work by the lab of Frank LaFerla, University of California, Irvine, who spoke at ICAD on the latest tweaks to his approach using neural stem cells to replenish synapses and restore cognition in AD transgenic mice. Meanwhile, a strategy based on a neurotrophic peptide reverses memory loss and drives neurogenesis in the same AD strain (3xTg) and in a mouse model of Down syndrome, according to studies presented by Inge Grundke-Iqbal and colleagues at the New York State Institute for Basic Research, Staten Island. The time seems ripe to focus on regenerative approaches, particularly those involving glial cells, as the first clinical trial using human embryonic stem cells gained approval from the U.S. Food and Drug Administration last week (see The New York Times story). The experimental therapy will use oligodendrocyte precursors to treat patients with spinal cord injury.
On the preclinical front, Spillantini and colleagues began exploring neuroprotective strategies after discovering age-related, cortical cell loss in a tauopathy mouse model they had created and characterized previously as having a largely motor phenotype (Allen et al., 2002). These transgenic mice express mutated (P301S) human tau and, at two to three months of age, start to develop a range of motor phenotypes (e.g. poor grip, crossed hind limbs, tremor, muscle weakness) that become severely disabling by five to six months. At this point, P301S mice have lost half their motor neurons and show extensive aggregation of hyperphosphorylated tau (Delobel et al., 2008). However, it was initially unclear whether the massive neuronal death extended into cortical regions, prime areas of destruction in human frontotemporal dementias.
Using cresyl violet and NeuN staining to count neurons in the cerebral cortex of P301S transgenic and age-matched wild-type mice, first authors David Hampton and Daniel Webber, of the University of Cambridge and now at the University of Edinburgh, found no appreciable cell loss in the tau transgenics at two months. However, three-month-old P301S mice had pronounced neuronal death, and “very few cells left” by five months, Spillantini told the ICAD audience. While this mimics human disease, “no one knows how the cells died,” she said. “In human studies, all you have is end-stage tissue to analyze.”
In the P301S mice, however, the researchers were able to stain cortical tissue with the phosphorylation-dependent anti-tau antibody AT8 at various time points to track pathological changes accompanying the cell loss. They saw ring-like tau staining at two months, increased cell body and dendritic signal at three months, and many tau inclusions by five months. Among neurons with ring-like tau deposits, some died before tau tangles appeared, while others didn’t seem to form tangles at all, Spillantini reported.
The tangle-free cells seemed salvageable, prompting the scientists to test whether a cell-based approach could curb the neuronal demise in the tauopathy model. After all, recent rodent studies suggest that astrocyte precursors can help in motor neuron disease (Lepore et al., 2008 and ARF related news story; Yamanaka et al., 2008). Moreover, LaFerla’s lab has used neural stem cells to improve memory in 3xTg AD mice (Blurton-Jones et al., 2009 and ARF related news story). This mouse line develops both amyloid and tau pathologies without obvious motor deficits, making it amenable to cognitive tests that require motor function.
Spillantini and colleagues tried a similar strategy in the P301S tauopathy mice, injecting fluorescent neural stem cells into their cortex at two months of age, and analyzing the transplant region one or three months later. The stem cell transplants brought neuron counts in the tau transgenics up to wild-type levels, as judged by NeuN staining at both time points. Glial cells seemed to mediate these effects, as the vast majority of transplanted cells did not become neurons but instead differentiated into astrocytes and oligodendrocytes. This jibes with LaFerla’s study, which found hardly any neurons among the progeny of the stem cells transplanted into 3xTg mice. Furthermore, when Spillantini and colleagues cultured neural stem cells and pushed them toward the astrocytic lineage in vitro, they found that transplantation of the pre-differentiated astrocytes prevented cell death in the tauopathy mice just as the stem cell transplants had done before.
Consistent with other studies demonstrating bystander effects of neuroprotection (Nagahara et al., 2009 and ARF related news story), cells derived from the transplanted neural precursors expressed high levels of glial cell-derived neurotrophic factor (GDNF) and ciliary neurotrophic factor (CNTF) mRNA, as judged by quantitative PCR (qPCR). Brain-derived neurotrophic factor (BDNF) levels looked normal by qPCR, but were elevated, along with those of nerve growth factor, in immunohistochemical staining of brain tissue from stem cell-transplanted mice. Other studies, including LaFerla’s stem cell transplantations in 3xTg mice, have identified BDNF as a mediator of neuroprotection. Directly injecting the neurotrophin into the brains of transgenic mice helped them do better on spatial memory tests, whereas BDNF-depleted stem cells failed to improve the animals’ cognition (Blurton-Jones et al., 2009 and ARF related news story).
At ICAD, LaFerla and first author Mathew Blurton-Jones presented, in separate talks, the latest developments in this ongoing work. The researchers had shown that the neural stem cells could rescue cognition one month after transplantation, but wondered whether the neuroprotection would hold three to six months later and beyond, as the mice continued racking up brain Aβ. In other words, “if you don’t get rid of that amyloid, is that a bad thing? Does it matter? We’d been thinking it’s a bad thing,” LaFerla told ARF. Studies to determine if the cognitive benefits wane after the first month post-transplantation are ongoing.
In the meantime, acting on a hunch that the benefits may fade with time, the scientists have stably expressed neprilysin, an Aβ-degrading enzyme, in neural stem cells. This gives the cells a one-two punch, as they not only make a neurotrophic factor, but also churn out a protein that helps slow amyloid pathology within transplanted animals. In preliminary studies, the neprilysin-expressing stem cells markedly reduced plaque load in aged 3xTg mice. Knowing the approach reduces Aβ, the team has begun a new set of longitudinal behavioral studies to ask whether the neprilysin provides additional benefit or longer-lasting effects on cognition in the AD mice, Blurton-Jones told ARF. Importantly, the effects so far seem to extend beyond the transplanted region, opening the door in the future for introducing neuroprotective factors via the periphery rather than brain injections. “If the neprilysin works, you may be able to modify mesenchymal cells or blood cells to express neprilysin and hopefully have that circulate through the brain,” LaFerla speculated.
Julie Blanchard, working with Grundke-Iqbal, described a pharmacological approach for neuroregeneration in mice that does involve peripheral injection, though not of engineered cells but of a peptide that spurs neuronal differentiation of endogenous progenitor cells. Grundke-Iqbal’s lab has synthesized a brain-permeable 11-mer (aka peptide 6) corresponding to the active region of CNTF and shown previously that it enhances memory in wild-type mice (Chohan et al., 2009). At ICAD, Blanchard reported that daily intraperitoneal injection of this peptide for six weeks restored not only short- and long-term memory, but also neurogenesis, and dendritic and synaptic plasticity, in seven- to eight-month-old 3xTg AD mice. Like LaFerla’s stem cell studies in the same mouse strain, the CNTF peptide had no effect on Aβ or tau pathology, Blanchard said. Her 3xTg findings were accepted last week for publication in the journal Acta Neuropathologica.
On a poster, Grundke-Iqbal reported that 30-day slow release of the peptide in the form of subcutaneous pellets was able to restore neurogenesis and cognition in 11- to 15-month-old Ts65Dn Down syndrome mice. Adult neurogenesis also goes awry in the Ts1Cje mouse model of Down syndrome, which has fewer neural progenitors and an excess of astrocytes, according to a report published July 28 in PLoS One (Hewitt et al., 2010). Taken together, these recent studies “further support the notion that trophic-based therapies should be aggressively pursued,” Blurton-Jones said.—Esther Landhuis.
Updated 5 August 2010
Following a media call hosted by the Alzheimer's Association, The New York Times ran a second story on the revised diagnostic guidelines.
Every newspaper editor knows it: dare tinker with the established look and feel of the paper, and some readers’ initial response will be irritation, even vituperation, no matter how carefully the modernization was done. Something similar—though on a deeper, more serious issue—happened to perhaps the biggest news story of the International Conference on Alzheimer’s Disease, hosted 10-15 July 2010 at the Hawai’i Convention Center in Honolulu. There, three expert groups convened by the National Institute on Aging and the Alzheimer’s Association presented to the assembled research and clinical community draft results of their ongoing, year-long effort to incorporate scientific advances of the past quarter-century into a revision of the current diagnostic criteria for Alzheimer disease (AD). Published in 1984, these criteria predate much of what’s now known about imaging and biomarker research, about genetics, related diseases, and, indeed, the molecular pathophysiology underlying early AD. They are widely used and have never been formally updated.
In Honolulu, a panel of leading scientists addressed nearly 1,000 of their colleagues in the main lecture hall, and then spoke with attending reporters. For its part, the Association had issued a press release and made the revised criteria—all 30 pages of them—freely available to reporters and the public. Even so, the story spun out of control. Some press reports implied, erroneously, that biomarker testing would triple the number of diagnoses starting this fall, driving up costs while needlessly upsetting people who would never get dementia and for whom nothing could be done anyhow. The issue prompted public concern among Alzheimer disease scientists who question the amyloid hypothesis, among medical practitioners, and among psychiatrists who question the wisdom of attempting a diagnosis when the person feels no “dis-ease,” i.e., is still at ease.
What happened? Most of all, what were the scientists really trying to say? And what do their colleagues think of the new criteria?
The scientific workgroups had broken the task of revising diagnostic criteria into three parts. One group, led by Guy McKhann of Johns Hopkins University in Baltimore, Maryland, took on Alzheimer’s dementia. A second group, led by Marilyn Albert, also of Hopkins, dealt with mild cognitive impairment due to AD, and a third, led by Reisa Sperling of Harvard Medical School, tackled preclinical AD. Stating that the underlying Alzheimer disease process advances continuously, starting with an asymptomatic period that lasts a decade or perhaps even longer, the three sets of new guidelines each incorporate biomarker measurements. The earlier in the disease process one goes, the more heavily the guidelines rely on biomarkers. The workgroups posted them on the Web, and invite fellow clinicians to provide feedback throughout the month of August, before the guidelines are published in a peer-reviewed journal.
The key point of misunderstanding lay in the fact that the biomarker portions in the criteria for all three phases of AD are strictly for research purposes. At this point and for some years to come, the biomarker-driven criteria serve as a conceptual framework for testing and to facilitate therapeutic trials in secondary prevention. They are not meant for community physicians any time soon.
Can You Hear Me?
This was said explicitly. “I want to put a clear bright line between those criteria intended for general use by clinicians, and those criteria intended to be used as research criteria. These research criteria are not even available to us as we go over to the clinic to diagnose! This is hugely important with respect to how the world hears this issue,” Steven DeKosky of the University of Virginia told both scientists and the media. “We want to get out the message that we are trying to find ways to diagnose and treat earlier.” Alas, this was not universally heard. Some news stories mixed up what’s ready for prime time and what’s not, and lumped the recommendations for clinical use in with those for research use. And truth be told, the threesome of presentations at ICAD left this a bit vague. While the preclinical criteria are meant entirely for research, and were unequivocally presented as such, there was less clarity at ICAD about the criteria for the MCI and dementia phases of the disease. For those two, the portion of each set of criteria dealing with biomarkers is also intended primarily for research, whereas their clinical and cognitive portions were indeed written to be broadly applicable in clinics around the country.
“The clinical and cognitive parts of the AD dementia and MCI criteria are for community use now. But throughout all three sets of criteria, anything that has to do with biomarkers has to be evaluated in research,” Albert told ARF.
Those esteemed readers who missed the hullabaloo may want to grab a cup of coffee and catch up on how the issue played in news reports and in the blogosphere before they read on for a summary of the actual ICAD presentation in Honolulu. Here’s a sampling: One article widely cited as accurate appeared in Medscape Medical News; another, which caused head-scratching among scientists and drew a slew of angry reader comments, appeared in The New York Times. This was followed by a New York Times Op-Ed. Coverage appeared on ABC News, CBS News, Forbes.com (see also guest reply below the blog), and blogs by clinicians, for example, The Health Care Blog. Some of the coverage prompted Maria Carrillo from the Alzheimer’s Association to clarify the purpose of the draft criteria on CNN. And the Association is holding a media briefing today to do so again.
Resistance to these biomedicine-inspired diagnoses stirred in the psychiatric community as well, for example, in an article in Psychiatric Times. In an invited independent comment, Allen Frances, professor emeritus at Duke University School of Medicine in Durham, North Carolina, called the guidelines suggested at ICAD a “dreadful mistake”; see comment below. Frances led the American Psychiatric Association’s DSM-IV task force. In an earlier article about unintended consequences of changing diagnostic criteria in mental illnesses, Frances had similarly critiqued the upcoming DSM-V, which has proposed its own separate set of draft diagnostic guidelines for AD. (If this link pulls up a registration wall, Google “Allen Frances, a warning sign”.) In a response on behalf of the DSM-V’s psychosis group, William Carpenter of the University of Maryland School of Medicine in Baltimore notes, “The field is moving towards early detection, secondary prevention, and more robust therapeutic results,” echoing a similar trend in the AD field.
Taken together, public criticism voiced so far argues against moving toward a pathophysiology-inspired definition of AD in the absence of a cure. But some criticism also comes from the opposite direction. For some researchers in the AD field, the revisions did not go far enough, though most scientists find common ground around the concept that early detection in a research setting is a means towards finding better drugs. For more, see Part 2 and Part 3 of this series.—Gabrielle Strobel.
This is Part 1 of a three-part series. See Part 2 and Part 3.
At the International Conference on Alzheimer’s Disease held 10-15 July 2010 in Honolulu, three workgroups of clinicians from the U.S. and Europe appointed by the National Institute on Aging and the Alzheimer’s Association offered new guidelines for the diagnosis of AD, making them freely available for download and feedback. The guidelines span a conceptual continuum beginning with a preclinical stage—and they sparked intense reactions in the media and medical circles (see Part 1). Confused? Perhaps a detailed summary of exactly what was actually presented at ICAD will help inform the debate.
The basics have been widely publicized and are available through the draft documents themselves. In brief, in the summer of 2009, Creighton Phelps of the National Institute on Aging set up three workgroups. The initiative grew out of ongoing discussion at prior meetings convened by the Alzheimer’s Association, which itself was spurred in part by the publication of a widely noticed paper redefining early AD (Dubois et al., 2007). In essence, these European-American authors defined a large subset of the heterogeneous group of people who would otherwise be diagnosed as having mild cognitive impairment as indeed having prodromal AD. Their method combined a cognitive test probing attention and delayed word learning with an AD marker such as MRI brain volumetry, PET imaging, or spinal fluid biochemistry. Only people who do not have prodromal AD by way of these tests are then considered to have the clinical syndrome of MCI. These research criteria are being used in Europe and in some U.S. therapeutic trials.
How do the criteria of Dubois et al. and the new NIA/AA sets relate to each other? “Our three workgroups attempted to be more detailed and comprehensive. Our criteria span the entire disease continuum. They contain guidance for community physicians who have no access to neuropsychology or biomarkers, for clinicians who have some access, as well as for tertiary care settings where these markers are being studied and clinical trials conducted,” said Phelps.
Because it was the third set of criteria—for preclinical AD—that touched off the lion’s share of controversy, this news account continues with a detailed transcript of the presentation by its leader, Reisa Sperling of Harvard Medical School in Boston. It will conclude with highlights of the ensuing scientific discussion at ICAD for all those who were unable to be there (Part 3).
The Preclinical Criteria—A Work in Progress
Phelps introduced Sperling as having the hardest task because her workgroup was charged with breaking new ground. On behalf of her group, Sperling told the audience: “We think it is time to define this stage because converging evidence suggests that the pathological process of AD begins years, perhaps more than a decade, prior to the diagnosis of dementia. This long preclinical phase provides a critical opportunity for potential intervention with disease-modifying therapy. We need to elucidate the link between the pathophysiological disease process and the emergence of the clinical syndrome further and determine the best predictors of clinical decline.” The draft research criteria, she said, aim to enable exactly this research. The workgroup dealt with the factors that predict decline to MCI and dementia due to the brain pathology of AD, even though many other factors clearly intersect with the AD pathway, for example, cerebrovascular and Lewy body diseases. But to facilitate the design of secondary prevention trials, the workgroup looked for factors they considered central to the core AD process in the decade prior to MCI and dementia symptoms, Sperling said.
What to call the stage before MCI? Asymptomatic? Presymptomatic? Latent? Pre-MCI? “We did some arm wrestling about that but then decided to use the term ‘preclinical.’ This term captures the continuum from an asymptomatic individual to one with very early symptoms that do not rise to the level of MCI,” Sperling said.
Evidence of this preclinical decade is mounting, though it is not ready for broad clinical use, Sperling stressed. It includes data from brain imaging, CSF assays, and other biomarkers that point to AD pathologic changes in vivo. Epidemiologic and cognitive research, too, have pointed to risk factors and subtle cognitive changes years before a person meets criteria for MCI.
Acknowledging controversy around that concept, Sperling encouraged the field to look beyond its own borders to other areas of medicine, where the concept of a preclinical stage of disease finds wide acceptance. Examples include carcinoma in situ, or coronary artery disease detected on cardiac catheterization. Quite often, symptoms are not required to diagnose disease. For example, renal insufficiency or liver cirrhosis can be detected by blood tests, and subsequent treatment can then prevent the emergence of symptoms. Regarding concerns about false-positive test results, Sperling noted that in hypercholesterolemia, not all individuals later develop atherosclerosis; even so, blood cholesterol testing and control are still considered useful.
One glaring difference from those diseases is that AD lacks proven disease-modifying drugs. But this is the point of the research diagnosis: it may be more feasible to develop such drugs in people with a biomarker-based diagnosis of preclinical AD than in people with mild to moderate AD as per the 1984 NINDS/ADRDA criteria.
The other big knowledge gap yet to fill in AD is that scientists lack widely reproduced, definitive evidence linking the pathologic process to the emergence of symptoms, Sperling said. This is the leading edge of ongoing research in many laboratories at present (e.g., ARF Toronto story, ARF St. Louis imaging story, ARF St. Louis biomarker story, ARF Toronto abstracts). The working hypothesis holds that Aβ accumulation is an early inciting event that is necessary but may not be sufficient to cause the clinical syndrome of AD, Sperling said. Multiple factors—possibly different ones in different people—mediate the relationship from AD pathology to clinical manifestation.
Evidence to date in this research area suggests that clinically “normal” people who have brain amyloid as per PET imaging or CSF Aβ42 measurement tend to have dysfunctional synapses in AD-relevant areas on functional MRI and FDG-PET, as well as elevated CSF tau/phospho-tau and a thinning of the cortex and atrophy of the hippocampus as seen with MRI. Studies are beginning to report subtle decrements in cortical connectivity by BOLD fMRI and in cognitive performance. This is early data, Sperling acknowledged; it needs to be followed up longitudinally and confirmed independently. Limiting caveats at this point include that many of the data come from highly select, or self-selected cohorts that over-represent people with advanced education or concerns about memory or family history. Also, the current Aβ biomarkers likely reflect monomeric or fibrillar Aβ, not its oligomeric forms.
Still, the overall data gave the workgroup confidence to devise draft operational research criteria for preclinical AD. They offer a common language, and a framework, for researchers who conduct longitudinal natural history studies to determine whether the presence of Aβ, either alone or with markers of neurodegeneration, predicts cognitive decline. One step beyond that, Sperling said, is that the criteria can support clinical trials of candidate disease-modifying agents in this phase of AD. The criteria start from the twin postulates that AD represents a sequence of biological events that begins far in advance of clinical dementia with cerebral amyloidosis and that having this biomarker affects a person’s future clinical decline and responsiveness to treatment.
Before laying out the group’s criteria themselves, Sperling again emphasized that they are not intended for routine clinical use in the near future. For that to become appropriate, further research must first determine that the requisite biomarkers indeed predict an individual person’s future clinical decline. For practical reasons, the research criteria cut the spectrum of preclinical AD into discrete stages, because stages can be more easily tailored to the goals of a given study, Sperling said. For example, some FDA clinical trials might enroll people in a late preclinical stage to ensure rapid progression to MCI within the duration of the trial, whereas other therapeutic strategies might be most successful if started earlier, many years prior to MCI.
With this qualified introduction, Sperling then unveiled the draft criteria. Stage I represents asymptomatic amyloidosis, whereby people have low CSF Aβ42 or elevated brain amyloid by PET imaging, but normal cognition for their age and education. Stage II is still defined by normal cognition but requires an added AD-like pattern of abnormality on downstream markers. These could be synaptic dysfunction as per FDG hypometabolism or functional connectivity MRI, increased CSF tau/phospho-tau, or MRI findings of cortical thinning or atrophy in the hippocampus or entorhinal cortex. By Stage III, symptoms start to encroach. This can be evident in the form of subtle cognitive decline over time but still within the normal range, or by low performance on certain sensitive cognitive measures. Or it could be a person’s subjective complaint that memory is slipping, which stays below the threshold for MCI.
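For readers who think in code, the staged logic Sperling laid out can be captured in a short, purely illustrative sketch. The flags, field names, and function below are hypothetical stand-ins, not part of the draft criteria, which define the underlying biomarker measures and cut points in far more detail.

```python
# Purely illustrative sketch of the three draft preclinical stages described above.
# The flags are hypothetical; the draft criteria specify how amyloid, downstream
# neurodegeneration, and subtle cognitive change are actually measured.
from dataclasses import dataclass
from typing import Optional


@dataclass
class ResearchParticipant:
    amyloid_positive: bool           # low CSF Abeta42 or elevated amyloid on PET
    neurodegeneration: bool          # FDG hypometabolism, elevated CSF tau/p-tau,
                                     # or MRI cortical thinning/hippocampal atrophy
    subtle_cognitive_change: bool    # subtle decline or complaint below the MCI threshold


def preclinical_stage(p: ResearchParticipant) -> Optional[int]:
    """Return the draft preclinical AD stage (1-3), or None if amyloid-negative."""
    if not p.amyloid_positive:
        return None      # the staging scheme begins with cerebral amyloidosis
    if not p.neurodegeneration:
        return 1         # Stage I: asymptomatic amyloidosis
    if not p.subtle_cognitive_change:
        return 2         # Stage II: amyloidosis plus downstream markers, cognition still normal
    return 3             # Stage III: subtle cognitive change, not yet MCI
```

Note that in this reading, a person without amyloid evidence falls outside the scheme altogether, which mirrors the workgroup's postulate that cerebral amyloidosis marks the start of the biological sequence.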
Sperling said that these stages may change once alterations prior to Aβ become more widely documented; at present, some cohort studies in ApoE4 carriers suggest that synaptic dysfunction may actually precede Aβ markers. On the practical side, the field at large has to clear the significant hurdle of standardization. It has to set cutoffs and validate all biomarkers involved. One such initiative is underway (see story on Quality Control Initiative), and initial data were reported in a separate talk at ICAD, where, indeed, one entire session dealt with the challenge of how to standardize biomarkers and brain imaging. The current draft criteria can actually facilitate the standardized collection of new biomarker data, Sperling noted. On a different front, the cognitive measures suitable for tracking progression toward MCI are not fully worked out; studies to establish them must take care to recruit cohorts that represent a cross-section of the general population.
Sperling ended her presentation with a note of urgency for the field to get started, because definitive studies to prove or disprove these hypotheses may take a decade. The risk/benefit equation of cognitively normal people taking potential disease-modifying drugs, with their likely side effects, has been intensely discussed in general terms for some years. To be able to move beyond the abstract and weigh this issue quantitatively, the field needs to obtain a closer numerical grip on how likely a given person is to progress to MCI, Sperling added. Finally, she pointed to initiatives that are already planning secondary prevention trials in preclinical populations at various levels of risk, for example, DIAN, API, and the ADCS’s A4 trial.
Compared to the preclinical criteria, the proposed criteria for Alzheimer disease dementia sparked little controversy. They, too, are described in downloadable PDFs, so this story only highlights the main points in brief. Guy McKhann of Johns Hopkins University in Baltimore, Maryland, who already led development of the widely used current criteria (McKhann et al., 1984), told the ICAD audience that research since then has laid bare some shortcomings of these criteria. For example, the long pre-symptomatic and the middle MCI phase were poorly understood at the time, as were the deficits in cognitive domains such as executive function, visuospatial function, and language that are quite common in AD. Nothing at all was known about biomarkers and little about other dementing conditions such as dementia with Lewy bodies or corticobasal degeneration, to pick but two examples. Awareness of early onset AD has rendered the age cutoffs largely obsolete, some genetic information is available, and the category of “possible” dementia has been difficult to pin down in practice all along, McKhann said.
The new guidelines retain the core of the previous dementia diagnosis, i.e., that there has to be evidence of a performance decrement in two domains and of a progressive decline. “In their essence, the old criteria were, and still are, extremely durable,” said Steven DeKosky of the University of Virginia School of Medicine in Charlottesville. The new guidelines refine existing categories to incorporate new knowledge in the research areas named above. They are intended for clinics everywhere. Indeed, the workgroup purposely kept them flexible so they would be useful in many countries and cultural contexts, McKhann said. For example, they make it possible to diagnose AD dementia clinically without use of neuropsychology testing or biomarkers in settings where those services are unavailable, or for large multinational clinical trials; they also recommend the use of biomarkers to add certainty to a questionable diagnosis in settings where this is doable.
On the middle phase of the disease process, Marilyn Albert of Johns Hopkins said to the assembled audience that after months of intensive work, the group settled on the term “MCI due to AD.” Terminology is important—and frequently contentious—in efforts to define diagnostic criteria. Gone from the new proposed vocabulary are the words “amnestic MCI (aMCI)” or “conversion,” reflecting consensus that the disease worsens progressively rather than stepping over a discrete threshold.
The criteria for MCI due to AD remain similar to the old MCI criteria in that they require a cognitive concern and impairment in one or more domains. They are different in that they broaden the domains in which the cognitive deficit can occur to now include language, visuospatial function, or executive functions such as reasoning and problem solving, as did the AD Dementia workgroup. They also require a higher level of functional independence. Importantly, the draft criteria incorporate biomarker measurements in various ways for research purposes. They distinguish amyloid evidence—which is considered evidence that Alzheimer’s, rather than a different disease, is ongoing—from downstream measures of structural or functional change. The proposed MCI due to AD criteria fall into three types, with increasing levels of certainty bestowed, in part, by increasing biomarker/imaging data. In short, the criteria focus on what other groups call prodromal AD. They describe other conditions to the extent that they assist the differential diagnosis. The biggest news lies in their use of biomarkers to increase the certainty of the diagnosis.
These draft criteria ultimately are also intended to be helpful throughout the world, not just in tertiary care centers, Albert said. But she added a caution. “Although we are trying to incorporate biomarkers, we still think there is much to be learned about them. Todd Golde, in his plenary lecture here at ICAD, called the use of biomarkers in AD diagnosis a paradigm shift, and we agree. It would be premature to ask all clinicians in community settings to shift toward using them at this point.” The draft criteria will evolve as new biomarker data come in, Albert added.—Gabrielle Strobel.
This is Part 2 of a three-part series. For intro and media reaction, see Part 1. See also Part 3.
A comprehensive proposal of new diagnostic criteria—the first official revision of the current NINDS/ADRDA criteria published a quarter-century ago (McKhann et al., 1984)—marked a highlight at the International Conference on Alzheimer’s Disease held 10-15 July 2010 in Honolulu, Hawaii. The criteria drew mixed responses in the media (see Part 1), some of which missed the point that the more ambitious proposals are meant to guide research and therapeutic studies (see Part 2). Those readers who saw media stories but were not at ICAD, or were in a parallel session, may want to know how the research community at the conference reacted to the presentations. In a packed session, Guy McKhann of Johns Hopkins Medical School in Baltimore, Maryland, introduced the criteria for Alzheimer Disease Dementia, Marilyn Albert of Hopkins summarized criteria for MCI of the Alzheimer’s Type, Reisa Sperling of Harvard Medical School presented criteria for preclinical AD, and Steven DeKosky of the University of Virginia Medical School added clinical perspective. In an open-mike discussion after these talks and in hallway and phone conversations with a reporter afterward, ICAD attendees expressed mostly positive feedback interspersed with a few concerns. Unlike the media, some scientists thought the changes did not go far enough. Below is a summary of views. For further perspective, see commentary below by William Jagust at the University of California, Berkeley, Dave Holtzman of Washington University, St. Louis, and Sperling herself (see comment).
Overall, clinicians from different centers across the world commended the effort to evolve current criteria by defining the entire natural history of AD from its asymptomatic beginning to full-blown dementia. Most welcomed a biomarker-enhanced staging of MCI as opposed to the prior clinical definition. Others welcomed the inclusion of atypical presentations of Alzheimer disease that tend to be overlooked by the current criteria. Some clinicians remarked on the opposite, i.e., that diseases that are not AD sometimes get misdiagnosed as such because, to most non-specialists, AD is the best known of the age-related dementias. These clinician-researchers welcomed the inclusion of biomarkers and genetic tests, for example, for tau or even serum progranulin, to help clinicians avoid labeling people as having AD when in fact they have a different disease. Some noted the need for biomarkers for α-synuclein to help delineate those diseases and to provide a more detailed roadmap to navigate the different forms of dementia they see in their clinics.
Several clinicians particularly welcomed the proposed guidelines’ recognition of mixed pathologies. For example, if a patient showed symptoms of parkinsonism, the revised criteria recommend consideration of dementia with Lewy bodies; if a patient has disinhibition, the criteria recommend looking into frontotemporal dementia. Some clinicians were pleased at the recognition of vascular factors in the proposed guidelines, and called for finer-grained distinctions of vascular and mixed dementias to help them with differential diagnosis.
In the way of constructive criticism, some scientists pointed to confusing terminology that arises chiefly because the three work groups have so far worked largely independently of each other. The groups did not harmonize their language before rolling out the draft criteria. This, some commentators said, creates apparent paradoxes at the borders of the separate stages. For example, the most advanced “preclinical” patients have symptoms, and the most advanced MCI patients are labeled as having “prodromal Alzheimer’s dementia,” but by definition cannot have dementia if they fall into the MCI group. One scientist pointed out that, as per the AD dementia group’s criteria, patients who meet clinical criteria but have had biomarker tests that came back negative are still considered to have “possible AD dementia,” whereas the MCI workgroup places patients with negative biomarker results in a group called “MCI of the neurodegenerative etiology,” where the likelihood that the underlying process is AD is considered low.
Some clinicians pointed to the high cost of a full neuropsychology workup. In response, DeKosky noted to a reporter that as computerized cognitive tests become more validated and widely available, some of that can be done easily by the patients themselves as they wait to be seen.
One theme echoed through conversations with scientists at ICAD and beyond. It is that the set of proposed criteria devised individually for each of three separate phases of AD appears to many to be quite complex. In particular, several scientists picked up on a question Jesse Cedarbaum of Elan Pharmaceuticals in South San Francisco had asked during the scientific session. Cedarbaum complimented the staging system proposed for the preclinical period, and asked if it could be extended across the entire disease continuum to reflect the progressive nature of Alzheimer disease. Cedarbaum noted that in cancer, continuous staging has proven useful, even though cancer, too, comes in many different forms. A paper published recently in the Journal of Nutrition, Health, and Aging formalized this suggestion with a staging scheme stretching from “No clinical or biomarker evidence” (Stage 0) through to “Incapacitating cognitive and functional decline” (Stage 5; see Cedarbaum et al., 2010). Like the criteria of Dubois et al., this proposal, too, is geared primarily toward facilitating clinical trials in early AD.
It is not the first time a continuous staging system has been suggested, others pointed out. Barry Reisberg of New York University and, more recently, David Bennett of Rush University Medical Center, Chicago, have done so before. Several ICAD attendees said that it might deserve another look. “Jesse’s paper proposing a staging system as opposed to three sets of criteria for preclinical, MCI, and AD is, I believe, a better way of thinking of the disease. In fact, that is basically what we do now in our own research studies. We use the clinical dementia rating scale in combination with making a clinical diagnosis in combination with biomarkers,” David Holtzman of Washington University in St. Louis, Missouri, wrote to ARF (for additional comment, see below). For her part, Sperling wrote to ARF: “The idea of a staging system from asymptomatic to early symptoms to MCI to dementia makes sense. Our group tried to operationalize the preclinical stages for research purposes and to lay out a hypothetical model and conceptual framework to test these hypotheses. We might want to consider seven stages overall instead of the five Jesse published, but would need to discuss with the three workgroups before making any specific recommendations” (see full Sperling comment below).
Finally, some commentators noted that multiple groups have established their own diagnostic nomenclature for AD. Some of those are undergoing their own revision. They include the DSM, the WHO’s International Classification of Diseases (ICD-10), and the AAGP. Each uses its own terminology, and they do not at present speak to each other in a formal way to explore whether they could harmonize their language. Of these four sets of diagnostic schemes, the NIA/Alzheimer’s Association one appears the most biologically oriented; the approaches by Dubois et al. and Cedarbaum et al. carry the reliance on biologic markers further still. In an indication of how sensitive this discussion is, most commentators for this story requested anonymity on any disagreement they might have with the proposed criteria, or with the views of their colleagues in psychiatry. But despite quibbles about language and about how many categories there need to be, the overall tenor among AD scientists was resoundingly this: “Like it or not, biomarker-supported early diagnosis is where the field has to go.” What do you think? We value your feedback.—Gabrielle Strobel.
Beyond the tried-and-true, though woefully imperfect lab mouse, Alzheimer disease researchers may have access to a growing repertoire of brainier, brawnier species for their in vivo studies. In particular, vervet and rat models strutted their stuff at the recent International Conference on Alzheimer’s Disease, held 10-15 July at the Hawai’i Convention Center in Honolulu. Cynthia Lemere of Brigham and Women’s Hospital, Boston, reported that Caribbean vervets naturally develop not only pathology but also cognitive decline and the fluid and plasma biomarker changes that mimic human aging and AD. And Terrence Town of Cedars-Sinai Medical Center and the University of California, Los Angeles, presented brand-new data on a transgenic rat that exhibits robust amyloid and tau pathology and a key feature absent from the vast majority of current rodent models—neuron loss. If these data hold up, the new models seem poised to advance AD drug discovery by expanding possibilities for translational research.
Imported from Senegal to the eastern Caribbean island of St. Kitts, the vervets (aka African green monkeys) in Lemere’s study are 96 percent homologous to humans and live some 20 years in the wild, up to 30 in captivity. In an Elan/Wyeth-funded study, aging vervets responded well to immunotherapy with an N-terminal Aβ vaccine; immunized animals had reduced plaque load and improved memory (Lemere et al., 2004 and ARF related conference story). At this year’s ICAD, Lemere presented behavioral and biomarker data from an ongoing longitudinal study on the vervets, as well as extended pathological data. The Boston researchers collaborate with McGill scientists Roberta Palmour and Frank Ervin at the Behavioral Science Foundation in St. Kitts, who maintain a colony of ~1,000 vervets and conduct all behavioral testing. When an animal in the colony dies, they fix or freeze tissue samples and send them to Boston for pathological and biochemical analyses.
For the longitudinal study that began in 2007, scientists in St. Kitts collect blood twice and CSF three times annually from young (five to 10 years), middle-aged (11-15 years), and old (16-26 years) vervets, 10 females and 10 males per group, and ship the samples to Boston for biomarker analysis. With help from Anne Fagan at Washington University School of Medicine, St. Louis, Missouri, Lemere’s group found that in CSF, Aβ42 levels rise as the vervets age, as do scores on an object retrieval test for episodic memory. However, about half of the old animals have dramatic drops in Aβ42 that associate with declining performance on that test, much like what happens in people with preclinical AD. Using a Rules-Based Medicine proteomics assay, the researchers have preliminary data suggesting that plasma levels of several inflammatory proteins (for example, complement C3, IL-1 receptor α, and CD40 ligand), as well as cortisol, may go up with age and correlate with memory loss. Stereological neuronal counts are underway in the basal nucleus of Meynert and hippocampus in aged vervets. Grossly, “it looks likely that some animals show at least some neuron loss,” Lemere wrote in an e-mail to ARF.
Extending previous pathological studies to include 28 vervets (age range 12 to 32 years) to date, “we’ve seen at least some Aβ deposition and vascular amyloid in all animals 17 years and older,” Lemere said. As in people, more plaques had Aβ42 than Aβ40. The plaques build up in frontal and parietal areas and in the temporal cortex, with fewer seen in the hippocampus. There is great variability among animals; some show surprisingly little plaque deposition even into their twenties. Many of the plaques and vascular amyloid contained pyrogluAβ3-42, a post-translationally modified form of Aβ made by truncating the N-terminus and cyclizing its new end (see ARF related news story).
While activated microglia cozied up to many of the compact plaques, hyperphosphorylated tau did so far less often. “Every once in a while, we’ll see some intracellular tau inclusions—possibly tangle-like but more likely just inclusions. And in three animals, we’ve seen a very rare intracellular staining pattern,” Lemere said. “But it’s just in several neurons, not many neurons per brain.” The researchers picked up some phospho-tau in the monkeys’ cerebrospinal fluid (CSF), as well, “but only rarely in old animals,” Lemere reported. “It’s not something we see in mildly impaired or younger animals at all.”
All told, the data suggest that the vervets may “provide a good model for translational research,” Lemere said, noting that several company researchers who heard the talk have approached her about using the vervets for their preclinical studies.
A Transgenic Rat That Has It All?
A clear disadvantage of vervet studies—the cost of setting up and maintaining colonies—could be mitigated with smaller animals that can still recapitulate the full slate of AD pathological and behavioral features. At an ICAD Hot Topics session, Town argued that his TgF344-AD rat may be the first rodent to fit the bill. This AD model is the result of a collaboration with Robert Cohen and postdoctoral researcher Kavon Rezai-Zadeh at the same institutions. It debuted earlier this spring at a Keystone meeting in Copper Mountain, Colorado (see ARF related conference story), and made a splash in Honolulu with additional data on tau pathology and cognitive decline.
Why create a rat AD model? Mouse models do a good job mimicking cerebral amyloidosis, Town said, but many lack tau pathology and most show no appreciable neuron loss. Furthermore, rats are evolutionarily four to five million years closer to humans than mice are and, like people, express all six tau isoforms, Town believes, whereas mice express only three. There is debate about the number of tau isoforms in rats, in part arising from differences in the biochemical protocols used to separate the isoforms on electrophoretic gels; for his part, Town cited a recent paper that he believes supports the presence of all six tau isoforms in the rat (Hanes et al., 2009).
With both transgenes driven by the mouse prion promoter, the TgF344-AD rat makes a triple dose of human amyloid precursor protein with the Swedish mutation (APPswe), and overexpresses exon 9-deleted human presenilin-1 (PS1ΔE9) 13-fold, Town said. Amyloid deposition gets underway around six months of age, and generally precedes tau pathology. Town reported that his team has detected oligomers (i.e., 5-mers), in addition to Aβ monomers, within brains of the transgenic rats. In 16-month-old rats, the researchers found abnormal tau decorating Aβ plaques in the cingulate cortex, as seen by immunogold labeling. Aging rats also racked up increasing amounts of insoluble tau and of the pathogenic tau-associated kinases Cdk5 and GSK3.
It appears these pathological developments could be choking the life out of neurons, or at least correlate with their demise, Town said. Neuron numbers in the hippocampus and cingulate cortex were down 23-38 percent in TgF344-AD rats compared to wild-type controls when determined by blinded manual subfield counting, and as much as 45 percent by stereological counts, Town said. The neurons appear to die by apoptosis in close vicinity of Aβ plaques, as judged by TUNEL and caspase-3 analyses. Furthermore, as shown in behavioral studies not yet complete when Town introduced the TgF344-AD rat at Keystone, deficits in learning and memory start appearing by around six months and intensify through 15-16 months of age. The researchers measured cognition by performance in an open field test and in the Barnes maze.
Because it faithfully reproduces all major AD pathological features, the TgF344-AD rat should become a widely used model for advancing drug discovery, Town said. “The Aβ models are excellent tools, but I’m a bit worried that we haven’t been able to translate any of the therapeutics from mouse models to humans.”—Esther Landhuis.
Guided by the latest biomarker and imaging data, scientists have drafted a new set of diagnostic research criteria redefining Alzheimer disease as a condition that develops—and eventually could warrant intervention—decades prior to obvious symptoms. Most clinicians welcomed the changes, which were proposed last month at the International Conference on Alzheimer’s Disease (ICAD) in Honolulu, Hawaii, but some questioned the benefit of earlier diagnosis while there is yet no way to stop the disease in its tracks (see ARF related news story). Amid this debate looms the critical question of how well biomarkers can predict who among the cognitively normal is heading toward dementia and, eventually, full-blown AD. This report recaps a sampling of ICAD studies that address this issue. By and large, the data suggest that seniors who appear normal on cognitive tests, but nonetheless suspect their memory is off, or who have high brain amyloid or other pathological reads, may already be quietly on the wane.
If amyloid-positive “normals” are, in fact, in the earliest phase of a disease continuum, how might one design studies to assess treatment efficacy in the preclinical AD population? This question motivated a study by Michael Donohue, University of California, San Diego, and colleagues. The researchers divided control participants of the Alzheimer’s Disease Neuroimaging Initiative (ADNI) into two groups—those with high amyloid burden, as assessed by cerebrospinal fluid (CSF) assays or positron emission tomography (PET) using the radiotracer Pittsburgh compound B (PIB), and those without. Among the candidate measures of disease progression, “we wanted to see if the amyloid-positives separate from the amyloid-negatives,” he said, as this would demonstrate the measure’s ability to capture “disease-specific” progression. Indeed, that is what they found. Compared with amyloid-negative research participants, amyloid-positive volunteers had greater hippocampal atrophy, more glucose hypometabolism (a measure of brain function judged by fluorodeoxyglucose, or FDG-PET), and more cognitive deterioration, measured by the Mini-Mental State Examination (MMSE) and Functional Activities Questionnaire (FAQ) scores, over a two-year period, Donohue reported. All told, the findings seem to underscore the usefulness of CSF amyloid measures or in vivo amyloid imaging. Using a single parameter, i.e., brain amyloid load, “you can identify a cohort of normals who show accelerated longitudinal decline on imaging and cognitive measures,” Donohue said, noting that this selection criterion may establish the feasibility of future preclinical AD trials.
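As a rough illustration of the kind of group comparison Donohue described, the sketch below splits a table of cognitively normal participants by baseline amyloid status and compares change on several candidate outcome measures. The file and column names are invented for this sketch; it is not the authors' analysis code, and the actual study used ADNI's own variables and statistical models.

```python
# Hypothetical sketch: compare longitudinal change between amyloid-positive and
# amyloid-negative cognitively normal participants. File and column names are
# invented; this is not the study's actual code.
import pandas as pd
from scipy import stats

df = pd.read_csv("adni_normals.csv")   # hypothetical table of ADNI control data

pos = df[df["amyloid_positive"]]       # flagged by CSF Abeta42 or PIB-PET (hypothetical column)
neg = df[~df["amyloid_positive"]]

for measure in ["hippocampal_volume_change", "fdg_change", "mmse_change", "faq_change"]:
    t, p = stats.ttest_ind(pos[measure].dropna(), neg[measure].dropna(), equal_var=False)
    print(f"{measure}: amyloid+ mean={pos[measure].mean():.2f}, "
          f"amyloid- mean={neg[measure].mean():.2f}, p={p:.3f}")
```

The point of such a comparison, as Donohue framed it, is simply whether the amyloid-positive group separates from the amyloid-negative group on each candidate progression measure.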
For otherwise normal individuals, having a lot of brain amyloid seems to bode ill in other, subtler, ways, as well. The elderly often have trouble matching names to faces, in part because of age-related problems with their default network, a set of brain areas that are active during rest and suppressed when the person performs a focused, specific task. As detected by functional MRI (fMRI), people suppress this network less and less well as they age, and the dysfunction worsens in those with high amyloid burden. This trend came through in a study by Patrizia Vannini, Brigham and Women’s Hospital, Boston, and colleagues, who challenged 27 young and 41 older adults, all cognitively normal, with a task requiring them to learn the names associated with a series of faces. Consistent with prior research, seniors with amyloid-laden brains suppressed their default network less than did those without brain amyloid, who in turn suppressed to a smaller extent than did young participants. The novelty in the current research is that, after repeated trials of the face-name task, amyloid-free elderly showed reduced suppression for each repetition, while seniors with brain amyloid did not demonstrate this practice effect. Instead, they continued to engage their default network across repeated trials. Throughout the testing, both groups performed similarly on the task itself, Vannini noted in an e-mail to ARF. The findings suggest that amyloid pathology in cognitively intact seniors “is related to disrupted synaptic activity in the networks supporting memory function before any clinical symptoms are evident in these subjects,” she wrote.
Recent functional connectivity MRI (fcMRI) studies by Prashanthi Vemuri, Mayo Clinic, Rochester, Minnesota, and colleagues also seem to bear out the notion that amyloid buildup takes a toll on brain networks early in the disease process. While most fcMRI studies have focused on the default-mode network, showing disrupted connectivity in amyloid-positive seniors (Hedden et al., 2009; Sheline et al., 2010), Vemuri’s team looked at global functional connectivity. They analyzed people with AD or mild cognitive impairment (MCI), and cognitively normal elderly, from the Mayo Clinic Study of Aging. At a day-long pre-ICAD imaging meeting, Vemuri reported that among amyloid-positive research participants, global connectivity is high in controls, lower in people with MCI, and further reduced in the AD group. This suggests that functional connectivity drops off with clinical deterioration, and that fcMRI could serve as an AD marker. Within the control group, though, a measure of global functional connectivity differed between amyloid-positive and amyloid-negative subjects in an unexpected fashion—the former had higher connectivity. This suggests a possible compensatory mechanism early in the disease process that warrants validation through further study, Vemuri wrote in an e-mail to ARF.
Similarly, a study by Gaël Chetelat, who moved from Austin Health, Melbourne, Australia, to Inserm-EPHE-University of Caen, France, stratified control subjects from the Australian Imaging, Biomarkers and Lifestyle (AIBL) study into high and low brain amyloid groups, and saw a not-so-straightforward pattern. Compared to amyloid-negative normal participants, those with high amyloid burden had better episodic memory and greater temporal gray matter volume at baseline. The researchers took this to mean that people with more temporal gray matter could tolerate more Aβ. By contrast, within the group of participants with subjective cognitive impairment (i.e., those who appeared normal on cognitive tests but themselves thought they had problems with their memory), the amyloid-positive subset had more gray matter atrophy.
Together, these and other studies are making a case that amyloid deposits are worrisome despite their presence in some 15 to 40 percent of seniors who do fine on standard memory tests (Aizenstein et al., 2008; see also ARF related conference story). But the science is not clear-cut at this point. “If you have symptomatic MCI (mild cognitive impairment) and you have amyloid in your brain, that's not a good thing,” said William Klunk, University of Pittsburgh, Pennsylvania, in a phone interview. “Your chances of converting to dementia over the next two to three years are very high. We can say that with a high degree of certainty in MCI.” However, for those who have yet to develop memory problems, the significance of a high amyloid load is murkier. “Some studies have shown slightly smaller brain size in these folks, others a little less metabolism,” Klunk said of amyloid-positive normals. “But these are very slight changes, so while we think these are the people who will progress to MCI and to AD, we don't have that data in hand yet.” The evidence points to amyloid accumulating in the brain more than a decade ahead of symptoms, yet live brain imaging techniques have only recently emerged on the research scene. Swedish investigators performed the first PIB scan in 2002, and it wasn’t until several years later that others began doing the same. Thus, as for whether amyloid deposition reliably predicts dementia years down the road, “we really haven’t studied enough normal people throughout the natural history of the disease to have time to know,” Klunk told ARF.
The situation becomes more complex when one considers the multitude of factors that could influence rate of decline in normals. “There are good things—for example, having more gray matter or higher network density—that mitigate decline,” Klunk said, noting that one of the PIB-positive normal controls in the 2002 Swedish cohort was still cognitively intact as of early 2010, eight years after his initial scan. “But there are also bad things—such as strokes, vascular disease, and head injury—that speed your progress through the asymptomatic stage into the symptomatic stage.”
Furthermore, brain amyloid isn’t the only predictive biomarker. Others can also signal impending drops in cognition, it seems, and figuring out how certain markers work together or complement each other is a key challenge in the field (see ARF Live Discussion on biomarkers). At the ICAD pre-meeting on imaging, Adam Fleisher of Banner Alzheimer’s Institute, Phoenix, Arizona, presented new data using a statistical algorithm to measure how well PIB-PET and FDG-PET data correlate with a known AD predictor, i.e., ApoE4 status, in cognitively normal elderly. Converting complex neuroimaging datasets into numerical scores, the scientists saw that amyloid load and glucose metabolism individually tracked with ApoE4 gene dose, and that using both imaging modalities further improved the correlation. “By combining the two using statistics, we could distill the patterns of amyloid imaging and hypometabolism into a single score, and that score was predictive for how much genetic risk of AD the person had,” Fleisher told ARF. The recent analysis also showed that amyloid deposition and glucose hypometabolism occur in different parts of the brain, with the exception of the hard-hit precuneus, where PIB and FDG signals converged. “In these cognitively normal people, areas of brain dysfunction are not the same areas in which amyloid was being deposited,” Fleisher said. “This suggests that maybe we don't have a clear understanding of how amyloid plaque deposition affects brain networks prior to dementia. It may be more indirect than we realize.”
ICAD featured other efforts to flesh out how well various AD biomarkers forecast fading cognition in normal seniors. For their part, Susan Landau of the University of California, Berkeley, and colleagues examined baseline brain glucose metabolism (FDG-PET) and hippocampal volume (structural MRI), as well as ApoE status, in 92 ADNI normals. “We classified the ADNI normals as 'abnormal' or 'normal' on each of those three variables, and looked at whether that status at baseline predicted change on ADAS-cog over about a 2.5-year period,” Landau told ARF in a post-meeting phone interview. For ApoE status, “abnormal” simply meant having an E4 allele, whereas non-carriers were designated “normal.” For the two imaging measures, the researchers determined cut points, i.e., values that distinguish the “abnormal” and “normal” categories, from external cohorts, and then applied these to the ADNI normals.
Of the three biomarkers tested, only hippocampal volume ended up predicting ADAS-cog change if the normals were considered as a group. However, if stratified into high- and low-performing subgroups according to baseline scores on the Auditory Verbal Learning Test, which measures long-term memory, the findings came out entirely different. Among high performers, none of the biomarkers predicted cognitive change, whereas in the low-performing group, all three had predictive power. In particular, low performers who had abnormal hippocampal volumes and at least one E4 allele saw their ADAS-cog scores plummet 2.3 points per year more than did low performers with normal hippocampal volume and ApoE status. “The bottom line is that baseline biomarker status was more useful at predicting cognitive change in the low-performing subpopulation of normals, which may include individuals with early, subclinical pathology,” Landau said.
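The stratified analysis Landau described can be sketched roughly as follows. The column names, placeholder cut points, median split on baseline AVLT, and regression setup are stand-ins chosen for illustration only; the study derived its cut points from external cohorts and used its own statistical approach.

```python
# Hypothetical sketch of a stratified biomarker analysis: classify normals as
# abnormal/normal on three baseline markers, split by baseline memory
# performance, and ask whether marker status predicts ADAS-cog change.
# All column names and cut points are invented for illustration.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("adni_normals_longitudinal.csv")    # hypothetical input table

HIPPO_CUTPOINT = 6700.0   # mm^3, placeholder value
FDG_CUTPOINT = 1.21       # placeholder composite value

df["hippo_abnormal"] = df["hippocampal_volume"] < HIPPO_CUTPOINT
df["fdg_abnormal"] = df["fdg_composite"] < FDG_CUTPOINT
df["apoe_abnormal"] = df["apoe4_alleles"] >= 1

# Median split on baseline AVLT as a stand-in for the study's high/low grouping.
df["low_performer"] = df["avlt_baseline"] < df["avlt_baseline"].median()

for is_low, sub in df.groupby("low_performer"):
    label = "low performers" if is_low else "high performers"
    # Does baseline marker status predict annualized ADAS-cog change in this subgroup?
    model = smf.ols(
        "adascog_change_per_year ~ hippo_abnormal + fdg_abnormal + apoe_abnormal",
        data=sub,
    ).fit()
    print(label)
    print(model.params)
```

In the reported results, the pattern of this kind of model would be informative mainly in the low-performing stratum, where all three markers carried predictive power.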
Even without formal cognitive testing, older adults who merely suspect their memory is off could, in fact, be on to something. Analyzing more than a thousand seniors (ages 60 and up) from the AIBL study at baseline and 18 months later, Jonathan Foster of the Health Department of Western Australia, Cassandra Szoeke of University of Melbourne, and colleagues found that memory complainers and non-complainers did not differ in brain amyloid load by PIB-PET. However, the complainers had worse performance on category fluency and Boston naming tests when they were initially tested, Szoeke reported at ICAD. Those reporting memory loss at baseline also seemed more likely to decline to MCI over time, though the number of converts was too small to be statistically significant (3.7 percent of memory complainers converted to MCI, compared with 1.2 percent of non-complainers), she said.
Though far from cut and dried, the evidence to date does seem to converge on the idea that biomarkers can help identify not only which cognitively normal elderly people are most likely to decline, but also which measures may best track this deterioration as the disease progresses. “It could take quite a while to understand the full natural history of the cognitive normal stage of AD pathology,” Klunk said. “But we have to understand this asymptomatic stage so we're in a position to understand the effects of treatments when we do have attractive therapies for prevention trials. We have to know what the natural history is going in, so we'll be that much more able to understand whether a drug is having an effect.”—Esther Landhuis.