1. Study suggests body clock disruptions occur years before memory loss in Alzheimer’s

    February 15, 2018 by Ashley

    From the Washington University in St. Louis press release:

    People with Alzheimer’s disease are known to have disturbances in their internal body clocks that affect the sleep/wake cycle and may increase risk of developing the disorder. Now, new research at Washington University School of Medicine in St. Louis indicates that such circadian rhythm disruptions also occur much earlier in people whose memories are intact but whose brain scans show early, preclinical evidence of Alzheimer’s.

    The findings could potentially help doctors identify people at risk of Alzheimer’s earlier than is currently possible. That’s important because Alzheimer’s damage can take root in the brain 15 to 20 years before clinical symptoms appear.

    The research is published Jan. 29 in the journal JAMA Neurology.

    “It wasn’t that the people in the study were sleep-deprived,” said first author Erik S. Musiek, MD, PhD, an assistant professor of neurology. “But their sleep tended to be fragmented. Sleeping for eight hours at night is very different from getting eight hours of sleep in one-hour increments during daytime naps.”

    The researchers also conducted a separate study in mice, to be published Jan. 30 in The Journal of Experimental Medicine, showing that similar circadian disruptions accelerate the development of amyloid plaques in the brain, which are linked to Alzheimer’s.

    Previous studies at Washington University, conducted in people and in animals, have found that amyloid levels fluctuate in predictable ways during the day and night. Amyloid levels decrease during sleep, and several studies have shown that levels rise when sleep is disrupted or when people don’t get enough deep sleep, according to research by senior author Yo-El Ju, MD.

    “In this new study, we found that people with preclinical Alzheimer’s disease had more fragmentation in their circadian activity patterns, with more periods of inactivity or sleep during the day and more periods of activity at night,” said Ju, an assistant professor of neurology.

    The researchers tracked circadian rhythms in 189 cognitively normal older adults with an average age of 66. Some had positron emission tomography (PET) scans to look for Alzheimer’s-related amyloid plaques in their brains. Others had their cerebrospinal fluid tested for Alzheimer’s-related proteins. And some had both scans and spinal fluid testing.

    Of the participants, 139 had no evidence of the amyloid protein that signifies preclinical Alzheimer’s. Most had normal sleep/wake cycles, although several had circadian disruptions that were linked to advanced age, sleep apnea or other causes.

    But the other 50 subjects — who either had abnormal brain scans or abnormal cerebrospinal fluid — all experienced significant disruptions in their internal body clocks, as determined by how much rest they got at night and how active they were during the day. Disruptions in the sleep/wake cycle remained even after the researchers statistically controlled for sleep apnea, age and other factors.

    The study subjects, from Washington University’s Knight Alzheimer’s Disease Research Center, all wore devices similar to exercise trackers for one to two weeks. Each also completed a detailed sleep diary every morning.

    By tracking activity during the day and night, the researchers could tell how scattered rest and activity were throughout 24-hour periods. Subjects who experienced short spurts of activity and rest during the day and night were more likely to have evidence of amyloid buildup in their brains.
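
    The press release doesn’t spell out how this scatter was quantified. Actigraphy studies of this kind often summarize rest/activity fragmentation with a nonparametric measure such as intradaily variability (IV), which compares hour-to-hour changes in activity against the overall variance. The sketch below is a minimal illustration of that idea on simulated data, not the authors’ actual analysis pipeline.

    ```python
    import numpy as np

    def intradaily_variability(activity):
        """Intradaily variability (IV) of an actigraphy series.

        `activity` is a 1-D array of activity counts in regular bins
        (e.g., hourly means over several days of wear). IV is the mean
        squared hour-to-hour change divided by the overall variance; it
        is low for one consolidated daily rest/activity cycle and rises
        when rest and activity are fragmented into short bouts.
        """
        x = np.asarray(activity, dtype=float)
        n = x.size
        num = n * np.sum(np.diff(x) ** 2)            # hour-to-hour changes
        den = (n - 1) * np.sum((x - x.mean()) ** 2)  # variance around the mean
        return num / den

    # Toy comparison: a consolidated sleep/wake cycle vs. a fragmented one.
    hours = np.arange(24 * 7)  # one week of hourly bins
    consolidated = (np.sin(2 * np.pi * hours / 24) > 0).astype(float)
    rng = np.random.default_rng(0)
    fragmented = (rng.random(hours.size) > 0.5).astype(float)
    print(intradaily_variability(consolidated))  # low IV (~0.3)
    print(intradaily_variability(fragmented))    # higher IV (~2.0)
    ```

    A consolidated day/night cycle yields a low IV, while short spurts of activity and rest scattered across the 24-hour day push IV up — the pattern the study links to amyloid buildup.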

    These findings in people reinforce the mouse research from Musiek’s lab. In that study, working with first author Geraldine J. Kress, PhD, an assistant professor of neurology, Musiek studied circadian rhythm disruptions in a mouse model of Alzheimer’s. To disrupt the animals’ circadian rhythms, his team disabled genes that control the circadian clock.

    “Over two months, mice with disrupted circadian rhythms developed considerably more amyloid plaques than mice with normal rhythms,” Musiek said. “The mice also had changes in the normal, daily rhythms of amyloid protein in the brain. It’s the first data demonstrating that the disruption of circadian rhythms could be accelerating the deposition of plaques.”

    Both Musiek and Ju said it’s too early to answer the chicken-and-egg question of whether disrupted circadian rhythms put people at risk for Alzheimer’s disease or whether Alzheimer’s-related changes in the brain disrupt circadian rhythms.

    “At the very least, these disruptions in circadian rhythms may serve as a biomarker for preclinical disease,” said Ju. “We want to bring back these subjects in the future to learn more about whether their sleep and circadian rhythm problems lead to increased Alzheimer’s risk or whether the Alzheimer’s disease brain changes cause sleep/wake cycle and circadian problems.”

    Reference: Kress GJ, Liao F, Dimitry J, Cedeno MR, Fitzgerald GA, Holtzman DM, Musiek ES. Regulation of amyloid-beta dynamics and pathology by the circadian clock. The Journal of Experimental Medicine, Jan. 30, 2018.


  2. Study suggests cognitive training helps regain a younger-working brain

    February 8, 2018 by Ashley

    From the Center for BrainHealth press release:

    Relentless cognitive decline is worrisome, and it is widely thought to be an unavoidable part of normal aging. Researchers at the Center for BrainHealth at The University of Texas at Dallas, however, say their research could provide new hope for extending brain function as we age.

    In a randomized clinical study involving adults ages 56 to 71, recently published in Neurobiology of Aging, researchers found that after cognitive training, participants’ brains were more energy efficient, meaning their brains did not have to work as hard to perform a task.

    Dr. Michael Motes, senior research scientist at the Center for BrainHealth and one of the lead authors of the study, said, “Finding a nonpharmacological intervention that can help the aging brain to perform like a younger brain is a welcome finding that potentially advances understanding of ways to enhance brain health and longevity. It is thrilling for me as a cognitive neuroscientist, who has previously studied age-related cognitive decline, to find that cognitive training has the potential to strengthen the aging brain to function more like a younger brain.”

    To investigate changes in brain efficiency, the research team studied neural activity while participants performed a task. For the study, 57 cognitively normal older adults were randomly assigned to a cognitive training group, a wait-listed control group, or a physical exercise control group. The cognitive training used the Strategic Memory Advanced Reasoning Training (SMART) program developed at the Center for BrainHealth.

    Cognitive training strategies included how to focus on the most relevant information and filter out the less relevant; ways to continually synthesize information encountered in daily life to encourage deeper thinking; and how to inspire innovative thinking through generating diverse interpretations, solutions and perspectives. Because aerobic exercise has been shown to lead to improvements in processing speed and functional changes within the frontal and other brain regions, it was included as one of the study groups.

    The cognitive training was conducted over the course of 12 weeks. Participants in the active control physical exercise program exceeded physical activity guidelines of 150 minutes per week for the 12 weeks.

    Using functional magnetic resonance imaging (fMRI), an imaging technique that measures brain activity, researchers examined all three groups at the beginning (baseline), middle, and end of the study while participants performed computer-based speed tasks in the scanner.

    The fMRI results provided evidence that cognitive training improved speed-related neural activity. While all groups showed faster reaction times across sessions, the cognitive training group showed a significant increase in the association between reaction time and frontal lobe activity. After training, faster reaction times were associated with lower frontal lobe activity, which is consistent with the more energy-efficient neural activity found in younger adults.

    In contrast to the cognitive training group, the wait-listed and physical exercise groups showed significant decreases across sessions in the association between reaction time and frontal lobe activation.

    “This discovery of neural efficiency profiles found in the SMART-trained older adults is promising,” said Dr. Sandra Bond Chapman, one of the lead authors, Center for BrainHealth founder and chief director. “If replicated, this work paves the way for larger clinical trials to test the ability to harness the potential of the aging mind and its ability to excel — by working like a younger brain with all the rich knowledge and expertise accrued over time. To counteract the pattern of age-related losses and even enhance the brain’s inner workings by ‘thinking’ in smarter ways is an achievable and highly desirable goal.”


  3. Study looks at personality changes during transition to developing mild cognitive impairment

    February 7, 2018 by Ashley

    From the American Geriatrics Society press release:

    A key feature of Alzheimer’s disease is memory loss and the loss of one’s ability to think and make decisions (also called “cognitive ability”). Those changes can begin slowly, during a phase called “mild cognitive impairment” (or MCI). A variety of diseases can cause MCI, but the most common is Alzheimer’s disease.

    Not all people who have MCI develop Alzheimer’s disease — but if memory loss is a person’s key MCI symptom, and if that person’s genes (DNA) suggest they may be likely to develop Alzheimer’s disease, the risk for the condition can be as high as 90 percent.

    Personality changes and behavior problems that come with Alzheimer’s disease are as troubling as memory loss and other mental difficulties for caregivers and those living with the condition. Mayo Clinic researchers wondered if personality changes that begin early, when MCI memory loss becomes noticeable, might help predict Alzheimer’s disease at its earliest stages. The researchers created a study to test their theory and published their findings in the Journal of the American Geriatrics Society.

    Researchers recruited cognitively normal participants 21 years old and older who were genetically more likely to develop Alzheimer’s disease. The recruitment period began in January 1994 and ended in December 2016. Researchers also recruited people without a genetic likelihood of developing Alzheimer’s disease to serve as a control group. All participants took several tests, including medical and neurological (or brain) exams. They were also screened for depression and assessed for cognitive and physical function.

    After analyzing results, the researchers concluded that personality changes, which can lead to changes in behavior, occur early on during the development of Alzheimer’s disease. The behavioral changes, however, may be barely noticeable, and can include mood swings, depression, and anxiety. They suggested that further research might be needed to learn whether diagnosing these early personality changes could help experts develop earlier, safer, and more effective treatments — or even prevention options — for the more severe types of behavior challenges that affect people with Alzheimer’s disease.


  4. Study suggests curcumin improves memory and mood

    February 3, 2018 by Ashley

    From the UCLA press release:

    Lovers of Indian food, give yourselves a second helping: Daily consumption of a certain form of curcumin — the substance that gives Indian curry its bright color — improved memory and mood in people with mild, age-related memory loss, according to the results of a study conducted by UCLA researchers.

    The research, published online Jan. 19 in the American Journal of Geriatric Psychiatry, examined the effects of an easily absorbed curcumin supplement on memory performance in people without dementia, as well as curcumin’s potential impact on the microscopic plaques and tangles in the brains of people with Alzheimer’s disease.

    Found in turmeric, curcumin has previously been shown to have anti-inflammatory and antioxidant properties in lab studies. It also has been suggested as a possible reason that senior citizens in India, where curcumin is a dietary staple, have a lower prevalence of Alzheimer’s disease and better cognitive performance.

    “Exactly how curcumin exerts its effects is not certain, but it may be due to its ability to reduce brain inflammation, which has been linked to both Alzheimer’s disease and major depression,” said Dr. Gary Small, director of geriatric psychiatry at UCLA’s Longevity Center and of the geriatric psychiatry division at the Semel Institute for Neuroscience and Human Behavior at UCLA, and the study’s first author.

    The double-blind, placebo-controlled study involved 40 adults between the ages of 50 and 90 years who had mild memory complaints. Participants were randomly assigned to receive either a placebo or 90 milligrams of curcumin twice daily for 18 months.

    All 40 subjects received standardized cognitive assessments at the start of the study and at six-month intervals, and monitoring of curcumin levels in their blood at the start of the study and after 18 months. Thirty of the volunteers underwent positron emission tomography, or PET scans, to determine the levels of amyloid and tau in their brains at the start of the study and after 18 months.

    The people who took curcumin experienced significant improvements in their memory and attention abilities, while the subjects who received placebo did not, Small said. In memory tests, the people taking curcumin improved by 28 percent over the 18 months. Those taking curcumin also had mild improvements in mood, and their brain PET scans showed significantly lower amyloid and tau signals in the amygdala and hypothalamus than those of people who took placebos.

    The amygdala and hypothalamus are regions of the brain that control several memory and emotional functions.

    Four people taking curcumin, and two taking placebos, experienced mild side effects such as abdominal pain and nausea.

    The researchers plan to conduct a follow-up study with a larger number of people. That study will include some people with mild depression so the scientists can explore whether curcumin also has antidepressant effects. The larger sample also would allow them to analyze whether curcumin’s memory-enhancing effects vary according to people’s genetic risk for Alzheimer’s, their age or the extent of their cognitive problems.

    “These results suggest that taking this relatively safe form of curcumin could provide meaningful cognitive benefits over the years,” said Small, UCLA’s Parlow-Solomon Professor on Aging.


  5. Anxiety: An early indicator of Alzheimer’s disease?

    January 25, 2018 by Ashley

    From the Brigham and Women’s Hospital press release:

    A new study suggests an association between elevated amyloid beta levels and the worsening of anxiety symptoms. The findings support the hypothesis that neuropsychiatric symptoms could represent the early manifestation of Alzheimer’s disease in older adults.

    Alzheimer’s disease is a neurodegenerative condition that causes the decline of cognitive function and the inability to carry out daily life activities. Past studies have suggested that depression and other neuropsychiatric symptoms may be predictors of AD’s progression during its “preclinical” phase, during which deposits of fibrillar amyloid and pathological tau accumulate in a patient’s brain. This phase can begin more than a decade before a patient’s onset of mild cognitive impairment. Investigators at Brigham and Women’s Hospital examined the association between brain amyloid beta and longitudinal measures of depression and depressive symptoms in cognitively normal older adults. Their findings, published in The American Journal of Psychiatry, suggest that higher levels of amyloid beta may be associated with increasing symptoms of anxiety in these individuals. The results support the theory that neuropsychiatric symptoms could be an early indicator of AD.

    “Rather than just looking at depression as a total score, we looked at specific symptoms such as anxiety. When compared to other symptoms of depression such as sadness or loss of interest, anxiety symptoms increased over time in those with higher amyloid beta levels in the brain,” said first author Nancy Donovan, MD, a geriatric psychiatrist at Brigham and Women’s Hospital. “This suggests that anxiety symptoms could be a manifestation of Alzheimer’s disease prior to the onset of cognitive impairment. If further research substantiates anxiety as an early indicator, it would be important for not only identifying people early on with the disease, but also, treating it and potentially slowing or preventing the disease process early on.” As anxiety is common in older people, rising anxiety symptoms may prove to be most useful as a risk marker in older adults with other genetic, biological or clinical indicators of high AD risk.

    Researchers drew data from the Harvard Aging Brain Study, an observational study of older adult volunteers aimed at defining neurobiological and clinical changes in early Alzheimer’s disease. The participants included 270 community-dwelling, cognitively normal men and women between 62 and 90 years old with no active psychiatric disorders. Participants also underwent baseline imaging scans commonly used in studies of Alzheimer’s disease, as well as annual assessments with the 30-item Geriatric Depression Scale (GDS), an assessment used to detect depression in older adults.

    The team calculated total GDS scores as well as scores for three clusters of depressive symptoms: apathy-anhedonia, dysphoria, and anxiety. These scores were tracked over a span of five years.

    From their research, the team found that higher brain amyloid beta burden was associated with increasing anxiety symptoms over time in cognitively normal older adults. The results suggest that worsening anxious-depressive symptoms may be an early predictor of elevated amyloid beta levels — and, in turn, of AD — and provide support for the hypothesis that emerging neuropsychiatric symptoms represent an early manifestation of preclinical Alzheimer’s disease.

    Donovan notes further longitudinal follow-up is needed to determine whether these escalating depressive symptoms give rise to clinical depression and dementia stages of Alzheimer’s disease over time.


  6. Can training improve memory, thinking abilities in older adults with cognitive impairment?

    January 23, 2018 by Ashley

    From the American Geriatrics Society press release:

    Cognition is the ability to think and make decisions, and maintaining it is crucial to older adults’ well-being and vitality. Medication-free treatments that preserve cognitive health as we age are attracting the attention of medical experts.

    Mild cognitive impairment (MCI) is a condition that affects people who are in the early stages of dementia or Alzheimer’s disease. People with MCI may have mild memory loss or other difficulties completing tasks that involve cognitive abilities. MCI may eventually develop into dementia or Alzheimer’s disease. Depression and anxiety also can accompany MCI. Having these conditions can increase the risk of mental decline as people age.

    A new, first-of-its-kind study was published in the Journal of the American Geriatrics Society by scientists from research centers in Montreal and Quebec City, Canada. They designed a study to learn whether cognitive training, a medication-free treatment, could improve MCI. Studies show that activities that stimulate your brain, such as cognitive training, can protect against a decline in your mental abilities. Even older adults who have MCI can still learn and use new mental skills.

    For their study, researchers recruited 145 older adults, around age 72, from Canadian memory clinics. The participants had been diagnosed with MCI and were assigned to one of three groups. Each group included four or five participants and met for eight weekly 120-minute sessions.

    The three groups were:

    • Cognitive training group. Members of this group participated in the MEMO program (MEMO stands for a French phrase that translates to “training method for optimal memory”). They received special training to improve their memory and attention span.
    • Psycho-social group. Participants in this group were encouraged to improve their general well-being. They learned to focus on the positive aspects of their lives and find ways to increase positive situations.
    • Control group. Participants had no contact with researchers and didn’t follow a program.

    Of the 145 participants, 128 completed the training period; six months later, 104 had completed all the sessions they were assigned.

    People in the MEMO group increased their memory scores by 35 to 40 percent, said Sylvie Belleville, PhD, a senior author of the study. “Most importantly, they maintained their scores over a six-month period.”

    What’s more, the improvement was largest for older adults on measures of “delayed recall”: memory for words tested 10 minutes after people studied them. Because trouble with delayed recall is one of the earliest signs of Alzheimer’s disease, this was a key finding.

    Those who participated in the MEMO group said they used the training they learned in their daily lives. The training gave them different ways to remember things. For example, they learned to use visual images to remember the names of new people, and to use associations to remember shopping lists. These lessons helped them maintain their memory improvements after the study ended.

    The people in the psycho-social group and the control group didn’t experience memory benefits or improvement in their mood.


  7. Try exercise to improve memory and thinking, new guideline urges

    January 14, 2018 by Ashley

    From the Mayo Clinic press release:

    If you have mild cognitive impairment, don’t be surprised if your health care provider prescribes exercise rather than medication. A new guideline for medical practitioners says they should recommend twice-weekly exercise to people with mild cognitive impairment to improve memory and thinking.

    The recommendation is part of an updated guideline for mild cognitive impairment published in the Dec. 27 online issue of Neurology, the medical journal of the American Academy of Neurology.

    “Regular physical exercise has long been shown to have heart health benefits, and now we can say exercise also may help improve memory for people with mild cognitive impairment,” says Ronald Petersen, M.D., Ph.D., lead author, director of the Alzheimer’s Disease Research Center, Mayo Clinic, and the Mayo Clinic Study of Aging. “What’s good for your heart can be good for your brain.” Dr. Petersen is the Cora Kanow Professor of Alzheimer’s Disease Research.

    Mild cognitive impairment is an intermediate stage between the expected cognitive decline of normal aging and the more serious decline of dementia. Symptoms can involve problems with memory, language, thinking and judgment that are greater than normal age-related changes.

    Generally, these changes aren’t severe enough to significantly interfere with day-to-day life and usual activities. However, mild cognitive impairment may increase the risk of later progressing to dementia caused by Alzheimer’s disease or other neurological conditions. But some people with mild cognitive impairment never get worse, and a few eventually get better.

    The academy’s guideline authors developed the updated recommendations on mild cognitive impairment after reviewing all available studies. Six-month studies showed twice-weekly workouts may help people with mild cognitive impairment as part of an overall approach to managing their symptoms.

    Dr. Petersen encourages people to do aerobic exercise: Walk briskly, jog, whatever you like to do, for 150 minutes a week — 30 minutes five times a week or 50 minutes three times a week. The level of exertion should be enough to work up a bit of a sweat but doesn’t need to be so rigorous that you can’t hold a conversation. “Exercising might slow down the rate at which you would progress from mild cognitive impairment to dementia,” he says.

    Another guideline update says clinicians may recommend cognitive training for people with mild cognitive impairment. Cognitive training uses repetitive memory and reasoning exercises that may be computer-assisted or done in person individually or in small groups. There is weak evidence that cognitive training may improve measures of cognitive function, the guideline notes.

    The guideline did not recommend dietary changes or medications. There are no drugs for mild cognitive impairment approved by the U.S. Food and Drug Administration.

    More than 6 percent of people in their 60s have mild cognitive impairment across the globe, and the condition becomes more common with age, according to the American Academy of Neurology. More than 37 percent of people 85 and older have it.

    With such prevalence, finding lifestyle factors that may slow down the rate of cognitive impairment can make a big difference to individuals and society, Dr. Petersen notes.

    “We need not look at aging as a passive process; we can do something about the course of our aging,” he says. “So if I’m destined to become cognitively impaired at age 72, I can exercise and push that back to 75 or 78. That’s a big deal.”

    The guideline, endorsed by the Alzheimer’s Association, updates a 2001 academy recommendation on mild cognitive impairment. Dr. Petersen was involved in the development of the first clinical trial for mild cognitive impairment and continues as a worldwide leader in researching this stage of the disease, when symptoms could possibly be stopped or reversed.



  8. Blueberry vinegar improves memory in mice with amnesia

    January 9, 2018 by Ashley

    From the American Chemical Society press release:

    Dementia affects millions of people worldwide, robbing them of their ability to think, remember and live as they once did. In the search for new ways to fight cognitive decline, scientists report in ACS’ Journal of Agricultural and Food Chemistry that blueberry vinegar might offer some help. They found that the fermented product could restore cognitive function in mice.

    Recent studies have shown that the brains of people with Alzheimer’s disease, the most common form of dementia, have lower levels of the signaling compound acetylcholine and its receptors. Research has also demonstrated that blocking acetylcholine receptors disrupts learning and memory. Drugs to stop the breakdown of acetylcholine have been developed to fight dementia, but they often don’t last long in the body and can be toxic to the liver. Natural extracts could be a safer treatment option, and some animal studies suggest that these extracts can improve cognition. Additionally, fermentation can boost the bioactivity of some natural products. So Beong-Ou Lim and colleagues wanted to test whether vinegar made from blueberries, which are packed with a wide range of active compounds, might help prevent cognitive decline.

    To carry out their experiment, the researchers administered blueberry vinegar to mice with induced amnesia. Measurements of molecules in their brains showed that the vinegar reduced the breakdown of acetylcholine and boosted levels of brain-derived neurotrophic factor, a protein associated with maintaining and creating healthy neurons. To test how the treatment affected cognition, the researchers analyzed the animals’ performance in mazes and an avoidance test, in which the mice would receive a low-intensity shock in one of two chambers. The treated rodents showed improved performance in both of these tests, suggesting that the fermented product improved short-term memory. Thus, although further testing is needed, the researchers say that blueberry vinegar could potentially be a promising food to help treat amnesia and cognitive decline related to aging.


  9. Study suggests quality of contact with grandparents is key to youths’ views of ageism

    January 7, 2018 by Ashley

    From the Society for Research in Child Development press release:

    Ageism — stereotypes that lead to prejudice and discrimination against older people — occurs frequently in young adults and can be seen in children as young as 3. A new study from Belgium sought to identify the factors underlying this form of discrimination. It found that ageist stereotypes in children and adolescents generally decrease around ages 10 to 12, and that young people who say they have very good contact with their grandparents have the lowest levels of ageism.

    The study, conducted by researchers at the University of Liege in Belgium, appears in the journal Child Development.

    “The most important factor associated with ageist stereotypes was poor quality of contact with grandparents,” explains Allison Flamion, a PhD student in psychology at the University of Liege, who led the research team. “We asked children to describe how they felt about seeing their grandparents. Those who felt unhappy were designated as having poor quality of contact. When it came to ageist views, we found that quality of contact mattered much more than frequency.”

    To assess aspects of ageism, the researchers studied 1,151 children and adolescents ages 7 to 16 in the French-speaking part of Belgium; the youths were primarily White, from urban and rural areas, and from a range of socioeconomic statuses. In questionnaires, the researchers asked the youths their thoughts on getting old and about the elderly. They also collected information about the health of the youths’ grandparents, how often the two generations met, and how the young people felt about their relationships with their grandparents.

    In general, views on the elderly expressed by the children and adolescents were neutral or positive. Girls had slightly more positive views than boys; girls also tended to view their own aging more favorably, the researchers note.

    Ageist stereotypes fluctuated with the ages of the youths studied, with 7- to 9-year-olds expressing the most prejudice and 10- to 12-year-olds expressing the least, the study found. This finding mirrors other forms of discrimination (e.g., those related to ethnicity or gender) and is in line with cognitive-developmental theories: For example, acquiring perspective-taking skills around age 10 reduces previous stereotypes. With ageism, prejudice seemed to reappear when the participants in this study reached their teen years: 13- to 16-year-olds also had high levels of ageism.

    Grandparents’ health was also a factor in youths’ views on ageism: Young people with grandparents in poor health were more likely to hold ageist views than youths with grandparents in better health.

    The most important factor influencing youths’ views of the elderly was the quality of their contact with their grandparents. The study characterized youths’ contact as good or very good when they said they felt happy or very happy (respectively) when they saw and spent time with their grandparents. Those who described their contact with grandparents as good or very good had more favorable feelings toward the elderly than those who described the contact less positively. Furthermore, the benefit of meaningful contact held for both children with the lowest level of ageism and those with the highest level, and boys seemed to benefit more than girls from high-quality contact.

    Frequency of contact, while mattering considerably less, also played a role: 10- to 12-year-olds who saw their grandparents at least once a week had the most favorable views toward the elderly, likely because of the multiplying effect of frequency with quality, the researchers suggest.

    “For many children, grandparents are their first and most frequent contact with older adults,” notes Stephane Adam, professor of psychology at the University of Liege, who coauthored the study. “Our findings point to the potential of grandparents to be part of intergenerational programs designed to prevent ageism. Next, we hope to explore what makes contacts with grandparents more rewarding for their grandchildren as well as the effects on children of living with or caring for their grandparents.”


  10. Study suggests mutations in neurons accumulate with age; may explain normal cognitive decline and neurodegeneration

    January 2, 2018 by Ashley

    From the Boston Children’s Hospital press release:

    Scientists have wondered whether somatic (non-inherited) mutations play a role in aging and brain degeneration, but until recently there was no good technology to test this idea. A study published online today in Science, led by researchers from Boston Children’s Hospital and Harvard Medical School, used whole-genome sequencing of individual neurons and found strong evidence that brain mutations accumulate as we age. They also found that mutations accumulate at a higher rate in people with genetic premature aging disorders causing early brain degeneration.

    “It’s been an age-old question as to whether DNA mutations can accumulate in neurons — which usually don’t divide — and whether they are responsible for the loss of function that the brain undergoes as we get older,” says Christopher A. Walsh, MD, PhD, chief of the Division of Genetics and Genomics at Boston Children’s and co-senior author on the paper. “It hasn’t been possible to answer this question before, because we couldn’t sequence the genome of a single cell, and each mutation accumulated is unique to each cell.”

    Testing neurons one by one

    The research team tested DNA from 161 single neurons, taken from postmortem samples from the NIH NeuroBioBank. The samples came from 15 neurologically normal people of different ages (4 months to 82 years) and nine people with one of two accelerated-aging, early-onset neurodegenerative disorders: Cockayne syndrome and xeroderma pigmentosum.

    Using the latest experimental and data analysis techniques, the team was able to detect mutations as small as single-letter changes in each neuron’s genetic code. Each cell had to have its genome amplified — by generating a multitude of copies — before its DNA sequence could be determined, and a large amount of data had to be analyzed.

    “Because many experimental artifacts arise during the single-cell experiments, a new computational method that can distinguish true mutations from the experimental noise was critical to the success of the project,” says Peter J. Park, PhD, of Harvard Medical School’s Department of Biomedical Informatics (DBMI), the paper’s other co-senior author.

    The neurons tested came from two areas of the brain implicated in age-related cognitive decline: the prefrontal cortex (the part of the brain most highly developed in humans) and the dentate gyrus of the hippocampus (a focal point in age-related degenerative conditions like Alzheimer’s).

    In neurons from neurologically normal people, the number of genetic mutations increased with age in both brain areas. However, mutations accumulated at a higher rate in the dentate gyrus. The researchers think this may be because the neurons have the ability to divide, unlike their counterparts in the prefrontal cortex.

    In neurons from people with Cockayne syndrome and xeroderma pigmentosum, there was an increase in mutations in the prefrontal cortex over time — more than two-fold compared with the normal rate. Additionally, with help from collaborators at WuXi NextCODE, the researchers found that the portions of the genome neurons use the most accumulated mutations at the highest rate.

    The aging genome

    The researchers coined the term “genosenium” — combining the concepts of genome and senescence/senility — to capture the idea of gradual and inevitable accumulation of mutations contributing to brain aging.

    The mutations themselves fell into three categories. “We were able to take all the mutations we found and use mathematical techniques to deconstruct them into different types of DNA changes,” says Michael Lodato, PhD, one of six co-first authors on the paper. “It’s like hearing an orchestra and teasing out the different instruments.”
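
    The release doesn’t name the mathematical technique, but decomposing a catalog of mutations into recurring “types” is commonly done with non-negative matrix factorization (NMF), the standard tool in mutational-signature analysis. The sketch below illustrates that general approach on simulated counts; it is a hypothetical example, not the authors’ actual analysis.

    ```python
    import numpy as np
    from sklearn.decomposition import NMF

    # Toy mutation catalog: rows are single neurons, columns are mutation
    # categories (e.g., the 96 trinucleotide substitution contexts).
    # We simulate 30 neurons as mixtures of 3 hidden signatures.
    rng = np.random.default_rng(42)
    true_signatures = rng.random((3, 96))                        # hypothetical signatures
    exposures = rng.random((30, 3)) * np.array([5.0, 2.0, 1.0])  # per-neuron loads
    counts = rng.poisson(exposures @ true_signatures)            # observed mutation counts

    # Factor counts ~= W @ H: W holds per-neuron exposures, H the signatures.
    model = NMF(n_components=3, init="nndsvda", max_iter=500, random_state=0)
    W = model.fit_transform(counts)  # (30, 3): how much each neuron uses each signature
    H = model.components_            # (3, 96): the recovered mutation-type profiles
    print(W.shape, H.shape)
    ```

    In the orchestra metaphor, each row of H is an instrument (a recurring pattern of DNA changes), and each row of W records how loudly a given neuron plays it.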