1. Dementia: BACE inhibitor improves brain function

    August 18, 2017 by Ashley

    From the Technical University of Munich (TUM) press release:

    The protein amyloid beta is believed to be the major cause of Alzheimer’s disease. Substances that reduce the production of amyloid beta, such as BACE inhibitors, are therefore promising candidates for new drug treatments. A team at the Technical University of Munich (TUM) has recently demonstrated that one such BACE inhibitor reduces the amount of amyloid beta in the brain. By doing so, it can restore the normal function of nerve cells and significantly improve memory performance.

    Around 50 million people worldwide suffer from dementia. To date, no effective drug is available that is able to halt or cure the disease, and its exact causes have yet to be definitively explained. What is known is that amyloid beta accumulates to a greater extent in the brains of Alzheimer’s patients than in those of healthy people; as it accumulates, the protein clumps together and damages nerve cells.

    Affected cells can become hyperactive. They then constantly send false signals to neighboring cells. In addition, certain brain waves such as slow oscillations spin out of control. These waves play a key role in the formation of memories by transferring learned information into long-term memory.

    Brain functions restored in mice

    “A successful treatment must take effect as early in the course of the disease as possible. In our experiments, we have therefore blocked the enzyme beta-secretase (BACE), which produces amyloid beta,” explains Dr Marc Aurel Busche, young investigator group leader at the Institute for Neuroscience of the TUM and psychiatrist in the Department of Psychiatry and Psychotherapy of the TUM university hospital rechts der Isar.

    The researchers tested a substance that inhibits beta secretase in a mouse model of Alzheimer’s. The mice produce large amounts of amyloid beta, which, as in humans, leads to the formation of amyloid beta plaques in the brain and causes memory loss. During the study, the mice were given the inhibitor in their food for up to eight weeks, after which they were examined. For this purpose, the researchers used a special imaging technique known as two-photon microscopy, which allowed them to observe individual nerve cells in the brain.

    As expected, the mice had less amyloid beta in their brain after this period, since its production was inhibited. However, the effect of the substance was much more far-reaching: the animals’ brain functions actually normalized. There were fewer hyperactive nerve cells, and the slow-wave brain patterns once again resembled those in healthy mice. A key finding for the scientists was the observation that the animals’ memory also improved. The mice were able to locate a hidden platform in a water-filled maze as quickly as their healthy counterparts.

    Clinical trial planned

    “What really impressed and amazed us was the reversibility of the symptoms. Before the treatment, the mice had a marked clinical picture with amyloid beta plaques in their brain. Nevertheless, the substance was able to restore important brain functions and abilities,” explains Aylin Keskin, lead author of the publication. Moreover, the researchers’ study showed yet another benefit: “We were also able to demonstrate which neural deficits really are caused by amyloid beta. That was not fully understood with regard to hyperactive nerve cells, for example,” Keskin says.

    The scientists’ findings will soon find their way into clinical practice: a large-scale clinical trial with around 1,000 participants is planned to test a slightly modified form of the BACE inhibitor. “Needless to say, we very much hope that the promising discoveries in the animal model will translate to humans,” Busche says.


  2. Study examines how branding and prices influence children’s decisions about snack purchases

    August 15, 2017 by Ashley

    From the Tufts University, Health Sciences Campus press release:

    What determines how children decide to spend their cash on snacks? A new study shows that children’s experience with money and their liking of brands influenced purchase decisions — and that for some children, higher prices for unhealthy snacks might motivate healthier choices. The study is published in the journal Appetite.

    Besides parents, many actors, such as schools, governments and food manufacturers, influence children’s consumption of energy-dense, nutrient-poor snack foods through a multitude of avenues: parental guidance and restriction of the foods and snacks available to children at home, school programs (school lunches and vending machines), governmental mandates (taxation, nutrition education campaigns), and marketing messages.

    Researchers co-led by Professors Monika Hartmann from the University of Bonn and Sean Cash from the Friedman School of Nutrition Science and Policy at Tufts University studied the ways that children aged 8 to 11 use their own disposable income, predominantly an allowance provided by a parent, to purchase snack food on their own. In the study, they report that brand awareness was not necessarily aligned with preference for snack foods. Importantly, the extent of children’s experience with money influenced their purchase decisions, suggesting that higher prices for unhealthy snacks might be helpful in motivating at least those children who have experience with money to choose healthier options.

    The research took place in after-school programs in the Boston area with a sample of 116 children ages 8 to 11 years. The study consisted of a survey, two cognitive tests, and a discrete choice experiment (DCE). After completing the cognitive tests, participants were each paid $2.00 as compensation, which they could then spend in the DCE part of the study.

    “We don’t know much about how kids spend their cash on snacks,” said corresponding author Sean Cash, Ph.D., economist and associate professor at the Friedman School of Nutrition Science and Policy at Tufts University. “It’s an understudied area. Currently, many children have their own disposable income, usually an allowance. They’re deciding how to spend that money on snacks, but we know very little about how they make those decisions.” This is especially true with respect to the extent to which children assess typical consumer tradeoffs regarding price and other food attributes.

    Cookies, apple slices, or yogurt

    The researchers presented a series of snack options to the children — cookies, apple slices, and tubes of yogurt. Each child was presented with ten pairs of photographs of snack items that differed by product type, price and brand. Each time, the child could select one of the two products or decide to make no choice.

    The child was told that at the end of the experiment, one choice would be drawn at random from the ten decisions the child had made, and that the child would be obligated to purchase the chosen snack. The child paid the stated price of that choice, which ranged between $0.30 and $0.70, out of the money previously earned in the study and received the designated snack. This feature of the design made each choice more realistic for the kids. One group of snacks was from McDonald’s in order to test the importance of branding on children’s choices.

    Experience with money makes a difference

    Children made choices based primarily on food types (cookie, apple slices, yogurt). In the experiment, children most often chose the chocolate-chip cookies, with apple slices in second place. Other results were surprising.

    “What we found is that brand awareness was not a key factor in purchasing the snacks. What mattered more was whether a child liked or disliked the brand,” said first author Monika Hartmann, Ph.D., who was a visiting scholar at the Friedman School while the study was being conducted. She is based at the Institute of Food and Resource Economics at the University of Bonn.

    The other important result was a distinct dichotomy with respect to how price affected the children’s purchase decision. Children who were used to receiving an allowance from their parents paid attention to the price, whereas those who had little experience handling money generally did not. This result supports research that suggests experience with cash is an important aspect of learning what prices mean.

    “Overall, the literature on children’s price responsiveness and brand awareness is scarce,” the authors wrote in their article. “The former is especially true for younger children (elementary school).” At the same time, they observed that children spend a considerable amount of money on snacks while childhood incidence of chronic dietary-related disease (type-2 diabetes, coronary artery disease, and obesity) is high and increasing around the world.

    The authors point out that this is a small study, regionally biased, and with limited choices within the experiment. They suggest further research to explore the efficacy of using price and presentation (e.g., packaging, branding) as additional tools in the fight against the growing incidence of chronic disease among children. Their work was supported by a grant from the German Research Foundation (DFG) with additional support from the U.S. Department of Agriculture’s National Institute of Food and Agriculture.


  3. High-fat diet in pregnancy can cause mental health problems in offspring

    August 14, 2017 by Ashley

    From the Oregon Health & Science University press release:

    A high-fat diet not only creates health problems for expectant mothers, but new research in an animal model suggests it alters the development of the brain and endocrine system of their offspring and has a long-term impact on offspring behavior. The new study links an unhealthy diet during pregnancy to mental health disorders such as anxiety and depression in children.

    “Given the high level of dietary fat consumption and maternal obesity in developed nations, these findings have important implications for the mental health of future generations,” the researchers report.

    The research was published in the journal Frontiers in Endocrinology.

    The study, led by Elinor Sullivan, Ph.D., an assistant professor in the Division of Neuroscience at Oregon National Primate Research Center at OHSU, tested the effect of a maternal high-fat diet on nonhuman primates, tightly controlling their diet in a way that would be impossible in a human population. The study revealed behavioral changes in the offspring associated with impaired development of the central serotonin system in the brain. Further, it showed that introducing a healthy diet to the offspring at an early age failed to reverse the effect.

    Previous observational studies in people correlated maternal obesity with a range of mental health and neurodevelopmental disorders in children. The new research demonstrates for the first time that a high-fat diet, increasingly common in the developed world, caused long-lasting mental health ramifications for the offspring of non-human primates.

    In the United States, 64 percent of women of reproductive age are overweight and 35 percent are obese. The new study suggests that the U.S. obesity epidemic may be imposing transgenerational effects.

    “It’s not about blaming the mother,” said Sullivan, senior author on the study. “It’s about educating pregnant women about the potential risks of a high-fat diet in pregnancy and empowering them and their families to make healthy choices by providing support. We also need to craft public policies that promote healthy lifestyles and diets.”

    Researchers divided a total of 65 female Japanese macaques into two groups, one given a high-fat diet and the other a control diet during pregnancy. They subsequently measured and compared anxiety-like behavior among 135 offspring and found that both males and females exposed to a high-fat diet during pregnancy exhibited a greater incidence of anxiety compared with those in the control group. The scientists also examined physiological differences between the two groups, finding that exposure to a high-fat diet during gestation and early in development impaired the development of neurons containing serotonin, a neurotransmitter that’s critical in developing brains.

    The new findings suggest that diet is at least as important as genetic predisposition to neurodevelopmental disorders such as anxiety or depression, said an OHSU pediatric psychiatrist who was not involved in the research.

    “I think it’s quite dramatic,” said Joel Nigg, Ph.D., professor of psychiatry, pediatrics, and behavioral neuroscience in the OHSU School of Medicine. “A lot of people are going to be astonished to see that the maternal diet has this big of an effect on the behavior of the offspring. We’ve always looked at the link between obesity and physical diseases like heart disease, but this is really the clearest demonstration that it’s also affecting the brain.”

    Sullivan and research assistant and first author Jacqueline Thompson said they believe the findings provide evidence that mobilizing public resources to provide healthy food and pre- and post-natal care to families of all socioeconomic classes could reduce mental health disorders in future generations.

    “My hope is that increased public awareness about the origins of neuropsychiatric disorders can improve our identification and management of these conditions, both at an individual and societal level,” Thompson said.


  4. Mediterranean-style diets linked to better brain function in older adults

    August 12, 2017 by Ashley

    From the American Geriatrics Society press release:

    Eating foods included in two healthy diets — the Mediterranean or the MIND diet — is linked to a lower risk for memory difficulties in older adults, according to a study published in the Journal of the American Geriatrics Society.

    The Mediterranean diet is rich in fruits, vegetables, whole grains, beans, potatoes, nuts, olive oil and fish. Processed foods, fried and fast foods, snack foods, red meat, poultry and whole-fat dairy foods are infrequently eaten on the Mediterranean diet.

    The MIND diet is a version of the Mediterranean diet that includes 15 types of foods. Ten are considered “brain-healthy”: green leafy vegetables, other vegetables, nuts, berries, beans, whole grains, seafood, poultry, olive oil, and wine. Five are considered unhealthy: red meat, butter and stick margarine, cheese, pastries and sweets, and fried/fast foods.

    Researchers examined information from 5,907 older adults who participated in the Health and Retirement Study. The participants filled out questionnaires about their eating habits. Researchers then measured the participants’ cognitive abilities, focusing mostly on memory and attention skills.

    The researchers compared the diets of participants to their performance on the cognitive tests. They found that older people who ate Mediterranean and MIND-style diets scored significantly better on the cognitive function tests than those who ate less healthy diets. In fact, older people who closely followed a Mediterranean-style diet had a 35% lower risk of scoring poorly on cognitive tests. Even those who followed the diet moderately had a 15% lower risk of doing poorly on cognitive tests. The researchers noted similar results for people who ate MIND-style diets.
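
    To make these relative figures concrete, here is a minimal sketch of how a relative risk reduction translates into an absolute risk. The 10% baseline is a hypothetical value chosen purely for illustration, not a number from the study.

    ```python
    # Illustration only: converting the reported relative risk reductions
    # into absolute risks. The 10% baseline is an assumed value for this
    # example, not a figure reported by the study.
    baseline_risk = 0.10  # hypothetical chance of scoring poorly on the tests

    for label, reduction in [("Mediterranean-style diet", 0.35),
                             ("moderate Mediterranean-style diet", 0.15)]:
        adjusted_risk = baseline_risk * (1 - reduction)
        print(f"{label}: {adjusted_risk:.1%} risk vs. {baseline_risk:.0%} baseline")

    # Mediterranean-style diet: 6.5% risk vs. 10% baseline
    # moderate Mediterranean-style diet: 8.5% risk vs. 10% baseline
    ```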

    This study suggests that eating Mediterranean and MIND-style diets is linked to better overall cognitive function in older adults, said the researchers. What’s more, older adults who followed these healthy diets had lower risks for having cognitive impairment in later life, noted the researchers.


  5. Study links aspect of neuroticism to lower mortality

    August 11, 2017 by Ashley

    From the Association for Psychological Science press release:

    Data from a longitudinal study of over 500,000 people in the United Kingdom indicate that having higher levels of the personality trait neuroticism may reduce the risk of death for individuals who report being in fair or poor health. The research, published in Psychological Science, a journal of the Association for Psychological Science, further revealed that a specific aspect of neuroticism related to worry and feelings of vulnerability was associated with lower mortality, regardless of self-reported health.

    “Our findings are important because they suggest that being high in neuroticism may sometimes have a protective effect, perhaps by making people more vigilant about their health,” says lead researcher Catharine R. Gale of the University of Edinburgh and University of Southampton.

    By definition, people with high levels of neuroticism are more likely to experience negative emotions — including irritability, frustration, nervousness, worry, and guilt — compared with their peers who have lower levels of neuroticism. Studies investigating links between neuroticism and mortality have produced inconsistent results, with some showing higher risk of death and others showing no relationship or even lower risk of death.

    Drawing from existing evidence, Gale and colleagues hypothesized that the relationship between neuroticism and risk of death may depend on how people rate their health.

    The researchers examined UK Biobank data collected from 502,655 people ages 37 to 73. Participants completed a validated personality assessment measuring neuroticism and indicated whether they thought they were in excellent, good, fair, or poor health overall. The data also included information on participants’ health behaviors (e.g., smoking, physical activity), physical health (e.g., body mass index, blood pressure), cognitive function, and medical diagnoses (e.g., heart problems, diabetes, cancer).

    Examining death certificates from the National Health Service Central Registry, the researchers found that a total of 4,497 participants had died in the follow-up period (which was about 6.25 years, on average). In general, the data showed that mortality was slightly higher among participants with higher levels of neuroticism. However, when Gale and colleagues adjusted for participants’ self-rated health, they found that the direction of the relationship reversed, with higher neuroticism being linked with a slightly lower risk of death from all causes and from cancer.

    “When we explored this further, we found that this protective effect was only present in people who rated their health as fair or poor,” explains Gale. “We also found that people who scored highly on one aspect of neuroticism related to worry and vulnerability had a reduced risk of death regardless of how they rated their health.”

    Intriguingly, these relationships did not seem to vary according to participants’ health behaviors or medical diagnoses at the time they completed the neuroticism questionnaire, a finding which surprised the researchers.

    “Health behaviors such as smoking, exercise, diet and alcohol consumption did not explain any part of the link between high scores on the worry/vulnerability facet and mortality risk. We had thought that greater worry or vulnerability might lead people to behave in a healthier way and hence lower their risk of death, but that was not the case,” Gale says.

    Building on these findings, Gale and colleagues plan to further investigate the different facets of neuroticism to understand why worry and vulnerability may have specific protective effects.


  6. Increased risk of dementia in patients who experience delirium after surgery

    by Ashley

    From the Oxford University Press USA press release:

    Delirium is common in elderly hospitalized patients, affecting an estimated 14–56% of patients. It frequently manifests as a sudden change in behavior, with patients suffering acute confusion, inattention, disorganized thinking and fluctuating mental status.

    Pre-existing cognitive impairment or dementia in patients undergoing surgery is widely recognized as a risk factor for postoperative delirium, increasing its likelihood and severity.

    However, little previous research has focused on whether delirium itself portends or even accelerates a decline into dementia in patients who showed no previous signs of cognitive impairment.

    Research published today in the British Journal of Anaesthesia focuses on patients over the age of 65 who were assessed as cognitively normal prior to surgery. This study, led by Professor Juraj Sprung of the Mayo Clinic in Minnesota, finds that those who developed postoperative delirium were three times more likely to suffer permanent cognitive impairment or dementia.

    Over a ten-year period, patients over the age of 65 enrolled in the Mayo Clinic Study of Aging in Olmsted County, Minnesota, who were exposed to general anesthesia were included in an investigation involving more than 2,000 patients. Their cognitive status was evaluated at regular 15-month intervals before and after surgery by neuropsychological testing and clinical assessment. Of the 2,014 patients, 1,667 were deemed to be cognitively normal before surgery. Of the 1,152 patients who returned for follow-up cognitive evaluation, 109 (9.5%) had developed mild cognitive impairment (pre-dementia) or dementia, and those who had suffered postoperative delirium were three times more likely to be subsequently diagnosed with permanent cognitive decline or dementia. This research is the first to focus on the association between delirium and long-term cognitive decline in patients with normal mental capacity before surgery.
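
    The headline proportions can be reproduced directly from the counts quoted above; note that the threefold relative risk comes from the study’s adjusted analysis and cannot be recomputed from these summary counts alone.

    ```python
    # Reproducing the reported proportions from the counts in the press release.
    enrolled = 2014        # patients exposed to general anesthesia
    normal_before = 1667   # deemed cognitively normal before surgery
    followed_up = 1152     # returned for follow-up cognitive evaluation
    declined = 109         # developed mild cognitive impairment or dementia

    print(f"Cognitively normal at enrollment: {normal_before / enrolled:.1%}")  # 82.8%
    print(f"Declined by follow-up: {declined / followed_up:.1%}")               # 9.5%
    ```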

    While previous studies have highlighted cognitive decline in the elderly following postoperative delirium, no others have involved such a detailed neurocognitive assessment identifying those with normal pre-operative cognitive abilities who go on to develop dementia. In conclusion, the researchers believe that postoperative delirium could be a warning sign of future permanent cognitive impairment (dementia) in patients who, at the time of surgery, were still just above the threshold for registering cognitive decline. Alternatively, postoperative delirium could itself produce injury that accelerates the trajectory of decline into dementia.

    “Our research shows that delirium after surgery is not only distressing for patients and their families, but also may be a warning that patients could later develop dementia,” said Sprung. “We don’t yet know whether taking steps to prevent postoperative delirium could also help prevent dementia — but we need to find out.”

    British Journal of Anaesthesia Editor-in-Chief Professor Hugh Hemmings said: “This important research identifies a significant risk factor for developing dementia postoperatively, and highlights the need for more research in preventing, identifying and treating postoperative delirium.”


  7. Social interaction affects cancer patients’ response to treatment

    by Ashley

    From the NIH/National Human Genome Research Institute press release:

    How well cancer patients fared after chemotherapy was affected by their social interaction with other patients during treatment, according to a new study by researchers at the National Human Genome Research Institute (NHGRI), part of the National Institutes of Health, and the University of Oxford in the United Kingdom. Cancer patients were a little more likely to survive for five years or more after chemotherapy if they interacted during chemotherapy with other patients who also survived for five years or more. Patients were a little more likely to die in less than five years after chemotherapy when they interacted during chemotherapy with those who died in less than five years. The findings were published online July 12, 2017, in the journal Network Science.

    “People model behavior based on what’s around them,” said Jeff Lienert, lead author in NHGRI’s Social and Behavioral Research Branch and a National Institutes of Health Oxford-Cambridge Scholars Program fellow. “For example, you will often eat more when you’re dining with friends, even if you can’t see what they’re eating. When you’re bicycling, you will often perform better when you’re cycling with others, regardless of their performance.”

    Lienert set out to see if the impact of social interaction extended to cancer patients undergoing chemotherapy. Joining this research effort were Lienert’s adviser, Felix Reed-Tsochas, Ph.D., at Oxford’s CABDyN Complexity Centre at the Saïd Business School, Laura Koehly, Ph.D., chief of NHGRI’s Social and Behavioral Research Branch, and Christopher Marcum, Ph.D., a staff scientist also in the Social and Behavioral Research Branch at NHGRI.

    They based their findings on electronic medical records data from 2000 to 2009 from two major hospitals in the United Kingdom’s National Health Service. The researchers examined the total time a patient spent undergoing chemotherapy with the same fellow patients, as well as their five-year survival rate. The five-year survival rate is the percentage of people who live at least five years after chemotherapy treatment is completed. For example, a five-year survival rate of 70 percent means that an estimated 70 out of 100 people are still alive five years after chemotherapy. They also reviewed a room schematic to confirm the assumption that patients were potentially positioned to interact.

    “We had information on when patients checked in and out of the chemotherapy ward, a small intimate space where people could see and interact for a long period of time,” Lienert said. “We used ‘time spent getting chemotherapy in a room with others’ as a proxy for social connection.”

    When patients interacted during chemotherapy with others who died in less than five years after treatment, they had a 72 percent chance of dying within five years themselves. The best outcome came when patients interacted with someone who survived for five years or longer: those patients had a 68 percent chance of dying within five years. The researchers’ model also predicted that if patients were isolated from other patients, they would have a 69.5 percent chance of dying within five years.

    “A two percent difference in survival — between being isolated during treatment and being with other patients — might not sound like a lot, but it’s pretty substantial,” Lienert said. “If you saw 5,000 patients in nine years, that 2 percent improvement would affect 100 people.”
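
    The arithmetic behind that example is simple enough to spell out; a short sketch using only the figures quoted in the press release:

    ```python
    # The back-of-the-envelope calculation in Lienert's example: a roughly
    # 2-percentage-point difference in five-year mortality applied to a
    # caseload of 5,000 patients.
    patients_seen = 5000
    difference = 0.02  # quoted gap between being isolated and being with others

    print(f"{difference:.0%} of {patients_seen} patients = "
          f"{round(difference * patients_seen)} people")
    # 2% of 5000 patients = 100 people
    ```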

    “Mr. Lienert’s research is the first to investigate, on a large scale, how social context in a treatment setting can play a significant role in disease outcomes,” said Koehly. “As cancer care moves more towards targeted therapies based on genomic tumor assessments, NHGRI is interested in understanding how these social environmental factors might impact treatment efficacy.”

    The researchers didn’t study why the difference occurred, but hypothesize that it may be related to stress response. “When you’re stressed, stress hormones such as adrenaline are released, resulting in a fight or flight response,” Lienert said. “If you are then unable to fight or fly, such as in chemotherapy, these hormones can build up.”

    While the researchers also didn’t investigate the impact of visitors on cancer patients undergoing therapy, the effect would likely be similar, he said.

    “Positive social support during the exact moments of greatest stress is crucial,” Lienert said. “If you have a friend with cancer, keeping him or her company during chemotherapy probably will help reduce their stress. The impact is likely to be as effective, and possibly more effective, than cancer patients interacting with other cancer patients.”


  8. Risk for bipolar disorder associated with faster aging

    August 10, 2017 by Ashley

    From the King’s College London press release:

    New King’s College London research suggests that people with a family history of bipolar disorder may ‘age’ more rapidly than those without a history of the disease.

    The study, published in Neuropsychopharmacology, also shows that bipolar patients treated with lithium — the main medication for the illness — have longer telomeres (a sign of slower biological aging) compared to bipolar disorder patients not treated with lithium. This suggests that the drug may mask the aging effects associated with bipolar disorder, or even help to reverse them.

    Faster aging at the biological level could explain why rates of aging-related diseases such as cardiovascular disease, type-2 diabetes and obesity are higher amongst bipolar disorder patients. However, more research is needed in the relatives of bipolar disorder patients to better understand if they are also at a higher risk for aging-related diseases.

    Unaffected first-degree relatives represent a group of individuals at risk for bipolar disorder who have not been treated with medications, so studying them may represent a truer reflection of the relationship between aging and bipolar disorder. To measure biological aging, the researchers studied a feature of chromosomes called telomeres in 63 patients with bipolar disorder, 74 first-degree relatives and 80 unrelated healthy people.

    Telomeres sit on the end of our chromosomes and act like ‘caps’, protecting the strands of DNA stored inside each of our cells as we age. Telomeres shorten each time a cell divides to make new cells, until they are so short that they are totally degraded and cells are no longer able to replicate. Telomere length therefore acts as a marker of biological age, with shortened telomeres representing older cells, and commonly older individuals.

    The rate at which telomeres shorten across our lifespan can vary, based on a range of environmental and genetic factors. This means that two unrelated people of the same chronological age may not be the same age biologically.

    The researchers from King’s College London and the Icahn School of Medicine at Mount Sinai found that healthy relatives of bipolar patients had shorter telomeres compared to healthy controls (who had no risk for the disorder running in their family). This suggests that genetic or environmental factors associated with family risk for bipolar disorder are also linked to faster biological aging.

    They also conducted MRI (magnetic resonance imaging) scans to explore the relationship between telomere length and brain structure, particularly in the hippocampus, an area of the brain involved in the regulation of mood. They discovered that higher rates of biological aging (i.e. shorter telomeres) were associated with having a smaller hippocampus.

    The study authors suggest that a reduction in telomere length may be associated with a reduced ability of new brain cells to grow in the hippocampus, which can reduce the size of the hippocampus and consequently increase risk for mood disorders such as bipolar disorder.

    Dr Timothy Powell, first author of the study, from the Institute of Psychiatry, Psychology & Neuroscience (IoPPN) at King’s College London, said: ‘Our study provides the first evidence that familial risk for bipolar disorder is associated with shorter telomeres, which may explain why bipolar disorder patients are also at a greater risk for aging-related diseases.

    ‘We still need to dissect the environmental and genetic contributions to shortened telomeres in those at high risk for bipolar disorder. For instance, do those at risk for bipolar disorder carry genes predisposing them to faster biological aging, or are they more likely to partake in environmental factors which promote aging (e.g. smoking, poor diet)? Identifying modifiable risk factors to prevent advanced aging would be a really important next step.’

    Dr Sophia Frangou, co-senior author of the study, from the Icahn School of Medicine at Mount Sinai, said: ‘Our study shows that telomere length is a promising biomarker of biological aging and susceptibility to disease in the context of bipolar disorder. Moreover, it suggests that proteins which protect against telomere shortening may provide novel treatment targets for people with bipolar disorder and those predisposed to it.’

    Dr Gerome Breen, co-senior author, also at IoPPN, said: ‘Up to now it has been unclear whether or not bipolar disorder patients are at risk of accelerated aging. This study shows that they are at greater risk of faster aging and drugs commonly used to treat the disorder may actually mask or reverse this effect.’


  9. Perceiving oneself as less physically active than peers is linked to a shorter lifespan

    August 7, 2017 by Ashley

    From the Stanford University press release:

    Would you say that you are physically more active, less active, or about equally active as other people your age?

    Your answer might be linked to your risk of premature death decades from now — no matter how physically active you actually are, according to research by Stanford scholars Octavia Zahrt and Alia Crum.

    The research, appearing July 20 in Health Psychology, finds that people who think they are less active than others in a similar age bracket die younger than those who believe they are more active — even if their actual activity levels are similar.

    “Our findings fall in line with a growing body of research suggesting that our mindsets — in this case, beliefs about how much exercise we are getting relative to others — can play a crucial role in our health,” Crum said.

    Powerful effects of perception

    Crum, an assistant professor of psychology, and Zahrt, a doctoral candidate at the Graduate School of Business, analyzed surveys from more than 60,000 U.S. adults from three national data sets. The surveys documented participants’ levels of physical activity, health and personal background, among other measures. In one of the samples, participants wore an accelerometer to measure their activity over a week.

    Zahrt and Crum were interested in one question in particular: “Would you say that you are physically more active, less active, or about as active as other persons your age?”

    The researchers then viewed death records from 2011, which was 21 years after the first survey was conducted. Controlling for physical activity and using statistical models that accounted for age, body mass index, chronic illnesses and other factors, they found that individuals who believed that they were less active than others were up to 71 percent more likely to die in the follow-up period than individuals who believed that they were more active than their peers.

    Fit on the Farm?

    Much of the study’s inspiration derived from Zahrt’s experience when she arrived at Stanford. Zahrt, a native of Germany who previously studied in France and England, had stayed in shape by biking to school and making occasional trips to the gym.

    But at Stanford, Zahrt said it seemed that “everyone was incredibly active” and perhaps she wasn’t exercising as much as she should.

    “Suddenly, I felt like I had done something wrong all these years,” Zahrt said. “I felt unhealthy and I was stressed about fitting more exercise into my busy schedule. I really had a negative mindset.”

    While taking a health psychology class taught by Crum, Zahrt learned more about the effects of mindsets on health outcomes. For example, Crum’s prior research shows that the health benefits people get out of everyday activities depend in part on their mindsets — that is, whether or not they believe that they are getting good exercise. In her 2007 study, Crum made a group of hotel room attendants aware that the activity they got at work met recommended levels of physical activity. Through this shift in mindsets, the workers, many of whom had previously perceived themselves as inactive, experienced reductions in weight, body fat and blood pressure, among other positive outcomes.

    Zahrt wondered if many people, like her, had negative mindsets about their physical activity levels because of social comparison with more active peers, and if this might be harming their health. Her class paper on this topic sparked the collaboration leading to the published study.

    How mindsets influence us

    Zahrt and Crum offer possible explanations for mindsets and perceptions having such powerful effects on health. One is that perceptions can affect motivation, both positively and negatively. Those who are made aware of their healthy activity levels — like the hotel room attendants in Crum’s 2007 study — can build on them and exercise more. Those who deem themselves unfit are more likely to remain inactive, fueling feelings of fear, stress or depression that negatively affect their health.

    The researchers also cite the established influence of placebo effects, where patients who think they are getting a treatment experience physiological changes without receiving actual treatment. In the same way, people who believe they are getting good exercise may experience more physiological benefits from their exercise than those who believe they aren’t getting enough exercise.

    “Placebo effects are very robust in medicine. It is only logical to expect that they would play a role in shaping the benefits of behavioral health as well,” Crum said.

    The researchers emphasize that the study is correlational in nature and thus does not prove that perceptions of inactivity cause earlier death. However, other experimental research — such as Crum’s 2007 study — does suggest a causal nature to the link between perceived amounts of exercise and health outcomes.

    Taking mindsets seriously

    “So much effort, notably in public health campaigns, is geared toward motivating people to change their behavior: eat healthier, exercise more and stress less,” Crum said. “But an important variable is being left out of the equation: people’s mindsets about those healthy behaviors.”

    In fact, a growing volume of research from Crum and other labs shows that perceptions and mindsets predict health and longevity, for example, in the domains of stress, diet and obesity.

    That our mindsets could have such potent effects on our physiology may seem provocative and unlikely at first glance, but Crum reminds us that we shouldn’t be surprised by these results considering the “everyday experiences where our beliefs or a simple thought have very palpable and physiological effects.”

    “In the case of stress, a thought about something going wrong can make us sweat or [become] shaky or increase our heart rate,” Crum continued. “With sexual arousal, a simple thought or idea can have immediate physical effects. We experience these things regularly, and yet we’re not cataloguing them as something that matters. For whatever reason — dualism or a prioritization of the material — we tend to ignore the fact that our thoughts, mindsets and expectations are shaping our everyday physiology.”

    How can people use this finding? Many Americans think that vigorous exercise in a gym is the only way to attain a proper activity level, according to Zahrt and Crum. But being mindful of and feeling good about activities you do every day — like taking the stairs, walking or biking to work, or cleaning the house — could be an easy first step for everyone to benefit their health.

    “It’s time that we start taking the role of mindsets in health more seriously,” Crum said. “In the pursuit of health and longevity, it is important to adopt not only healthy behaviors, but also healthy thoughts.”


  10. Study suggests screening those at risk of psychosis may help prevent violence, reduce stigma

    by Ashley

    From the Columbia University Medical Center press release:

    A new study of young persons at clinical high-risk of developing psychosis has identified measures of violence potential that may be useful in predicting both the increased risk of future violent behavior and the actual development of psychosis.

    The article, “A Longitudinal Study of Violent Behavior in a Psychosis-Risk Cohort,” by Gary Brucato, PhD, Ragy Girgis, MD, and colleagues at Columbia University Medical Center, was published in Neuropsychopharmacology.

    In the public imagination, individuals with psychosis are often identified with acts of violence. In reality, persons with mental illness account for a very small proportion of violent crime in the U.S. Studies have shown, however, that people with psychotic disorders who have not received effective treatment are more prone to acts of mass violence involving strangers or to intrafamily violence.

    “It is important that we acknowledge that violence can be fueled by mental illness and that steps be taken to identify those people who might be prone and treat them accordingly. That is why these findings are so important as they demonstrate that screening people with sensitive instruments can detect which people in the incipient stages of mental disorders are at greatest risk of violence,” noted Jeffrey A. Lieberman, MD, Lawrence C. Kolb professor and chair of psychiatry at Columbia University College of Physicians and Surgeons.

    The study followed 200 individuals at high-risk of psychosis over a period of two years. Twelve (6%) of them reported acts of violent behavior in the six months before joining the study, fifty-six (28%) reported violent ideation at the time of entry into the study, and eight (4%) committed acts of violence during the two-year follow-up period. As a result of the study evaluation, the study staff provided treatment and took preemptive action for ten additional individuals whose thoughts had developed into plans for violent acts.

    The results of the study showed that both thoughts of violence and recent violent behavior were associated with future incidents of violence, which occurred within an average of seven days of when the person developed psychotic symptoms.

    Violent behavior was predicted only by information contained in the description of the person’s symptoms, not by direct questions about whether the person wanted to hurt anyone. The authors suggest that this is likely why prior studies of violence in mental illness did not predict violent behavior. The direct question “have you had thoughts of harming anyone else?” elicited zero reports of violent ideation from the 200 participants. However, the indirect question “have you felt that you are not in control of your own ideas or thoughts?” elicited reports of violent ideation from 56 individuals.

    Also, the targets of the violent thoughts at the beginning of the study were not those whom the person subsequently attacked. This suggests that the attacks may have been impulsive and opportunistic rather than planned, and the result of the person’s psychotic symptoms.

    “These individuals feel that they themselves are not having violent fantasies,” said Dr. Gary Brucato, clinical psychologist and researcher in the department of psychiatry and first author on the paper. “They feel that the thoughts they are having are intrusive and not their own. Since they are not convinced that these thoughts are real, they tend not to report them or consider them meaningful.”

    A variety of factors, including alcohol and drug abuse, failure to take antipsychotic medications regularly, younger age, and psychotic symptoms such as delusions and hallucinations have been shown to have some effect on the risk of violence among people with psychosis. Earlier research has also indicated that the period around the time of a first psychotic episode is a time of high risk for violent behavior, and that violent behavior peaks at this time.

    “These findings indicate that pre-symptomatic individuals at-risk for psychosis should be screened for violent ideation, and, importantly, demonstrate how to do the screening effectively,” said Ragy Girgis, MD, assistant professor of psychiatry at Columbia University Medical Center and senior author on the paper. “We hope this finding and means of assessment will move the field to develop a more nuanced understanding of violent ideation in the context of psychotic symptoms. Much like suicidal ideation in depression, destigmatizing the experience of violent ideation in the attenuated phase of psychosis will allow patients to freely report it.”