  1. Believing the future will be favorable may prevent action

    August 20, 2017 by Ashley

    From the Association for Psychological Science press release:

    People tend to believe that others will come around to their point of view over time, according to findings from a series of studies published in Psychological Science, a journal of the Association for Psychological Science. The findings show that this “belief in a favorable future” holds across various contexts and cultures, shedding light on some of the causes and consequences of the political polarization evident today.

    “It often seems that partisans believe they are so correct that others will eventually come to see the obviousness of their correctness,” says behavioral scientist Todd Rogers of the Harvard Kennedy School, lead author on the research. “Ironically, our findings indicate that this belief in a favorable future may diminish the likelihood that people will take action to ensure that the favorable future becomes reality.”

    In six related studies, Rogers and colleagues Don A. Moore (UC Berkeley Haas School of Business) and Michael I. Norton (Harvard Business School) explored how widely held the belief in a favorable future is, why the belief emerges, and what some of its consequences are.

    In one online study, the researchers asked 254 participants to report their views on nine topics: abortion, same-sex marriage, climate change, ideology, party affiliation, President Trump, soda, the National Basketball Association, and phone preferences. The participants also reported how they thought other people’s views on the same topics would change between now and the future. For all nine topics, participants’ own current beliefs were associated with their estimates of how others’ beliefs would change. For example, 91% of participants who supported easier access to abortion predicted that more people would support easier access to abortion in the future, compared with only 47% of those who supported making access to abortion more difficult.

    Data from over 800 people in China, Japan, the Netherlands, and the United Kingdom indicated that the belief in a favorable future is a cross-cultural phenomenon, and additional findings revealed that the biased belief is distinct from other phenomena such as optimism and the false-consensus effect. Even when people are given an incentive to make accurate predictions about how people’s beliefs will change between now and the future, they tend to believe others’ attitudes will change over time to fall in line with their own current beliefs.

    Importantly, field experiment data suggest that believing in a favorable future can influence people’s behavior in the here and now. Working with the Democratic Governors Association, Rogers and colleagues sent out two variations of a fundraising email to more than 660,000 supporters. Recipients were less likely to open the email when the subject line indicated that a Democrat had the lead in a closely contested race than when it suggested he was trailing. Among those who did open the email, recipients were less likely to click the donation link, and less likely to make a donation, when the Democrat was portrayed as leading rather than trailing.

    “The most interesting aspect of this to me is how robust it is,” says Rogers. “This pattern of findings emerges for an unexpectedly diverse range of preferences, views, and beliefs — and it emerges across cultures. People biasedly believe that others will change in ways that align with their current preferences, views, and beliefs.”

    According to the researchers, this bias could help to explain a whole host of behavioral phenomena, from staying in a bad job or relationship to underestimating future opposition to a specific political view.


  2. Study examines effect of alcohol on recall of earlier learning

    by Ashley

    From the University of Exeter press release:

    Drinking alcohol improves memory for information learned before the drinking episode began, new research suggests.

    In the University of Exeter study, 88 social drinkers were given a word-learning task. Participants were then split into two groups at random and told either to drink as much as they liked (the average was four units) or not to drink at all.

    The next day, they all did the same task again — and those who had drunk alcohol remembered more of what they had learned.

    The researchers are keen to stress that this limited positive effect should be considered alongside the well-established negative effects of excessive alcohol on memory and mental and physical health.

    “Our research not only showed that those who drank alcohol did better when repeating the word-learning task, but that this effect was stronger among those who drank more,” said Professor Celia Morgan, of the University of Exeter.

    “The causes of this effect are not fully understood, but the leading explanation is that alcohol blocks the learning of new information and therefore the brain has more resources available to lay down other recently learned information into long-term memory.

    “The theory is that the hippocampus — the brain area really important in memory — switches to ‘consolidating’ memories, transferring from short into longer-term memory.”

    The effect noted by the researchers has been shown under laboratory conditions before, but this is the first study to test it in a natural setting, with people drinking in their homes.

    There was also a second task which involved looking at images on a screen.

    This task was completed once after the drinkers had drunk alcohol and again the following day, and the results did not reveal significant differences in memory performance post-drinking.

    The study’s participants were 31 males and 57 females, aged 18-53.


  3. Study suggests babies can pick up on and respond to social dominance

    by Ashley

    From the University of Washington press release:

    The charismatic colleague, the natural leader, the life of the party — all are personal qualities that adults recognize instinctively. These socially dominant types, according to repeated studies, also tend to accomplish and earn more, from accolades and material wealth to friends and romantic partners.

    This social hierarchy may be so naturally ingrained, University of Washington researchers say, that toddlers as young as 17 months old can not only perceive who is dominant, but also anticipate that the dominant person will receive more rewards.

    The research, led by UW psychology professor Jessica Sommerville and graduate student Elizabeth Enright, appears in the July issue of the journal Cognition.

    “This tells us that babies are sorting through things at a higher level than we thought. They’re attending to and taking into consideration fairly sophisticated concepts,” Sommerville said. “If, early on, you see that someone who is more dominant gets more stuff, and as adults, we see that and say that’s how the world is, it might be because these links are present early in development.”

    The study evaluated the reactions of 80 toddlers, each of whom watched three short videos of puppets in simple social situations. Researchers measured the length of time the children focused on the outcome of each video in an effort to determine what they noticed.

    Measuring a baby’s “looking time” is a common metric used in studies of cognition and comprehension in infants, the researchers explained.

    “Really young babies can’t talk to us, so we have to use other measures such as how long they attend to events, to gauge their understanding of these events,” said Enright. “Babies will look longer at things they find unexpected.”

    The same is true of adults, she pointed out. Adults will focus on the result of a magic trick, for instance, or a car accident on the side of the road. Both defy expectations about what normally happens.

    While other research has found that infants and young children expect equal distributions and react positively toward sharing, the UW study is one of the first to explore the impact of a personality trait, such as social dominance, on those expectations.

    For the study, each toddler watched an introductory video at least six times; this brief clip aimed to establish the “dominant” puppet in the scene — the one who appeared to win a minor competition with a second puppet over a special chair. Then each child watched a second set of videos so that researchers could compare how the toddler reacted to various outcomes. The researchers employed puppets, rather than people, for the videos because the puppets look essentially the same, offer no facial or other emotional reaction, and don’t draw an infant’s attention the way that differences among humans might, said Sommerville.

    The researchers set up two narrative scenarios using the puppets. In one scenario, a clip showed the dominant puppet receiving more Legos, while another clip showed both puppets receiving the same number. In the other scenario, a clip again showed the puppets receiving the same number of Legos, while a different clip showed the submissive puppet receiving more.

    The study found that toddlers looked an average of 7 seconds longer at the videos in which the weaker puppet received more Legos, or in which the two puppets received the same number, than at the video in which the dominant puppet received more Legos. This indicates that the children didn’t expect those outcomes, Sommerville said, because their lingering gaze suggests their brains were continuing to process the information on the screen.

    The results demonstrated toddlers’ expectation that a dominant individual receives more resources, and that toddlers are able to adjust their thinking about resource distribution based on their perceptions of the recipients’ social status, the researchers said.

    However, the experiment suggests other questions, which Enright is exploring now in a new study: What other traits could inform infants’ and young children’s expectations about resources? Using a similar approach with puppets, researchers will show toddlers a series of videos that aims to portray competence — a puppet who does a better job at completing a goal than another puppet — and test expectations about which puppet receives the reward.

    “Is the issue dominance? From the videos, it could be that the dominant one was perceived as more persistent or competent,” Enright said. “This could be the very start of finding out what infants know about social status.”

    Video: https://www.youtube.com/watch?v=eNk_6WLNvBU

    Video: https://www.youtube.com/watch?v=NoPvN47dAK4


  4. Mild cognitive impairment, Alzheimer’s diagnoses trigger lower self-ratings of quality of life

    by Ashley

    From the University of Pennsylvania School of Medicine press release:

    Researchers at Penn Medicine have discovered that a patient’s awareness of a diagnosis of cognitive impairment may diminish their self-assessment of quality of life. In a study published this month in the Journal of Gerontology: Psychological Sciences, the researchers report that older adults who were aware of their diagnosis — either Mild Cognitive Impairment or mild stage Alzheimer’s disease dementia — reported greater depression, higher stress, and lower quality of life than those who were unaware. They also found that older adults who expected their disease to worsen over time reported lower overall satisfaction with daily life.

    “These findings suggest that a patient’s quality of life could be impacted by a diagnostic label and their expectations for the prognosis. So, when a clinician discloses the diagnosis and prognosis of Mild Cognitive Impairment or mild stage Alzheimer’s disease, a patient may experience additional symptoms, like anxiety or depression,” said the study’s lead author, Shana Stites, PsyD, MA, MS, a clinical psychologist in the Penn Memory Center and senior research investigator for the Penn Project on Precision Medicine for the Brain (P3MB).

    For many years, a diagnosis of Alzheimer’s disease was often not made until a patient had substantial memory and cognitive problems — by which time patients themselves were often unaware of their diagnosis. Advances in awareness, as well as in diagnostic methods, mean doctors are diagnosing Alzheimer’s disease earlier, and in the future, routine diagnosis may occur before symptoms even begin. According to Stites, early diagnosis holds the promise of opportunities to prevent cognitive and functional losses and to plan for them. But the study results show that an early diagnosis of Alzheimer’s disease can also bring challenges.

    The Penn researchers studied how awareness of a diagnosis affects self-ratings of quality of life in people with one of two disorders: Mild Cognitive Impairment — a disorder defined by slight but noticeable declines in cognitive abilities — or mild stage Alzheimer’s disease dementia. They compared these ratings to those of a group of adults over the age of 65 with normal cognition. Study participants completed measures of multiple domains of quality of life, including cognitive problems, activities of daily living, physical functioning, mental wellbeing, and perceptions of one’s daily life. The researchers then compared the quality-of-life measures by cognitive performance, diagnosis awareness, and diagnostic group.

    The findings help to identify psychological processes underlying relationships between cognitive decline and quality of life. According to Stites, the study has practical implications for current and future clinical practice.

    “It’s not just an issue of to tell or not to tell, it’s an issue of how you tell and what you tell because when you give someone a diagnosis you’re also communicating, either directly or indirectly, a lot of information that can affect the activities people do in daily life, their planning for employment and lifestyle, emotional wellbeing, and social relationships with close friends and family members. These issues need to be explicitly addressed with patients,” Stites said. “Maybe at this point we can’t prevent cognitive decline, but we certainly have effective interventions for treating depression and for managing other symptoms.”

    The researchers note that further study is needed to understand what drives the impact of awareness of diagnosis and prognosis on quality of life. Future studies might draw on the pre-clinical research now being done in Alzheimer’s disease, in which clinicians work to identify people at risk of developing the disease based on genes and biomarkers, to determine how a person’s sense of identity and functioning in the world might be affected by learning that they have a high probability of developing Alzheimer’s disease in the future.

    A diagnosis of Alzheimer’s disease can evoke assumptions, stereotypes, feelings, and attitudes that can affect a person’s quality of life, how they view themselves and how they are treated by others. This study is part of the research team’s ongoing efforts to understand how early diagnosis can impact a person’s quality of life and wellbeing. The results add to what they’ve been learning about the stigma of Alzheimer’s disease.


  5. Study suggests yoga may help reduce depression symptoms

    by Ashley

    From the American Psychological Association (APA) press release:

    People who suffer from depression may want to look to yoga as a complement to traditional therapies as the practice appears to lessen symptoms of the disorder, according to studies presented at the 125th Annual Convention of the American Psychological Association.

    “Yoga has become increasingly popular in the West, and many new yoga practitioners cite stress-reduction and other mental health concerns as their primary reason for practicing,” said Lindsey Hopkins, PhD, of the San Francisco Veterans Affairs Medical Center, who chaired a session highlighting research on yoga and depression. “But the empirical research on yoga lags behind its popularity as a first-line approach to mental health.”

    Hopkins’ research focused on the acceptability and antidepressant effects of hatha yoga, the branch of yoga that emphasizes physical exercises, along with meditative and breathing exercises, to enhance well-being. In the study, 23 male veterans participated in twice-weekly yoga classes for eight weeks. On a 1-10 scale, the average enjoyment rating the veterans gave the yoga classes was 9.4, and all participants said they would recommend the program to other veterans. More importantly, participants with elevated depression scores before the yoga program showed a significant reduction in depression symptoms after the eight weeks.

    Another, more specific, version of hatha yoga commonly practiced in the West is Bikram yoga, also known as heated yoga. Sarah Shallit, MA, of Alliant University in San Francisco investigated Bikram yoga in 52 women aged 25-45. Just over half were assigned to participate in twice-weekly classes for eight weeks; the rest were told they were wait-listed and served as a control condition. All participants were tested for depression levels at the beginning of the study, as well as at weeks three, six and nine. Shallit and her co-author Hopkins found that eight weeks of Bikram yoga significantly reduced symptoms of depression compared with the control group.

    In the same session, Maren Nyer, PhD, and Maya Nauphal, BA, of Massachusetts General Hospital, presented data from a pilot study of 29 adults that also showed eight weeks of at least twice-weekly Bikram yoga significantly reduced symptoms of depression and improved other secondary measures including quality of life, optimism, and cognitive and physical functioning.

    “The more the participants attended yoga classes, the lower their depressive symptoms at the end of the study,” said Nyer, who currently has funding from the National Center for Complementary and Integrative Health to conduct a randomized controlled trial of Bikram yoga for individuals with depression.

    Elsewhere at the meeting, Nina Vollbehr, MS, of the Center for Integrative Psychiatry in the Netherlands presented data from two studies on the potential for yoga to address chronic and/or treatment-resistant depression. In the first study, 12 patients who had experienced depression for an average of 11 years participated in nine weekly yoga sessions of approximately 2.5 hours each. The researchers measured participants’ levels of depression, anxiety, stress, rumination and worry before the yoga sessions, directly after the nine weeks and four months later. Scores for depression, anxiety and stress decreased throughout the program, a benefit that persisted four months after the training. Rumination and worry did not change immediately after the treatment, but both had decreased at the four-month follow-up.

    In another study, involving 74 mildly depressed university students, Vollbehr and her colleagues compared yoga to a relaxation technique. Individuals received 30 minutes of live instruction on either yoga or relaxation and were asked to perform the same exercise at home for eight days using a 15-minute instructional video. While results taken immediately after the treatment showed yoga and relaxation were equally effective at reducing symptoms, two months later, the participants in the yoga group had significantly lower scores for depression, anxiety and stress than the relaxation group.

    “These studies suggest that yoga-based interventions have promise for depressed mood and that they are feasible for patients with chronic, treatment-resistant depression,” said Vollbehr.

    The concept of yoga as complementary or alternative mental health treatment is so promising that the U.S. military is investigating the creation of its own treatment programs. Jacob Hyde, PsyD, of the University of Denver, gave a presentation outlining a standardized, six-week yoga treatment for U.S. military veterans enrolled in behavioral health services at the university-run clinic, a program that could be expanded for use by the Department of Defense and the Department of Veterans Affairs.

    Hopkins noted that the research on yoga as a treatment for depression is still preliminary. “At this time, we can only recommend yoga as a complementary approach, likely most effective in conjunction with standard approaches delivered by a licensed therapist,” she said. “Clearly, yoga is not a cure-all. However, based on empirical evidence, there seems to be a lot of potential.”


  6. Study suggests psychopaths are better at learning to lie

    by Ashley

    From the BioMed Central press release:

    Individuals with high levels of psychopathic traits are better at learning to lie than individuals who show few psychopathic traits, according to a study published in the open access journal Translational Psychiatry. The findings indicate that people with high psychopathic traits may not have a ‘natural’ capacity to lie better, but rather are better at learning how to lie, according to the researchers.

    Dr. Tatia Lee and Dr. Robin Shao of the State Key Laboratory of Brain and Cognitive Sciences and the Laboratory of Neuropsychology at The University of Hong Kong found that, after practicing a task in which participants gave truthful or untruthful responses about whether they recognized people in a collection of photographs, individuals with high levels of psychopathic traits were able to lie much more quickly than before practice. By contrast, individuals with low levels of psychopathic traits showed no improvement in their lying speed.

    Dr Tatia Lee, the corresponding author, said: “The stark contrast between individuals with high and low levels of psychopathic traits in lying performance following two training sessions is remarkable, given that there were no significant differences in lying performance between the two groups prior to training.”

    Dr Shao added: “High psychopathy is characterized by untruthfulness and manipulativeness but the evidence so far was not clear on whether high-psychopathic individuals in the general population tend to lie more or better than others. Our findings provide evidence that people with high psychopathic traits might just be better at learning how to lie.”

    To find out if individuals with high levels of psychopathic traits were better at learning how to lie than others, the researchers recruited 52 students from The University of Hong Kong — 23 who showed low levels of psychopathic traits and 29 who showed high levels of psychopathic traits based on a questionnaire that can be used to assess psychopathy in a non-clinical setting.

    Students in both groups were shown a series of photographs of familiar and unfamiliar faces. They received a cue to give either an honest or a dishonest response when asked whether they knew the person in the photograph. The researchers measured the students’ reaction times for each response and observed their brain activity using functional magnetic resonance imaging (fMRI). Participants then completed a two-session training exercise before repeating the task.

    The researchers found that following the training exercise, individuals with high levels of psychopathic traits had significantly shorter response times when being prompted to lie than during the initial task. Individuals with low levels of psychopathic traits showed no changes in response time. The difference may be due to how the brains of individuals with high and low levels of psychopathic traits process lies.

    Dr Lee said: “During lying, the ‘true’ information needs to be suppressed and reversed. Thus, lying requires a series of processes in the brain including attention, working memory, inhibitory control and conflict resolution which we found to be reduced in individuals with high levels of psychopathic traits. By contrast, in individuals with low levels of psychopathic traits this lie-related brain activity increased. The additional ‘effort’ it took their brains to process untruthful responses may be one of the reasons why they didn’t improve their lying speed.”

    The researchers caution that as all participants in this study were university students, further research is needed to be able to generalize the findings to individuals with high levels of psychopathic traits in other populations.


  7. Lutein, found in leafy greens, may counter cognitive aging

    August 19, 2017 by Ashley

    From the University of Illinois at Urbana-Champaign press release:

    Spinach and kale are favorites of those looking to stay physically fit, but they also could keep consumers cognitively fit, according to a new study from University of Illinois researchers.

    The study, which included 60 adults aged 25 to 45, found that middle-aged participants with higher levels of lutein — a nutrient found in green leafy vegetables such as spinach and kale, as well as avocados and eggs — had neural responses that were more on par with younger individuals than with their peers. The findings were published in the journal Frontiers in Aging Neuroscience.

    “Now there’s an additional reason to eat nutrient-rich foods such as green leafy vegetables, eggs and avocados,” said Naiman Khan, a professor of kinesiology and community health at Illinois. “We know these foods are related to other health benefits, but these data indicate that there may be cognitive benefits as well.”

    Most other studies have focused on older adults, after there has already been a period of decline. The Illinois researchers chose to focus on young to middle-aged adults to see whether there was a notable difference between those with higher and lower lutein levels.

    “As people get older, they experience typical decline. However, research has shown that this process can start earlier than expected. You can even start to see some differences in the 30s,” said Anne Walk, a postdoctoral scholar and first author of the paper. “We want to understand how diet impacts cognition throughout the lifespan. If lutein can protect against decline, we should encourage people to consume lutein-rich foods at a point in their lives when it has maximum benefit.”

    Lutein is a nutrient that the body can’t make on its own, so it must be acquired through diet. Lutein accumulates in brain tissues, but also accumulates in the eye, which allows researchers to measure levels without relying on invasive techniques.

    The Illinois researchers measured lutein in the study participants’ eyes by having participants look into a scope and respond to a flickering light. Then, using electrodes on the scalp, the researchers measured neural activity in the brain while the participants performed a task that tested attention.

    “The neuro-electrical signature of older participants with higher levels of lutein looked much more like their younger counterparts than their peers with less lutein,” Walk said. “Lutein appears to have some protective role, since the data suggest that those with more lutein were able to engage more cognitive resources to complete the task.”

    Next, Khan’s group is running intervention trials, aiming to understand how increased dietary consumption of lutein may increase lutein in the eye, and how closely the levels relate to changes in cognitive performance.

    “In this study we focused on attention, but we also would like to understand the effects of lutein on learning and memory. There’s a lot we are very curious about,” Khan said.


  8. Cultural activities may influence the way we think

    by Ashley

    From the American Friends of Tel Aviv University press release:

    A new Tel Aviv University study suggests that cultural activities, such as the use of language, influence our learning processes, affecting our ability to collect different kinds of data, make connections between them, and infer a desirable mode of behavior from them.

    “We believe that, over lengthy time scales, some aspects of the brain must have changed to better accommodate the learning parameters required by various cultural activities,” said Prof. Arnon Lotem, of TAU’s Department of Zoology, who led the research for the study. “The effect of culture on cognitive evolution is captured through small modifications of evolving learning and data acquisition mechanisms. Their coordinated action improves the brain network’s ability to support learning processes involved in such cultural phenomena as language or tool-making.”

    Prof. Lotem developed the new learning model in collaboration with Prof. Joseph Halpern and Prof. Shimon Edelman, both of Cornell University, and Dr. Oren Kolodny of Stanford University (formerly a PhD student at TAU). The research was recently published in PNAS.

    “Our new computational approach to studying human and animal cognition may explain how human culture shaped the evolution of human cognition and memory,” Prof. Lotem said. “The brain is not a rigid learning machine in which a particular event necessarily leads to another particular event. Instead, it functions according to coevolving mechanisms of learning and data acquisition, with certain memory parameters that jointly construct a complex network, capable of supporting a range of cognitive abilities.

    “Any change in these parameters may change the constructed network and thus the function of the brain,” Prof. Lotem said. “This is how small modifications can adapt our brain to ecological as well as to cultural changes. Our model reflects this.”

    To learn, the brain calculates statistics on the data it takes in from the environment, monitoring the distribution of data and determining the level of connections between them. The new learning model assumes a limited window of memory and constructs an associative network that represents the frequency of the connections between data items.
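
    To make that mechanism concrete, here is a minimal Python sketch of such a learner, under stated assumptions: a fixed-size memory window, and an associative network whose connection strengths are simple co-occurrence counts between each new item and the items still held in the window. The window size, the counting rule, and all names here are illustrative, not the model published in PNAS.

        from collections import deque, Counter

        class WindowedAssociativeLearner:
            """Toy learner: a limited memory window plus an associative
            network whose edge weights are co-occurrence frequencies.
            All details are illustrative assumptions, not the published model."""

            def __init__(self, window_size=5):
                self.window = deque(maxlen=window_size)  # limited memory window
                self.network = Counter()                 # {item, item} -> count

            def observe(self, item):
                # Strengthen the link between the new item and every distinct
                # item still held in memory; the oldest item falls out of the
                # window automatically once the window is full.
                for held in self.window:
                    if held != item:
                        self.network[frozenset((held, item))] += 1
                self.window.append(item)

            def strength(self, a, b):
                return self.network[frozenset((a, b))]

        learner = WindowedAssociativeLearner(window_size=3)
        for syllable in "ba bi bu ba bi bu ba bi bu".split():
            learner.observe(syllable)
        print(learner.strength("ba", "bi"))  # often share the window: strong link
        print(learner.strength("ba", "da"))  # never co-occurred: 0

    Shrinking or widening window_size changes which connections the network can record at all, which is the sense in which a single memory parameter reshapes the constructed network and thus the function of the learner.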

    “A computer remembers all the data it is fed. But our brain developed in a way that limits the quantity of data it can receive and remember,” said Prof. Lotem. “Our model hypothesizes that the brain does this ‘intentionally’ — that is, the mechanism of filtering the data from the surroundings is an integral element in the learning process. Moreover, a limited working memory may paradoxically be helpful in some cognitive tasks that require extensive computation. This may explain why our working memory is actually more limited than that of our closest relatives, chimpanzees.”

    Working with a large memory window imposes a far greater computational burden on the brain than working with a small window. Human language, for example, presents computational challenges. When we listen to a string of syllables, we need to scan a massive number of possible combinations to identify familiar words.

    But this is only a problem if the learner really needs to care about the exact order of data items, which is the case with language, according to Prof. Lotem. On the other hand, a person only has to identify a small combination of typical features to discriminate between two types of trees in the forest. There, the exact order of the features is less important, the computation is simpler, and a larger working memory may be better.
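
    A toy calculation makes the contrast concrete (the figures are illustrative, not from the paper). When order matters, every contiguous run of syllables in a memory window is a candidate word, so a window of n syllables holds n(n+1)/2 candidates and the scanning burden grows quadratically with window size; when order is irrelevant, matching a handful of typical features costs the same however large the window.

        # Order-sensitive scan: every contiguous run of syllables in an
        # n-syllable window is a candidate word, n*(n+1)/2 candidates in all.
        def contiguous_candidates(n):
            return n * (n + 1) // 2

        for n in (5, 10, 20, 40):
            print(n, contiguous_candidates(n))  # 15, 55, 210, 820: quadratic growth

        # Order-insensitive discrimination (e.g., telling two tree types
        # apart): a small set overlap suffices, whatever the window size.
        oak_features = {"lobed leaves", "acorns", "rough bark"}
        observed = {"acorns", "rough bark", "moss"}
        print(len(oak_features & observed))  # overlap score: 2

    In the model’s terms, order-sensitive tasks such as word segmentation reward a small memory window, while order-insensitive tasks tolerate, and may even benefit from, a large one.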

    “Some of these principles that evolved in the biological brain may be useful in the development of AI someday,” Prof. Lotem said. “Currently the concept of limiting memory in order to improve computation is not something that people do in the field of AI, but perhaps they should try and see whether it can paradoxically be helpful in some cases, as in our human brain.”

    “Excluding very recent cultural innovations, the assumption that culture shaped the evolution of cognition is both more parsimonious and more productive than assuming the opposite,” the researchers concluded. They are currently examining how natural variations in learning and memory parameters may influence learning tasks that require extensive computation.


  9. Study suggests engaging in casual video game play during rest breaks can help restore mood in response to workplace stress

    by Ashley

    From the Human Factors and Ergonomics Society press release:

    More than half of Americans regularly experience cognitive fatigue related to stress, frustration, and anxiety while at work. Those in safety-critical fields, such as air traffic control and health care, are at an even greater risk for cognitive fatigue, which could lead to errors. Given the amount of time that people spend playing games on their smartphones and tablets, a team of human factors/ergonomics researchers decided to evaluate whether casual video game play is an effective way to combat workplace stress during rest breaks.

    In their Human Factors article (now online), “Searching for Affective and Cognitive Restoration: Examining the Restorative Effects of Casual Video Game Play,” Michael Rupp and coauthors used a computer-based task to induce cognitive fatigue in 66 participants, who were then given a five-minute rest break. During the break, participants either played a casual video game called Sushi Cat, participated in a guided relaxation activity, or sat quietly in the testing room without using a phone or computer. At various times throughout the experiment, the researchers measured participants’ affect (e.g., stress level, mood) and cognitive performance.

    Those who took a silent rest break reported that they felt less engaged with work and experienced worry as a result, whereas those who participated in the guided relaxation activity saw reductions in negative affect and distress. Only the video game players reported that they felt better after taking the break.

    Rupp, a doctoral student in human factors and cognitive psychology at the University of Central Florida, notes, “We often try to power through the day to get more work finished, which might not be as effective as taking some time to detach for a few minutes. People should plan short breaks to make time for an engaging and enjoyable activity, such as video games, that can help them recharge.”


  10. Study suggests cognitive cross-training enhances learning

    by Ashley

    From the University of Illinois at Urbana-Champaign press release:

    Just as athletes cross-train to improve physical skills, those wanting to enhance cognitive skills can benefit from multiple ways of exercising the brain, according to a comprehensive new study from University of Illinois researchers.

    The 18-week study of 318 healthy young adults found that combining physical exercise and mild electric brain stimulation with computer-based cognitive training promoted skill learning significantly more than using cognitive training alone.

    The enhanced learning was skill-specific and did not translate to general intelligence. The study, the largest and most comprehensive to date, was published in the journal Scientific Reports.

    “Learning provides the foundation for acquiring new skills and updating prior beliefs in light of new knowledge and experience,” said study leader Aron Barbey, a professor of psychology. “Our results establish a method to enhance learning through multimodal intervention. The beneficial effects of cognitive training can be significantly enhanced with the addition of physical fitness training and noninvasive brain stimulation.”

    Psychologists have extensively studied and debated the merits of cognitive training, but have mainly focused on computer-based tasks, Barbey said. The few studies that have incorporated other training modalities, such as physical fitness training or noninvasive brain stimulation, have been small in sample size, short in time or narrow in scope, he said.

    The Illinois study divided its 318 subjects into five groups: three experimental groups and active and passive control groups. One experimental group received only cognitive training; the second received cognitive training and exercise; and the third received cognitive training, exercise and noninvasive brain stimulation delivered by electrodes on the scalp. The active control group completed different computer-based cognitive training tasks than the experimental groups, but did the same number of sessions of the same length. In the first week, participants took a pretest. Over the following 16 weeks, the experimental and active control groups completed 90-minute training sessions three times a week. In the final week of the study, all participants took a post-test.

    “Physical activity and aerobic fitness are known to have beneficial effects on the underlying structures and functions of the brain,” said Barbey, a member of the Beckman Institute for Advanced Science and Technology at Illinois. “And research has shown that specific brain stimulation protocols can enhance cognitive performance, prompting us to investigate their effects on cognitive training.”

    The study used six different training tasks, designed to measure specific cognitive skills such as memory, attention and task-switching. In the post-test, the groups that received cognitive training and physical fitness training or all three interventions performed significantly better than the group with cognitive training alone. The group that received all three interventions consistently performed the best, and showed substantial gains in two of the tasks over the group that received cognitive and physical fitness training but did not receive brain stimulation.