1. Learning with music can change brain structure

    July 21, 2017 by Ashley

    From the University of Edinburgh press release:

    Using musical cues to learn a physical task significantly develops an important part of the brain, according to a new study.

    People who practiced a basic movement task to music showed increased structural connectivity between the regions of the brain that process sound and control movement.

    The findings focus on white matter pathways — the wiring that enables brain cells to communicate with each other.

    The study could have positive implications for future research into rehabilitation for patients who have lost some degree of movement control.

    Thirty right-handed volunteers were divided into two groups and charged with learning a new task involving sequences of finger movements with the non-dominant, left hand. One group learned the task with musical cues, the other group without music.

    After four weeks of practice, both groups of volunteers performed equally well at learning the sequences, researchers at the University of Edinburgh found.

    MRI scans revealed that the music group showed a significant increase in structural connectivity in the white matter tract that links auditory and motor regions on the right side of the brain. The non-music group showed no change.

    Researchers hope that future study with larger numbers of participants will examine whether music can help with special kinds of motor rehabilitation programmes, such as after a stroke.

    The interdisciplinary project brought together researchers from the University of Edinburgh’s Institute for Music in Human and Social Development, Clinical Research Imaging Centre, and Centre for Clinical Brain Sciences, and from Clinical Neuropsychology, Leiden University, The Netherlands.

    The results are published in the journal Brain & Cognition.

    Dr Katie Overy, who led the research team, said: “The study suggests that music makes a key difference. We have long known that music encourages people to move. This study provides the first experimental evidence that adding musical cues to learning a new motor task can lead to changes in white matter structure in the brain.”


  2. Systematic research investigates effects of money on thinking, behavior

    by Ashley

    From the Association for Psychological Science press release:

    Numerous studies have shown that being prompted to think about money can predispose people to engage in self-sufficient thinking and behavior — but some findings suggest that demographic characteristics may moderate this type of effect. In a new research article, scientists present results from three experiments that systematically explore these money-priming effects, finding inconsistent evidence for the effect of money primes on various measures of self-sufficient thinking and behavior.

    The research is published in Psychological Science, a journal of the Association for Psychological Science.

    Psychology researcher Eugene M. Caruso (University of Chicago Booth School of Business) and co-authors Oren Shapira (Stony Brook University) and Justin Landy (also Chicago Booth) were motivated to carry out this systematic exploration after conducting a set of studies in which they observed varied findings that were inconsistent with their predictions.

    In their initial studies, Caruso and his collaborators found that the effects of money reminders on participants’ thinking often seemed to depend on certain demographic characteristics, a result they were not expecting. In discussing these results, they discovered that colleagues had also observed unpredicted interaction effects in their research in this area.

    Importantly, the kinds of interaction effects observed seemed to vary across different studies that used different techniques for activating the concept of money.

    “These inconsistent results led us to step back to try to gain a better understanding of whether different money primes lead to similar effects, and whether they interact with sociodemographic characteristics in a reliable, and potentially theoretically meaningful, manner,” Caruso explains.

    To do this, Caruso, Shapira, and Landy decided to systematically evaluate the effects of various money-priming manipulations on a predetermined set of outcomes while accounting for the potential influence of certain sociodemographic factors, within a single experiment that sampled a diverse group of participants.

    In their first experiment, the researchers recruited a total of 2,167 participants for an online study, randomly assigning participants to receive specific primes. For example, some saw a faint image of $100 bills in the background of the instructions screen, others were asked to select the best sizes and shapes for new paper currency, some completed phrases that included money-related terms, while others were asked to imagine having ample access to money. Some participants saw a clear image of $100 bills and were explicitly asked to describe what money meant to them, while others were asked to recall a time when they felt powerful.

    The results showed that four of the five money primes did activate the concept of money. Participants exposed to these primes were more likely to complete word stems to create money-related words compared with participants who received a neutral prime or no prime — only those exposed to the background image of money showed no difference in the word completion task relative to their peers.

    But the primes seemed to have weak and inconsistent effects on participants’ feelings of wealth and self-sufficiency. Only participants who imagined an abundant life reported differences in self-sufficiency, and they actually reported lower self-sufficiency compared with those who received a neutral prime, an unexpected finding.

    Additionally, there was little evidence to suggest that the effects of the primes on various outcomes were moderated by any of the demographic characteristics measured, including gender, socioeconomic status, and political ideology.

    The researchers observed similar results in a second online experiment with 2,150 participants that omitted the money-activation measure.

    In a third experiment, Caruso, Shapira, and Landy conducted a lab-based study with 332 members of the university community. To examine the effects of money primes on self-sufficient behavior, the researchers measured how long participants spent working on a puzzle that was actually unsolvable before they asked for help.

    The results echoed those of the previous online studies: The three money primes tested had weak and inconsistent effects across the different outcome measures. Only those participants who unscrambled phrases including money-related terms reported greater feelings of self-sufficiency relative to the comparison group.

    “Contrary to what we expected based on the published literature, we did not find that any manipulation consistently affected any dependent measure across our three studies, nor did we find reliable evidence for statistical moderation by sociodemographic characteristics,” says Caruso.

    The researchers urged caution in interpreting the findings relative to specific published studies, given that the three experiments were not designed to be exact replications of any one study. Rather, this series of experiments can be seen as offering a rigorous and systematic examination of a particular effect.

    “Beyond the implications for money-priming research, we hope that our methodological approach — comparing multiple manipulations of a construct and assessing multiple individual-difference moderators within the same heterogeneous sample — can make a broader contribution by supplementing the emerging toolkit of methodologies for establishing the reliability of individual effects and the validity of the theories that attempt to explain them,” Caruso concludes.


  3. Motivation through punishment may not work

    by Ashley

    From the Julius-Maximilians-Universität Würzburg, JMU press release:

    Parents scold their children to correct their behaviour, hoping that their offspring will discontinue their misbehaviour as a result. What’s paradoxical about this kind of punishment is that it can have the opposite effect.

    Professor Andreas Eder at the Institute of General Psychology of the University of Würzburg made this discovery during a research project. He has now published his findings in the Journal of Experimental Psychology: General.

    Electric shock as punishment

    What was the experiment about? The team of project leader Eder asked test participants to complete a simple task. A number would flash up on a screen. “The participants had to decide whether the number is greater than or smaller than five,” the scientist explains. They had to communicate their decision by hitting a key: The left key was for values from one to four and the right key for six to nine.
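    The stimulus-to-key mapping described above can be sketched in a few lines. This is only an illustration of the task's response rule; the key labels and function name are not from the study.

    ```python
    def response_key(number: int) -> str:
        """Map a flashed digit to the required response key.

        The left key covers values 1-4 and the right key covers 6-9,
        i.e. smaller or greater than five; five itself never appears.
        Key labels here are illustrative, not taken from the study.
        """
        if 1 <= number <= 4:
            return "left"
        if 6 <= number <= 9:
            return "right"
        raise ValueError("stimuli are digits 1-4 or 6-9")

    # e.g. response_key(3) -> "left", response_key(8) -> "right"
    ```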

    Beforehand, however, the participants had learned that pressing one of the two keys would deliver a slightly painful electric shock. “They had learned to expect discomfort when hitting this key,” Eder says. The scientists had assumed that the participants would press the shock-delivering key more slowly.

    Surprisingly, the exact opposite was the case. The participants pressed the pain-inducing key even more quickly than before. The scientists were taken aback by this outcome. So punishment alone is not sufficient to stop undesirable behaviour.

    Assumptions not confirmed

    When looking for an explanation, the scientists initially assumed that the rapid pressing was caused by heightened arousal. “It could have been that the participants wanted to get the pain over with quickly and therefore pressed the key more rapidly because they were afraid,” Eder says.

    But another experiment showed that physical arousal is not responsible for the effect. “Again, the participants were asked to solve the task. Again there were two keys: one causing a weak electric shock, the other delivering a rather strong one.”

    It turned out that the participants pressed the key more quickly only when it was followed by a weak shock. There was no facilitative effect for the strong shock, even though it produced greater arousal. So increased arousal is not a plausible explanation for the effect. Why, then, did the participants expose themselves to the pain more quickly?

    “We were able to show that punishment alone does not automatically cause the punished behaviour to be suppressed,” Eder sums up the results. Instead, it can even facilitate performing the punished behaviour when applied regularly. “That is the case when the punitive stimulus is used as feedback to control behaviour.”

    So if what drives the effect is the anticipated consequence of pressing the key, it should also be possible to induce the reaction facilitation with a neutral stimulus. “A vibration should suffice in that case,” Eder says. This assumption was confirmed in further experiments.

    Put more simply: The brain uses behavioural consequences to trigger an action more easily even if the consequences are disagreeable for us.

    The type of punishment is decisive

    The psychologist emphasizes in this context: “It is not that punishment does not work in general. It just does not always cause the behaviour to be suppressed.” Not even when the participants know that something unpleasant will follow.

    A paradoxical facilitative effect of punishment is likely if there is no alternative to the punished behaviour, an action needs to be taken quickly and the punishment is rather mild.

    Therefore, it is important to also give clear feedback for the desired behaviour as an orientation for the child, because the child can only learn to stop the undesirable behaviour when it has a clear alternative to it. Everyday educational practices should focus on pointing out these alternatives to the child.


  4. New insights into rare chronic pain condition

    by Ashley

    From the University of Bath press release:

    People suffering from chronic pain often find their condition distracting and debilitating, but new research reveals that some might in fact be paying less, rather than more, attention to the source of their pain.

    The findings from researchers at the universities of Bath and Oxford (UK), published today in the journal Brain, suggest that a rare chronic pain condition might involve changes in the way that the brain processes visual information, which in turn could provide new insights into how to treat the condition.

    Approximately 16,000 people in the UK are affected by a poorly understood condition called Complex Regional Pain Syndrome (CRPS). Individuals with CRPS report debilitating pain in an arm or leg, as well as swelling, temperature changes and movement difficulties. Symptoms include burning, stabbing, stinging or throbbing pain in the affected limb, and everyday sensations such as a breeze blowing across the skin can feel very painful.

    Whilst its exact causes are not yet known, it is thought that abnormal brain signals about the limb play an important part. CRPS usually follows limb damage from injury or surgery, but the pain experienced is disproportionate and may last longer than would be expected for the damage itself. For one case in every 10 there is no obvious trigger. And whereas most people recover well within a year, some people have some or all of the symptoms for many weeks, months or even years.

    For their study, scientists at Bath and Oxford were keen to understand more about how and why individuals suffering from CRPS report losing track of the position of their painful limb and not being able to move it.

    The team tested how quickly people with CRPS processed visual information in the side of their environment nearer to their painful limb compared to the other side of the environment. Using laser pointers controlled by a computer, they projected two flashes of light onto the left and right side of a board that was placed in front of the patients, and the patients had to say which light appeared first.

    Their results showed that people with CRPS processed the light on the affected side of the board more slowly than the light on the unaffected side, suggesting that information that is nearer to the affected side of the body is not well processed by the brain.

    Lead author, Dr Janet Bultitude from the University of Bath’s Centre for Pain Research, explained: “People with CRPS are usually in constant pain that they can’t ignore. Yet paradoxically they often report that they are not sure where their painful limb is unless they look at it directly, and that movements are not automatic — they have to ‘tell’ their limb to move. The odd sensations they experience suggest there could be a change in mechanisms that normally allow us to process information at different locations in the space around us.

    “Our results show that people with CRPS are slower to process visual information that comes from the side of their environment where their painful limb is normally located. Since we used a test of vision, the slower processing can’t be because of changes in the limb itself, but must be due to the way the brain processes information. We’re excited that these results can help propel us forward to developing new treatments for those affected by the condition.”

    Current treatments for CRPS include pain medications and rehabilitation therapies which are vital to normalise sensation in the limb and improve function and mobility. Dr Bultitude and her team are now investigating whether symptoms of CRPS could be reduced by therapies that are used to treat attention problems in people with brain injuries such as stroke.


  5. Some patients with dementia may experience delayed-onset PTSD

    by Ashley

    From the Wiley press release:

    Delayed-onset post-traumatic symptoms in the elderly may be misdiagnosed as falling under the umbrella of behavioural and psychological symptoms of dementia (BPSD), according to a recent review.

    The review describes three cases where post-traumatic stress disorder (PTSD) symptoms are experienced by patients suffering with dementia long after the original traumatic event.

    Considering PTSD in individuals with dementia is important because PTSD is usually associated with working-age adults and is infrequently diagnosed in the elderly. In the early stages of dementia, recognising early life trauma may enable patients to access psychological therapy prior to significant cognitive decline. In patients with more advanced dementias, an awareness of earlier trauma exposure can help clinicians differentiate between delayed PTSD and BPSD in patients suffering with emotional and behavioural disturbances.

    “Every patient with dementia has a unique narrative, which if captured in the earlier stages of the disease, enables clinicians and their families to understand the origin of their distress. Therefore, it is important to look for a history of previous trauma in patients with BPSD as this could be due to delayed onset PTSD,” said Dr. Tarun Kuruvilla, senior author of the Progress in Neurology & Psychiatry review.


  6. Why does prenatal alcohol exposure increase the likelihood of addiction?

    by Ashley

    From the University at Buffalo press release:

    One of the many negative consequences when fetuses are exposed to alcohol in the womb is an increased risk for drug addiction later in life. Neuroscientists in the University at Buffalo Research Institute on Addictions are discovering why.

    Through a research grant from the National Institute on Alcohol Abuse and Alcoholism (NIAAA) of the National Institutes of Health (NIH), Senior Research Scientist Roh-Yu Shen, PhD, is studying how prenatal alcohol exposure alters the reward system in the brain and how this change continues through adulthood.

    The key appears to lie with endocannabinoids, cannabis-like chemicals that are produced by the brain itself.

    “By understanding the role endocannabinoids play in increasing the brain’s susceptibility to addiction, we can start developing drug therapies or other interventions to combat that effect and, perhaps, other negative consequences of prenatal alcohol exposure,” Shen says.

    Prenatal alcohol exposure is the leading preventable cause of birth defects and neurodevelopmental abnormalities in the United States. Fetal Alcohol Spectrum Disorders (FASD) cause cognitive and behavioral problems. In addition to increased vulnerability to alcohol and other substance use disorders, FASD can lead to other mental health issues including Attention Deficit Hyperactivity Disorder (ADHD), depression, anxiety and problems with impulse control.

    “After the prenatal brain is exposed to alcohol, the endocannabinoids have a different effect on certain dopamine neurons involved in addictive behaviors than when the brain is not exposed to alcohol,” Shen says. “The end result is that the dopamine neurons in the brain become more sensitive to a drug of abuse’s effect. So, later in life, a person needs much less drug use to become addicted.”

    Specifically, in the ventral tegmental area (VTA) of the brain, endocannabinoids play a significant role in weakening the excitatory synapses onto dopamine neurons. The VTA is the part of the brain implicated in addiction, attention and reward processes. However, in a brain prenatally exposed to alcohol, the effect of the endocannabinoids is reduced due to a decreased function of endocannabinoid receptors. As a result, the excitatory synapses lose the ability to be weakened and continue to strengthen, which Shen believes is a critical brain mechanism for increased addiction risk.


  7. Generous people live happier lives

    July 20, 2017 by Ashley

    From the University of Zürich press release:

    Generosity makes people happier, even if they are only a little generous. People who act solely out of self-interest are less happy. Merely promising to be more generous is enough to trigger a change in our brains that makes us happier. This is what UZH neuroeconomists found in a recent study.

    What some have been aware of for a long time, others find hard to believe: Those who are concerned about the well-being of their fellow human beings are happier than those who focus only on their own advancement. Doing something nice for another person gives many people a pleasant feeling that behavioral economists call a warm glow. In collaboration with international researchers, Philippe Tobler and Ernst Fehr from the Department of Economics at the University of Zurich investigated how brain areas communicate to produce this feeling. The results provide insight into the interplay between altruism and happiness.

    Even a little generosity makes people happier

    In their experiments, the researchers found that people who behaved generously were happier afterwards than those who behaved more selfishly. However, the amount of generosity did not influence the increase in contentment. “You don’t need to become a self-sacrificing martyr to feel happier. Just being a little more generous will suffice,” says Philippe Tobler.

    Before the experiment started, some of the study participants had verbally committed to behaving generously towards other people. This group was willing to accept higher costs in order to do something nice for someone else. They also considered themselves happier after their generous behavior (but not beforehand) than the control group, who had committed to behaving generously toward themselves.

    Intent alone suffices to cause neural changes

    While the study participants were making their decision to behave or not to behave generously, the researchers examined activity in three areas of the participants’ brains: in the temporoparietal junction (where prosocial behavior and generosity are processed), in the ventral striatum (which is associated with happiness), and in the orbitofrontal cortex (where we weigh the pros and cons during decision-making processes). These three brain areas interacted differently, depending on whether the study participants had committed to generosity or selfishness.

    Simply promising to behave generously activated the altruistic area of the brain and intensified the interaction between this area and the area associated with happiness. “It is remarkable that intent alone generates a neural change before the action is actually implemented,” says Tobler.

    Benefit from the promise to behave generously

    “Promising to behave generously could be used as a strategy to reinforce the desired behavior, on the one hand, and to feel happier, on the other,” says Tobler. His co-author Soyoung Park adds: “There are still some open questions, such as: Can communication between these brain regions be trained and strengthened? If so, how? And, does the effect last when it is used deliberately, that is, if a person only behaves generously in order to feel happier?”

    About the experiment

    At the beginning of the experiment, the 50 participants were promised a sum of money that they would receive in the next few weeks and were supposed to spend. Half of the study participants committed to spending the money on someone they knew (experimental group, promise of generosity), while the other half committed to spending the money on themselves (control group).

    Subsequently, all of the study participants made a series of decisions concerning generous behavior, namely, whether to give somebody close to them a gift of money. The size of the gift and its cost varied: one could, for example, give the other person five francs at a cost of two francs, or twenty francs at a cost of fifteen. While the participants were making these decisions, the researchers measured activity in the three brain areas described above. The participants were asked about their happiness before and after the experiment.
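    The trade-off in each trial can be made concrete with a toy sketch. The two (gift, cost) pairs come from the press release; the threshold decision rule and the `generosity` weight are purely illustrative assumptions, not the study's model.

    ```python
    def accepts(gift: int, cost: int, generosity: float) -> bool:
        """Toy decision rule: accept the trial if the other person's
        gain, weighted by one's generosity (0..1), outweighs the
        personal cost. Illustrative only, not the study's analysis."""
        return generosity * gift >= cost

    # (gift, cost) pairs in francs, taken from the article's examples
    trials = [(5, 2), (20, 15)]
    for gift, cost in trials:
        decision = accepts(gift, cost, generosity=0.6)
        print(f"give {gift} at cost {cost}: {'accept' if decision else 'decline'}")
    ```

    With a weight of 0.6, this hypothetical decider accepts the cheap gift (5 francs at a cost of 2) but declines the expensive one (20 at a cost of 15), illustrating how the varying gift/cost ratios separate more and less generous choices.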


  8. Coffee bubble phobia may stem from deep-seated aversion to parasites

    by Ashley

    From the University of Kent press release:

    Some people experience intense aversion and anxiety when they see clusters of roughly circular shapes, such as the bubbles in a cup of coffee or the holes in a sponge.

    Now psychologists at the University of Kent have found that the condition — known as trypophobia — may be an exaggerated response linked to deep-seated anxiety about parasites and infectious disease.

    Previous explanations for the condition include the suggestion that people are evolutionarily predisposed to respond to clusters of round shapes because these shapes are also found on poisonous animals, like some snakes and the blue-ringed octopus.

    Now new research, led by Tom Kupfer of the University’s School of Psychology, suggests that the condition may instead be related to an evolutionary history of infectious disease and parasitism that leads to an exaggerated sensitivity to round shapes.

    The team noted that many infectious diseases result in clusters of round shapes on the skin: smallpox, measles, rubella, typhus, scarlet fever and so on. Similarly, many ectoparasites, such as scabies mites, ticks and botflies, also lead to clusters of round shapes on the skin.

    The study recruited over 300 participants from trypophobia support groups, along with a comparison group of around 300 university students without trypophobia. Both groups were invited to view sixteen cluster images, all depicting real objects. Eight were pictures of clusters relating to diseased body parts (e.g. circular rash marks on a chest; smallpox scars on a hand; a cluster of ticks).

    The other eight cluster images had no disease-relevant properties (e.g. drilled holes in a brick wall; a lotus flower seed pod). Both groups of participants reported finding the disease-relevant cluster images unpleasant to look at. But whereas the university students didn’t find the disease-irrelevant cluster images unpleasant, the trypophobic group found them extremely unpleasant.

    This finding supports the suggestion that individuals with trypophobia experience an overgeneralised response, to the extent that even an image of bubbles on a cup of coffee can trigger aversion in the same way as a cluster of ticks or lesions.

    Much previous research has shown that the function of the emotion disgust is to motivate people to avoid sources of potential infection, so the researchers predicted that unlike most phobias (e.g., snakes, heights, dogs) which mainly involve intense fear, people with trypophobia would predominantly experience intense disgust.

    They asked the individuals with trypophobia to describe their feelings when looking at cluster images. Analysis of these responses revealed that the majority of individuals with trypophobia experienced disgust or disgust-related feelings like nausea or the urge to vomit, even towards the disease-irrelevant cluster images like a sponge or bubbles. Only a small proportion described feeling fear or fear-related feelings.

    In addition to disgust, trypophobic individuals frequently reported feelings like skin itching, skin crawling or even the sensation of ‘bugs infesting the skin’. This skin response suggests that people with trypophobia may perceive cluster stimuli as if they are cues to ectoparasites, even leading some to feel as if they are infested.

    Overall, the findings showed that although trypophobia has been described as the ‘fear of holes’, it would be more accurately characterised as a predominantly disgust-based aversion to clusters of roughly circular objects.


  9. Brains evolved to need exercise

    by Ashley

    From the University of Arizona press release:

    Mounting scientific evidence shows that exercise is good not only for our bodies, but for our brains. Yet, exactly why physical activity benefits the brain is not well understood.

    In a new article published in the journal Trends in Neurosciences, University of Arizona researchers suggest that the link between exercise and the brain is a product of our evolutionary history and our past as hunter-gatherers.

    UA anthropologist David Raichlen and UA psychologist Gene Alexander, who together run a research program on exercise and the brain, propose an “adaptive capacity model” for understanding, from an evolutionary neuroscience perspective, how physical activity impacts brain structure and function.

    Their argument: as humans transitioned from a relatively sedentary apelike existence to a more physically demanding hunter-gatherer lifestyle, starting around 2 million years ago, we began to engage in complex foraging tasks that were simultaneously physically and mentally demanding, which may explain how physical activity and the brain came to be so connected.

    “We think our physiology evolved to respond to those increases in physical activity levels, and those physiological adaptations go from your bones and your muscles, apparently all the way to your brain,” said Raichlen, an associate professor in the UA School of Anthropology in the College of Social and Behavioral Sciences.

    “It’s very odd to think that moving your body should affect your brain in this way — that exercise should have some beneficial impact on brain structure and function — but if you start thinking about it from an evolutionary perspective, you can start to piece together why that system would adaptively respond to exercise challenges and stresses,” he said.

    Having this underlying understanding of the exercise-brain connection could help researchers come up with ways to enhance the benefits of exercise even further, and to develop effective interventions for age-related cognitive decline or even neurodegenerative diseases such as Alzheimer’s.

    Notably, the parts of the brain most taxed during a complex activity such as foraging — areas that play a key role in memory and executive functions such as problem solving and planning — are the same areas that seem to benefit from exercise in studies.

    “Foraging is an incredibly complex cognitive behavior,” Raichlen said. “You’re moving on a landscape, you’re using memory not only to know where to go but also to navigate your way back, you’re paying attention to your surroundings. You’re multitasking the entire time because you’re making decisions while you’re paying attention to the environment, while you are also monitoring your motor systems over complex terrain. Putting all that together creates a very complex multitasking effort.”

    The adaptive capacity model could help explain research findings such as those published by Raichlen and Alexander last year showing that runners’ brains appear to be more connected than brains of non-runners.

    The model also could help inform interventions for the cognitive decline that often accompanies aging — in a period in life when physical activity levels tend to decline as well.

    “What we’re proposing is, if you’re not sufficiently engaged in this kind of cognitively challenging aerobic activity, then this may be responsible for what we often see as healthy brain aging, where people start to show some diminished cognitive abilities,” said Alexander, a UA professor of psychology, psychiatry, neuroscience and physiological sciences. “So the natural aging process might really be part of a reduced capacity in response to not being engaged enough.”

    Reduced capacity refers to what can happen in organ systems throughout the body when they are deprived of exercise.

    “Our organ systems adapt to the stresses they undergo,” said Raichlen, an avid runner and expert on running. “For example, if you engage in exercise, your cardiovascular system has to adapt to expand capacity, be it through enlarging your heart or increasing your vasculature, and that takes energy. So if you’re not challenging it in that way — if you’re not engaging in aerobic exercise — to save energy, your body simply reduces that capacity.”

    In the case of the brain, if it is not being stressed enough it may begin to atrophy. This may be especially concerning, considering how much more sedentary humans’ lifestyles have become.

    “Our evolutionary history suggests that we are, fundamentally, cognitively engaged endurance athletes, and that if we don’t remain active we’re going to have this loss of capacity in response to that,” said Alexander, who studies brain aging and Alzheimer’s disease as a member of the UA’s Evelyn F. McKnight Brain Institute. “So there really may be a mismatch between our relatively sedentary lifestyles of today and how we evolved.”

    Alexander and Raichlen say future research should look at how different levels of exercise intensity, as well as different types of exercise, or exercise paired specifically with cognitive tasks, affect the brain.

    For example, exercising in a novel environment that poses a new mental challenge may prove to be especially beneficial, Raichlen said.

    “Most of the research in this area puts people in a cognitively impoverished environment. They put people in a lab and have them run on a treadmill or exercise bike, and you don’t really have to do as much, so it’s possible that we’re missing something by not increasing novelty,” he said.

    Alexander and Raichlen say they hope the adaptive capacity model will help advance research on exercise and the brain.

    “This evolutionary neuroscience perspective is something that’s been generally lacking in the field,” Alexander said. “And we think this might be helpful to advance research and help develop some new specific hypotheses and ways to identify more universally effective interventions that could be helpful to everyone.”


  10. Higher IQ in childhood is linked to a longer life

    by Ashley

    From the BMJ press release:

    Higher intelligence (IQ) in childhood is associated with a lower lifetime risk of major causes of death, including heart disease, stroke, smoking related cancers, respiratory disease and dementia, finds a study published by The BMJ today.

    It is the largest study to date reporting causes of death in men and women across the life course, and the findings suggest that lifestyle, especially tobacco smoking, accounts for an important part of the association between intelligence and mortality.

    Previous studies have shown that, on average, individuals with higher IQs tend to live a little longer than those with lower IQs, but these are largely based on data from male conscripts followed up only to middle adulthood.

    So a team of researchers from the University of Edinburgh set out to examine the association between intelligence test scores measured at age 11 and leading causes of death in men and women up to age 79.

    Their findings are based on data from 33,536 men and 32,229 women born in Scotland in 1936, who took a validated childhood intelligence test at age 11, and who could be linked to cause of death data up to December 2015.

    Cause of death included coronary heart disease, stroke, specific cancers, respiratory disease, digestive disease, external causes (including suicide and death from injury), and dementia.

    After taking account of several factors (confounders) that could have influenced the results, such as age, sex and socioeconomic status, the researchers found that higher childhood intelligence was associated with a lower risk of death until age 79.

    For example, a higher test score was associated with a 28% reduced risk of death from respiratory disease, a 25% reduced risk of death from coronary heart disease, and a 24% reduced risk of death from stroke.

    Other notable associations were seen for deaths from injury, smoking related cancers (particularly lung and stomach), digestive disease, and dementia. There was no evident association between childhood intelligence and death from cancers not related to smoking.
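    Studies like this one typically report effects as hazard ratios, which press releases then translate into “percent reduced risk.” As a minimal illustrative sketch (the conversion is standard, but the per-unit scale of the test score is not stated here, so no scale is assumed), the figures quoted above map to hazard ratios as follows:

    ```python
    # Sketch: the relationship between a hazard ratio (HR) and the
    # "percent reduced risk" phrasing used in the press release.
    # Risk reduction (%) = (1 - HR) * 100, and vice versa.

    def reduction_to_hazard_ratio(reduction_pct: float) -> float:
        """E.g. a 28% reduced risk corresponds to a hazard ratio of 0.72."""
        return 1.0 - reduction_pct / 100.0

    def hazard_ratio_to_reduction(hr: float) -> float:
        """E.g. a hazard ratio of 0.75 corresponds to a 25% reduced risk."""
        return (1.0 - hr) * 100.0

    # Figures quoted in the press release: respiratory disease,
    # coronary heart disease, stroke.
    for pct in (28, 25, 24):
        hr = reduction_to_hazard_ratio(pct)
        print(f"{pct}% reduced risk -> hazard ratio {hr:.2f}")
    ```

    Note that these are relative risks: a 28% reduction scales each person’s baseline hazard, so the absolute difference in deaths depends on how common the cause of death is in the first place.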

    The authors outline some study limitations that could have introduced bias. However, key strengths include the whole-population sample, the 68-year follow-up, and the ability to adjust for important confounders.

    They also point out that significant associations remained after further adjustment for smoking and socioeconomic status, suggesting that these factors did not fully account for mortality differences. And they say future studies “would benefit from measures of the cumulative load of such risk factors over the life course.”

    This study is the largest to date reporting causes of death across the life course, and it provides interesting results, say researchers based in Sweden in a linked editorial.

    “Importantly, it shows that childhood IQ is strongly associated with causes of death that are, to a great extent, dependent on already known risk factors,” they write. And they suggest that “tobacco smoking and its distribution along the socioeconomic spectrum could be of particular importance here.”

    In conclusion, they say: “It remains to be seen if this is the full story or if IQ signals something deeper, and possibly genetic, in its relation to longevity.”