1. Listening to happy music may enhance divergent creativity

    September 24, 2017 by Ashley

    From the PLOS press release:

    Listening to happy music may help generate a greater number of innovative solutions compared to listening to silence, according to a study published September 6, 2017 in the open-access journal PLOS ONE by Simone Ritter from Radboud University, The Netherlands and Sam Ferguson from the University of Technology Sydney, Australia.

    Creativity is an important quality in our complex, fast-changing world, as it allows us to generate innovative solutions for a wide range of problems and come up with fresh ideas. The question of what facilitates creative cognition has long been studied, and while music has previously been shown to benefit cognition, little is known about how listening to music affects creative cognition specifically.

    To investigate the effect of music on creative cognition, researchers had 155 participants complete questionnaires and split them into experimental groups. Each group listened to one of four different types of music that were categorized as calm, happy, sad, or anxious, depending on their emotional valence (positive, negative) and arousal (high, low), while one control group listened to silence. After the music started playing, participants performed various cognitive tasks that tested their divergent and convergent creative thinking. Participants who came up with the most original and useful solutions to a task scored higher in divergent creativity, while participants who came up with the single best possible solution to a task scored higher in convergent creativity.

    The researchers found that listening to happy music, which they define as classical music that is positive in valence and high in arousal, facilitates more divergent creative thinking than silence does. The authors suggest that the variables involved in the happy-music condition may enhance flexibility in thinking, so that participants consider additional solutions that might not have occurred to them as readily had they performed the task in silence.

    This study shows that creative cognition may be enhanced through music, and further research could explore how different ambient sounds might affect creativity and include participants of diverse cultures, age groups, and levels of music experience. The authors suggest that their study may also demonstrate that music listening could promote creative thinking in inexpensive and efficient ways in various scientific, educational and organizational settings.


  2. Study suggests the bilingual brain calculates differently depending on the language used

    September 22, 2017 by Ashley

    From the University of Luxembourg press release:

    People can intuitively recognise small numbers up to four; when calculating, however, they depend on the assistance of language. This raises a fascinating research question: how do multilingual people solve arithmetical tasks presented to them in different languages of which they have a very good command? The question will gain in importance in the future, as an increasingly globalised job market and accelerated migration mean that ever more people seek work and study outside the linguistic area of their home countries.

    This question was investigated by a research team led by Dr Amandine Van Rinsveld and Professor Dr Christine Schiltz from the Cognitive Science and Assessment Institute (COSA) at the University of Luxembourg. For the purpose of the study, the researchers recruited subjects with Luxembourgish as their mother tongue who had successfully completed their schooling in the Grand Duchy of Luxembourg and continued their academic studies at francophone universities in Belgium. The study subjects thus had an excellent command of both German and French. As students in Luxembourg, they had taken maths classes in German in primary school and in French in secondary school.

    In two separate test situations, the study participants had to solve both very simple and somewhat more complex addition tasks, in German and in French. The tests showed that the subjects solved simple addition tasks equally well in both languages. For complex addition in French, however, they required more time than for identical tasks in German, and they made more errors.

    During the tests, functional magnetic resonance imaging (fMRI) was used to measure the subjects’ brain activity. This showed that different brain regions were activated depending on the language used. Addition tasks in German activated a small speech region in the left temporal lobe. When solving complex calculation tasks in French, additional parts of the brain responsible for processing visual information were involved: the subjects fell back on figurative thinking. The experiments provide no evidence that the subjects translated the tasks from French into German in order to solve them. While the test subjects could solve German tasks using the classic, familiar numerical-verbal brain areas, this system proved insufficient in the second language of instruction, in this case French. To solve the arithmetic tasks in French, the test subjects had to systematically fall back on other thought processes not previously observed in monolingual persons.

    The study documents for the first time, with the help of brain activity measurements and imaging techniques, the demonstrable cognitive “extra effort” required for solving arithmetic tasks in the second language of instruction. The research results clearly show that calculatory processes are directly affected by language.


  3. Study suggests another gene that may significantly influence development of dementia and Alzheimer’s

    September 21, 2017 by Ashley

    From the University of Southern California press release:

    The notorious genetic marker of Alzheimer’s disease and other forms of dementia, ApoE4, may not be a lone wolf.

    Researchers from USC and the University of Manchester have found that another gene, TOMM40, complicates the picture. Although ApoE4 plays a greater role in some types of aging-related memory ability, TOMM40 may pose an even greater risk for other types.

    TOMM40 and APOE are neighboring genes, adjacent to each other on chromosome 19, and they are sometimes used as proxies for one another in genetic studies. Scientific research has at times focused chiefly on one APOE variant, ApoE4, as the No. 1 suspect behind Alzheimer’s and dementia-related memory decline. The literature also considers the more common variant of APOE, ApoE3, to be neutral in risk for Alzheimer’s disease.

    USC researchers believe their new findings raise a significant research question: Has TOMM40 been misunderstood as a sidekick to ApoE4 when it is really a mastermind, particularly when ApoE3 is present?

    “Typically, ApoE4 has been considered the strongest known genetic risk factor for cognitive decline, memory decline, Alzheimer’s disease or dementia-related onset,” said T. Em Arpawong, the study’s lead author and a post-doctoral fellow in the USC Dornsife College of Letters, Arts and Sciences Department of Psychology. “Although prior studies have found some variants of this other gene TOMM40 may heighten the risk for Alzheimer’s disease, our study found that a TOMM40 variant was actually more influential than ApoE4 on the decline in immediate memory – the ability to hold onto new information.”

    Studies have shown that the influence of genes associated with memory and cognitive decline intensifies with age. That is why the scientists chose to examine immediate and delayed verbal test results over time in conjunction with genetic markers.

    “An example of immediate recall is someone tells you a series of directions to get somewhere and you’re able to repeat them back,” explained Carol A. Prescott, the paper’s senior author who is a professor of psychology at USC Dornsife College and professor of gerontology at the USC Davis School of Gerontology. “Delayed recall is being able to remember those directions a few minutes later, as you’re on your way.”

    The study was published in the journal PLOS ONE on Aug. 11.

    Prescott and Arpawong are among the more than 70 researchers at USC who are dedicated to the prevention, treatment and potential cure of Alzheimer’s disease. The memory-erasing illness is one of the greatest health challenges of the century, affecting 1 in 3 seniors and costing $236 billion a year in health care services. USC researchers across a range of disciplines are examining the health, societal and political effects and implications of the disease.

    In the past decade, the National Institute on Aging has nearly doubled its investment in USC research. The investments include an Alzheimer Disease Research Center.

    Tracking memory loss

    For the study, the team of researchers from USC and The University of Manchester utilized data from two surveys: the U.S. Health and Retirement Study and the English Longitudinal Study of Ageing. Both data sets are nationally representative samples and include results of verbal memory testing and genetic testing.

    The research team used verbal test results from the U.S. Health and Retirement Study, collected from 1996 to 2012 through phone interviews conducted every two years. The researchers utilized the verbal memory test scores of 20,650 participants aged 50 and older, who were tested repeatedly to study how their memory changed over time.

    To test immediate recall, an interviewer read a list of 10 nouns and then asked the participant to repeat the words back immediately. For delayed recall, the interviewer waited five minutes and then asked the participant to recall the list. Test scores ranged from 0 to 10.

    The average score for immediate recall was 5.7 words out of 10, and the delayed recall scoring average was 4.5 words out of 10. A large gap between the two sets of scores can signal the development of Alzheimer’s or some other form of dementia.

    “There is usually a drop-off in scores between the immediate and the delayed recall tests,” Prescott said. “In evaluating memory decline, it is important to look at both types of memory and the difference between them. You would be more worried about a person who has scores of 10 and 5 than a person with scores of 6 and 4.”

    The first person is worrisome because five minutes after reciting the 10 words perfectly, he or she can recall only half of them, Prescott said. The other person wasn’t perfect on the immediate recall test, but five minutes later, was able to remember a greater proportion of words.
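    Prescott’s comparison can be made concrete with a quick calculation: what matters is not the raw scores alone but the proportion of immediately recalled words retained after the delay. A minimal sketch, using the two hypothetical score pairs from the quote above:

```python
def retention(immediate: int, delayed: int) -> float:
    """Fraction of immediately recalled words still recalled after the delay."""
    return delayed / immediate

# The two hypothetical test-takers from Prescott's example (scores out of 10):
person_a = retention(10, 5)  # perfect immediate recall, then forgets half
person_b = retention(6, 4)   # weaker immediate recall, but retains more

print(f"Person A retains {person_a:.0%} of recalled words")  # 50%
print(f"Person B retains {person_b:.0%} of recalled words")  # 67%
# The larger drop-off (Person A) is the more worrisome pattern.
```

    The absolute scores favor the first person, but the retention ratio, which tracks the drop-off the researchers care about, favors the second.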

    To prevent bias in the study’s results, the researchers excluded participants who reported that they had received a likely diagnosis of dementia or a dementia-like condition, such as Alzheimer’s. They also focused on participants identified as primarily European in heritage to minimize population bias. Results were adjusted for age and sex.

    The researchers compared the U.S. data to the results of an independent replication sample of participants, age 50 and up, in the English Longitudinal Study of Ageing from 2002 to 2012. Interviews and tests were conducted every two years.

    Genetic markers of dementia

    To investigate which genes are associated with immediate and delayed recall abilities, researchers utilized genetic data from 7,486 participants in the U.S. Health and Retirement Study and 6,898 participants in the English Longitudinal Study of Ageing.

    The researchers examined the association between the immediate and delayed recall results with 1.2 million gene variations across the human genome. Only one, TOMM40, had a strong link to declines in immediate recall and level of delayed recall. ApoE4 also was linked but not as strongly.

    “Our findings indicate that TOMM40 plays a larger role, specifically, in the decline of verbal learning after age 60,” the scientists wrote. “Further, our analyses showed that there are unique effects of TOMM40 beyond ApoE4 effects on both the level of delayed recall prior to age 60 and decline in immediate recall after 60.”

    Unlike ApoE4, the ApoE3 variant is generally thought to have no influence on Alzheimer’s disease or memory decline. However, the team of scientists found that adults who had ApoE3 and a risk variant of TOMM40 were more likely to have lower memory scores. The finding suggests that TOMM40 affects memory – even when ApoE4 is not a factor.

    The team suggested that scientists should further examine the association between ApoE3 and TOMM40 variants and their combined influence on decline in different types of learning and memory.

    “Other studies may not have detected the effects of TOMM40,” Prescott said. “The results from this study provide more evidence that the causes of memory decline are even more complicated than we thought before, and they raise the question of how many findings in other studies have been attributed to ApoE4 that may be due to TOMM40 or a combination of TOMM40 and ApoE4.”


  4. Study suggests that expressive writing can help worriers perform a stressful task more efficiently

    by Ashley

    From the Michigan State University press release:

    Chronic worriers, take note: Simply writing about your feelings may help you perform an upcoming stressful task more efficiently, finds a Michigan State University study that measured participants’ brain activity.

    The research, funded by the National Science Foundation and National Institutes of Health, provides the first neural evidence for the benefits of expressive writing, said lead author Hans Schroder, an MSU doctoral student in psychology and a clinical intern at Harvard Medical School’s McLean Hospital.

    “Worrying takes up cognitive resources; it’s kind of like people who struggle with worry are constantly multitasking — they are doing one task and trying to monitor and suppress their worries at the same time,” Schroder said. “Our findings show that if you get these worries out of your head through expressive writing, those cognitive resources are freed up to work toward the task you’re completing and you become more efficient.”

    Schroder conducted the study at Michigan State with Jason Moser, associate professor of psychology and director of MSU’s Clinical Psychophysiology Lab, and Tim Moran, a Spartan graduate who’s now a research scientist at Emory University. The findings are published online in the journal Psychophysiology.

    For the study, college students identified as chronically anxious through a validated screening measure completed a computer-based “flanker task” that measured their response accuracy and reaction times. Before the task, about half of the participants wrote about their deepest thoughts and feelings about the upcoming task for eight minutes; the other half, in the control condition, wrote about what they did the day before.

    While the two groups performed at about the same level for speed and accuracy, the expressive-writing group performed the flanker task more efficiently, meaning they used fewer brain resources in the process, as measured with electroencephalography, or EEG.

    Moser uses a car analogy to describe the effect. “Here, worried college students who wrote about their worries were able to offload these worries and run more like a brand new Prius,” he said, “whereas the worried students who didn’t offload their worries ran more like a ’74 Impala – guzzling more brain gas to achieve the same outcomes on the task.”

    While much previous research has shown that expressive writing can help individuals process past traumas or stressful events, the current study suggests the same technique can help people — especially worriers — prepare for stressful tasks in the future.

    “Expressive writing makes the mind work less hard on upcoming stressful tasks, which is what worriers often get ‘burned out’ over, their worried minds working harder and hotter,” Moser said. “This technique takes the edge off their brains so they can perform the task with a ‘cooler head.’”


  5. Nutrition has benefits for brain network organization

    September 20, 2017 by Ashley

    From the University of Illinois at Urbana-Champaign press release:

    Nutrition has been linked to cognitive performance, but researchers have not pinpointed what underlies the connection. A new study by University of Illinois researchers found that monounsaturated fatty acids — a class of nutrients found in olive oils, nuts and avocados — are linked to general intelligence, and that this relationship is driven by the correlation between MUFAs and the organization of the brain’s attention network.

    The study of 99 healthy older adults, recruited through Carle Foundation Hospital in Urbana, compared patterns of fatty acid nutrients found in blood samples, functional MRI data that measured the efficiency of brain networks, and results of a general intelligence test. The study was published in the journal NeuroImage.

    “Our goal is to understand how nutrition might be used to support cognitive performance and to study the ways in which nutrition may influence the functional organization of the human brain,” said study leader Aron Barbey, a professor of psychology. “This is important because if we want to develop nutritional interventions that are effective at enhancing cognitive performance, we need to understand the ways that these nutrients influence brain function.”

    “In this study, we examined the relationship between groups of fatty acids and brain networks that underlie general intelligence. In doing so, we sought to understand if brain network organization mediated the relationship between fatty acids and general intelligence,” said Marta Zamroziewicz, a recent Ph.D. graduate of the neuroscience program at Illinois and lead author of the study.

    Studies suggesting cognitive benefits of the Mediterranean diet, which is rich in MUFAs, inspired the researchers to focus on this group of fatty acids. They examined nutrients in participants’ blood and found that the fatty acids clustered into two patterns: saturated fatty acids and MUFAs.

    “Historically, the approach has been to focus on individual nutrients. But we know that dietary intake doesn’t depend on any one specific nutrient; rather, it reflects broader dietary patterns,” said Barbey, who also is affiliated with the Beckman Institute for Advanced Science and Technology at Illinois.

    The researchers found that general intelligence was associated with the brain’s dorsal attention network, which plays a central role in attention-demanding tasks and everyday problem solving. In particular, the researchers found that general intelligence was associated with how efficiently the dorsal attention network is functionally organized, using a measure called small-world propensity, which describes how well the neural network is connected within locally clustered regions as well as across globally integrated systems.

    In turn, they found that those with higher levels of MUFAs in their blood had greater small-world propensity in their dorsal attention network. Taken together with an observed correlation between higher levels of MUFAs and greater general intelligence, these findings suggest a pathway by which MUFAs affect cognition.

    “Our findings provide novel evidence that MUFAs are related to a very specific brain network, the dorsal attentional network, and how optimal this network is functionally organized,” Barbey said. “Our results suggest that if we want to understand the relationship between MUFAs and general intelligence, we need to take the dorsal attention network into account. It’s part of the underlying mechanism that contributes to their relationship.”

    Barbey hopes these findings will guide further research into how nutrition affects cognition and intelligence. In particular, the next step is to run an interventional study over time to see whether long-term MUFA intake influences brain network organization and intelligence.

    “Our ability to relate those beneficial cognitive effects to specific properties of brain networks is exciting,” Barbey said. “This gives us evidence of the mechanisms by which nutrition affects intelligence and motivates promising new directions for future research in nutritional cognitive neuroscience.”


  6. Link found between cognitive fatigue and effort and reward

    September 19, 2017 by Ashley

    From the Kessler Foundation press release:

    Kessler Foundation researchers have authored a new article that has implications for our understanding of the relationship between cognitive fatigue and effort and reward. The study, which was conducted in healthy participants, broadens our understanding of disease entities that are associated with a lower threshold for cognitive fatigue, such as multiple sclerosis, brain injury, stroke and Parkinson disease. The article, “The relationship between outcome prediction and cognitive fatigue: a convergence of paradigms,” was epublished ahead of print on May 25, 2017, in Cognitive, Affective, & Behavioral Neuroscience. The authors are Glenn Wylie, DPhil, Helen Genova, PhD, John DeLuca, PhD, and Ekaterina Dobryakova, PhD, of Kessler Foundation.

    Injury and disease of the brain increase the likelihood of cognitive fatigue, which can be disabling. Researchers are studying the mechanisms of cognitive fatigue, toward the goal of developing effective interventions. “In this study, we focused on the activity of the anterior cingulate cortex, which has been shown by others to be related to error processing, and which we have shown to be associated with fatigue,” said Dr. Wylie, who is associate director of Neuroscience Research and the Rocco Ortenzio Neuroimaging Center at Kessler Foundation. “We challenged participants with difficult tasks of working memory, and assessed which parts of the anterior cingulate cortex were associated with error processing,” he explained. “We then investigated whether exactly the same areas of the anterior cingulate cortex were also associated with fatigue. They were, suggesting that cognitive fatigue may be the brain’s way of signalling to itself that the effort required for the task no longer merits the rewards received.”


  7. ‘Waves’ of neural activity give new clues about Alzheimer’s

    September 18, 2017 by Ashley

    From the SINC press release:

    While we are unconscious during deep sleep, the activity of millions of neurons travels across the cerebral cortex. This phenomenon, known as slow waves, is related to the consolidation of memory. The European project SloW Dyn, led by Spanish scientists, has now revealed anomalies in this activity in mice displaying a decline similar to Alzheimer’s.

    During deep sleep, large populations of neurons in the cerebral cortex and subcortical brain structures simultaneously discharge electrical pulses. These slow oscillations travel as ‘waves’ of neural activity from one point of the cortex to another once every one to four seconds.

    “This global rhythmic activity, controlled by the cerebral cortex, is associated with a lack of consciousness,” says Mavi Sanchez-Vives, director of the Neuroscience Systems group at the August Pi i Sunyer Biomedical Research Institute (IDIBAPS, Barcelona), whose research team has suggested that it is the default activity of the cortical circuits.

    These oscillations consolidate memory and synaptic plasticity and maintain metabolic and cellular function, among other roles. Within the framework of the European SloW Dyn (Slow Wave Dynamics) project, which the neuroscientist leads, researchers have now discovered differences in this brain activity between healthy mice and mice with cognitive decline similar to Alzheimer’s due to premature aging.

    “We detected a decrease in the frequency of the oscillations, which were also more irregular and had less high-frequency (15 to 100 hertz) content,” points out Sanchez-Vives, also of the Catalan Institution for Research and Advanced Studies (ICREA).

    The study, published in the journal Frontiers in Aging Neuroscience, highlights that some of these changes have also been registered in patients with Alzheimer’s disease, which is why, according to the authors, the animal model could help in studying the disease.

    Cause or effect of diseases

    The relationship between slow oscillations and neurodegenerative diseases is twofold. When there are pathologies that disturb cortical circuits, they are often reflected in the disruption of slow waves. “We are studying what those changes tell us about the altered underlying mechanisms,” says the researcher.

    Furthermore, the wave alterations will likely be associated with sleep problems, which may influence the development of a disease. “For example, if slow wave sleep periods are disrupted, cognitive functions such as attention and memory can be negatively affected,” Sanchez-Vives notes.

    In order to measure these oscillations, scientists use EEG recordings of a person’s brain activity during sleep. Over the course of the SloW Dyn project, experts will measure the waves of thousands of people and ascertain how they change with age. For this purpose they have developed two tools: an instrument that registers brain activity and an app.

    “This will provide massive information about the composition of sleep, the synchronization of brain activity and the anomalies that can occur as a result of aging or specific pathologies,” highlights the scientist. Researchers hope that these records will also give them clues about the therapeutic potential of restoring slow waves when they are impaired.

    Disconnecting consciousness

    SloW Dyn has been given over 660,000 euros in funding and will last 36 months. At present, the international consortium is midway through this period. One of the ultimate objectives is to develop a model that mathematically describes these oscillations, making it possible to generate predictions.

    “We are trying to understand a phenomenon which, although seemingly very simple, has the power to disconnect consciousness,” summarises Sanchez-Vives.

    The Pompeu Fabra University (Barcelona), the Italian Institute of Technology, the University of Chicago (USA), the National Centre for Scientific Research (France) and the company Rythm (France) are also participating in the project led by IDIBAPS.

    Within Horizon 2020, the framework programme for funding research in the European Union, SloW Dyn is part of the Human Brain Project, one of the Flagship Future and Emerging Technology Research Initiatives (FET Flagships).


  8. Sleep may influence an eyewitness’s ability to identify guilty person

    by Ashley

    From the Michigan State University press release:

    Sleep may influence an eyewitness’s ability to correctly pick a guilty person out of a police lineup, indicates a study by Michigan State University researchers.

    Published in PLOS ONE, the research found that eyewitnesses to a crime who sleep before being given a lineup are much less likely to pick an innocent person out of a lineup — at least when the perpetrator is not in the lineup.

    Some 70 percent of wrongful convictions in the United States are related to false eyewitness accounts. This study is the first scientific investigation into how sleep affects eyewitness memory of a crime, said lead author Michelle Stepan, a doctoral student in psychology.

    “It’s concerning that more people aren’t making the correct decision during lineups; this suggests our memories are not super accurate and that’s a problem when you’re dealing with the consequences of the criminal justice system,” Stepan said. “Putting someone in jail is a big decision based on somebody’s memory of a crime.”

    Stepan and Kimberly Fenn, associate professor of psychology and director of MSU’s Sleep and Learning Lab, conducted an experiment in which about 200 participants watched a video of a crime (a man planting a bomb on a rooftop) and then, 12 hours later, viewed one of two computer lineups of six similar-looking people. One lineup included the perpetrator; the other lineup did not.

    Some participants watched the crime video in the morning and viewed a lineup that night, with no sleep in between. Others watched the crime video at night and viewed a lineup the next morning, after sleeping.

    When the perpetrator was not in the lineup, participants who had slept identified an innocent person 42 percent of the time — compared to 66 percent for participants who had not slept.

    “This is the most interesting finding of the study — that individuals are less likely to choose an innocent suspect after a period of sleep when the perpetrator is absent from the lineup,” Fenn said. This is relevant, she added, because false convictions too often stem from an incorrect eyewitness identification of a suspect who did not commit the crime.

    When the perpetrator was in the lineup, there was essentially no difference between the sleep and no-sleep groups’ ability to choose the guilty man. Both groups correctly identified the perpetrator about 50 percent of the time.

    “In other words,” Fenn said, “sleep may not help you get the right guy, but it may help you keep an innocent individual out of jail.”

    The results could reflect both changes in memory strength and decision-making strategies after sleep.

    The researchers believe participants who slept were more likely to use an “absolute strategy,” in which they compare each person in the lineup to their memory of the suspect, while participants who didn’t sleep were more likely to use a “relative strategy,” in which they compare the people in the lineup to each other to determine who most resembles the perpetrator relative to the others.

    Using a relative strategy is believed to increase false identifications relative to an absolute strategy in perpetrator-absent lineups, Stepan said.
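    The link between strategy and false identification can be illustrated with a toy sketch (the similarity values and threshold below are hypothetical, not from the study): under an absolute strategy, a lineup with no strong match can be rejected outright, while a relative strategy always names whoever happens to look most like the perpetrator.

```python
THRESHOLD = 0.8  # assumed cutoff for "that's him" under the absolute strategy

def absolute_strategy(similarities, threshold=THRESHOLD):
    """Pick the best match only if it exceeds the threshold; otherwise reject the lineup."""
    best = max(range(len(similarities)), key=lambda i: similarities[i])
    return best if similarities[best] >= threshold else None

def relative_strategy(similarities):
    """Always pick whoever most resembles the perpetrator relative to the others."""
    return max(range(len(similarities)), key=lambda i: similarities[i])

# Perpetrator-absent lineup: six innocent faces, none a strong match to memory.
lineup = [0.3, 0.5, 0.4, 0.6, 0.2, 0.5]

print(absolute_strategy(lineup))  # None -> correctly rejects the lineup
print(relative_strategy(lineup))  # 3    -> falsely identifies an innocent person
```

    When the perpetrator is present, both rules tend to land on him, which is consistent with the study finding a sleep effect only in perpetrator-absent lineups.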

    “These findings tell us that sleep likely impacts memory processes but that it might also impact how people search through a lineup, and those search strategies might be a critical factor when the perpetrator is not in the lineup,” she said.

    Fenn noted that the key findings of the study have since been replicated.

    The MSU team is conducting research that further explores how sleep may directly or indirectly affect eyewitness memory.


  9. Brain’s ability to recognize faces is shaped through repeated exposure

    September 17, 2017 by Ashley

    From the Harvard Medical School press release:

    Scientists have long deemed the ability to recognize faces innate for people and other primates — something our brains just know how to do immediately from birth.

    However, the findings of a new Harvard Medical School study published Sept. 4 in the journal Nature Neuroscience cast doubt on this longstanding view.

    Working with macaques temporarily deprived of seeing faces while growing up, a Harvard Medical School team led by neurobiologists Margaret Livingstone, Michael Arcaro, and Peter Schade has found that regions of the brain that are key to facial recognition form only through experience and are absent in primates who don’t encounter faces while growing up.

    The finding, the researchers say, sheds light on a range of neurodevelopmental conditions, including those in which people can’t distinguish between different faces, as well as autism, which is marked by an aversion to looking at faces. Most importantly, however, the study underscores the critical formative role of early experience in normal sensory and cognitive development, the scientists say.

    Livingstone, the Takeda Professor of Neurobiology at Harvard Medical School, explains that macaques — a close evolutionary relative to humans, and a model system for studying human brain development — form clusters of neurons responsible for recognizing faces in an area of the brain called the superior temporal sulcus by 200 days of age. The relative locations of these brain regions, or patches, are similar across primate species.

    That knowledge, combined with the fact that infants seem to preferentially track faces early in development, led to the longstanding belief that facial recognition must be inborn, she said. However, both humans and primates also develop brain areas that respond to visual stimuli far too recent to have shaped our evolution, such as buildings and text. The latter observation puts a serious wrench in the theory that facial recognition is inborn.

    To better understand the basis for facial recognition, Livingstone, along with postdoctoral fellow Arcaro and research assistant Schade, raised two groups of macaques. The first, a control group, had a typical upbringing, spending time in early infancy with their mothers and then with other juvenile macaques, as well as with human handlers. The other group was raised by humans who bottle-fed them, played with them and cuddled them — all while wearing welding masks. For the first year of their lives, these macaques never saw a face — human or otherwise. At the end of the trial, all macaques were put in social groups with fellow macaques and allowed to see both human and primate faces.

    When both groups of macaques were 200 days old, the researchers used functional MRI to look at brain images measuring the presence of facial recognition patches and other specialized areas, such as those responsible for recognizing hands, objects, scenes and bodies.

    The macaques who had a typical upbringing had consistent “recognition” areas in their brains for each of these categories. Those who’d grown up never seeing faces had developed areas of the brain associated with all categories except faces.

    Next, the researchers showed both groups images of humans or primates. As expected, the control group preferentially gazed at the faces in those images. In contrast, the macaques raised without facial exposure looked preferentially at the hands. The hand domain in their brains, Livingstone said, was disproportionately large compared to the other domains.

    The findings suggest that sensory deprivation has a selective effect on the way the brain wires itself. The brain seems to become very good at recognizing things that an individual sees often, Livingstone said, and poor at recognizing things that it never or rarely sees.

    “What you look at is what you end up ‘installing’ in the brain’s machinery to be able to recognize,” she added.

    Normal development of these brain regions could be key to explaining a wide variety of disorders, the researchers said. One such disorder is developmental prosopagnosia — a condition in which people are born with the inability to recognize familiar faces, even their own, due to the failure of the brain’s facial recognition machinery to develop properly. Likewise, Livingstone said, some of the social deficits that develop in people with autism spectrum disorders may be a side effect stemming from the lack of experiences that involve looking at faces, which children with these disorders tend to avoid. The findings suggest that interventions to encourage early exposure to faces may assuage the social deficits that stem from lack of such experiences during early development, the team said.


  10. Study looks at biological origins of memory deficits in schizophrenia

    by Ashley

    From the Zuckerman Institute at Columbia University press release:

    A team of Columbia scientists has found that disruptions to the brain’s center for spatial navigation — its internal GPS — result in some of the severe memory deficits seen in schizophrenia. The new study in mouse models of the disorder marks the first time that schizophrenia’s effects have been observed in the behavior of living animals — and at the level of individual brain cells — with such high resolution, precision and clarity. The findings offer a promising entry point for attacking a near-universal and debilitating symptom of schizophrenia — memory deficits — which has thus far withstood all forms of treatment.

    The results of this study were published in Nature Neuroscience.

    “An almost intractably complex disorder, schizophrenia is nearly impossible to fully treat — in large part because it acts as two disorders in one,” said Joseph Gogos, MD, PhD, a principal investigator at Columbia’s Mortimer B. Zuckerman Mind Brain Behavior Institute and the paper’s co-senior author. “On one hand, you have paranoia, hallucinations and delusions; while on the other you have severe memory deficits. Antipsychotic drugs, which treat the first class of symptoms, are entirely ineffective when dealing with the second. The reasons for this are simple: we do not yet understand what happens in the brains of schizophrenia patients.”

    Cracking schizophrenia’s code must therefore start with deciphering its biological origins, says Dr. Gogos, who is also professor of physiology, cellular biophysics and neuroscience at Columbia University Medical Center (CUMC). This has led to a recent focus on the memory impairments that are so common among schizophrenia patients. In this new study, Dr. Gogos teamed up with Attila Losonczy, MD, PhD, a fellow Zuckerman Institute principal investigator, to investigate episodic memory, which is severely impaired in cases of schizophrenia.

    “Episodic memory is the brain’s repository of information about the past; a way of traveling backwards to recall a specific moment in time,” said Dr. Losonczy, who was also a senior author. “This type of memory is critical for learning about and functioning in everyday life.”

    For this study, the team focused on a brain region called CA1, located in the hippocampus, which plays a role in both navigation and in episodic memory. Physical alterations to CA1 have been previously reported among schizophrenia patients. CA1 is home to place cells, which collectively form internal maps in the brain critical for navigating one’s present surroundings. The CA1 place cells also encode the spatial aspects of episodic memories, such as where you were when you last saw your best friend, or the place your parents always kept the holiday decorations.

    “Recent advances in imaging technologies now give us the power to watch the activity of hundreds of place cells in the CA1 in real time while an animal forms and recalls memories,” said Dr. Losonczy, who is also an associate professor of neuroscience at CUMC. “We developed experiments to record CA1 activity in mice that were genetically modified to mimic schizophrenia, and compared them to normal, healthy mice.”

    The researchers placed both groups of animals on a treadmill under a high-resolution, two-photon microscope, where they were exposed to a variety of sights, sounds and smells (including a water reward placed at unmarked locations on the treadmill). These experiments were designed to test the animals’ ability to navigate a new environment, remember how to navigate a familiar one and adapt quickly when that environment was altered.

    The two groups of mice showed striking differences in behavior and in cell activity. While both groups could successfully navigate a new environment, the schizophrenia-like mice had more trouble remembering familiar environments from day to day, as well as adapting when aspects of that environment changed. By simultaneously tracking the animals’ place cells via the two-photon microscope, the team spotted the difference.

    “When the healthy mice approached something familiar, such as water, their place cells fired with increasing intensity, and then quieted down as the animals moved away,” explained Dr. Losonczy. “And when we moved the location of the water, and gave the animals a chance to relearn where it was, the activity of their place cells reflected the new location.”

    But the brains of the schizophrenia-like mice were different. Their place cells did not shift when the water reward was moved. The brain cells’ lack of adaptability, the scientists argue, could reflect a key and more general mechanism of memory deficits in schizophrenia. It could also represent a new target for drug intervention.

    “These studies are helping to build an understanding of a disorder that has remained a biological mystery,” said Dr. Gogos. “By pinpointing schizophrenia’s many causes, we are opening up multiple points of intervention to slow, halt and even prevent the disorder — which stands to dramatically improve the lives of patients and their families.”