1. Study suggests music and native language interact in the brain

    December 11, 2017 by Ashley

    From the University of Helsinki press release:

    The brain’s auditory system can be shaped by exposure to different auditory environments, such as native language and musical training.

    A recent doctoral study by Caitlin Dawson from the University of Helsinki focuses on the interacting effects of native language patterns and musical experience on early auditory processing of basic sound features. Methods included electrophysiological brainstem recordings as well as a set of behavioral auditory discrimination tasks.

    The auditory tasks were designed to find discrimination thresholds for intensity, frequency, and duration. A self-report questionnaire on musical sophistication was also used in the analyses.
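
    The release does not describe the threshold procedure itself, but discrimination thresholds of this kind are commonly estimated with an adaptive staircase. Below is a minimal sketch of one such procedure (a two-down/one-up rule); the run_trial callback, step sizes, and starting value are illustrative assumptions, not details from the study.

    ```python
    # Hypothetical 2-down/1-up adaptive staircase for estimating a discrimination
    # threshold (e.g., the smallest detectable difference in duration or frequency).
    # The stimulus difference shrinks after two consecutive correct responses and
    # grows after an error, converging near ~70.7% correct performance.

    def estimate_threshold(run_trial, start_diff=20.0, step=2.0,
                           min_diff=0.5, n_reversals=8):
        """run_trial(diff) -> True if the listener detected the difference."""
        diff, streak, direction = start_diff, 0, -1
        reversals = []
        while len(reversals) < n_reversals:
            if run_trial(diff):
                streak += 1
                if streak == 2:                       # two correct: make it harder
                    streak = 0
                    if direction == +1:
                        reversals.append(diff)
                    direction = -1
                    diff = max(min_diff, diff - step)
            else:                                     # one error: make it easier
                streak = 0
                if direction == -1:
                    reversals.append(diff)
                direction = +1
                diff += step
        return sum(reversals[-6:]) / 6                # average the last reversals
    ```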

    “We found that Finnish speakers showed an advantage in duration processing in the brainstem, compared to German speakers. The reason for this may be that the Finnish language includes long and short sounds that determine the meaning of words, which trains Finnish speakers’ brains to be very sensitive to the timing of sounds,” Dawson states.

    For Finnish speakers, musical expertise was associated with enhanced behavioral frequency discrimination. Mandarin-speaking musicians showed enhanced behavioral discrimination in both frequency and duration. Mandarin Chinese has tones that determine the meaning of words.

    “The perceptual effects of musical expertise were not reflected in brainstem responses in either Finnish or Mandarin speakers. This might be because language is an earlier and more essential skill than music, and native speakers are experts at their own language,” Dawson says.

    The results suggest that musical expertise does not enhance all auditory features equally for all language speakers; native language phonological patterns may modulate the enhancing effects of musical expertise on processing of specific features.


  2. Study suggests socioeconomic status may be linked to differences in vocabulary growth

    December 10, 2017 by Ashley

    From the University of Texas at Dallas press release:

    The nation’s 31 million children growing up in homes with low socioeconomic status have, on average, significantly smaller vocabularies compared with their peers.

    A new study from the Callier Center for Communication Disorders at The University of Texas at Dallas found these differences in vocabulary growth among grade school children of different socioeconomic statuses are likely related to differences in the process of word learning.

    Dr. Mandy Maguire, associate professor in the School of Behavioral and Brain Sciences (BBS), said in her study that children from lower-income homes learned 10 percent fewer words than their peers from higher-income homes. When entering kindergarten, children from low-income homes generally score about two years behind their higher-income peers on language and vocabulary measures.

    The vocabulary gap between the two groups of children gets larger throughout their schooling and has long-term academic implications, Maguire said.

    The primary reason for the differences in infancy and preschool is differences in the quantity and quality of language exposure at home. Why the gap keeps widening as children get older is less well studied.

    “We might assume that it’s the same reason that the gap is large when they’re young: that their environment is different,” Maguire said. “Another possibility is that all of this time spent in low-income situations has led to differences in their ability to learn a word. If that’s the case, there’s a problem in the mechanism of learning, which is something we can fix.”

    The study, recently published in the Journal of Experimental Child Psychology, aimed to determine whether socioeconomic status is related to word learning in grade school and to what degree vocabulary, reading and working memory might mediate that relationship.

    For the study, 68 children ages 8 to 15 performed a task that required using the surrounding text to identify the meaning of an unknown word. One exercise included three sentences, each with a made-up word at the end — for example, “Mom piled the pillows on the thuv.”

    “You have to understand all of the language in each sentence leading up to the made-up word, remember it and decide systematically across all three sentences what the made-up word must mean,” Maguire said. “In this case, the three sentences all indicated ‘thuv’ meant ‘bed.’ This isn’t quite the same as real word learning, where we have to create a new concept, but this is what we think kids — and adults — do as they initially learn a word.”
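
    As a toy illustration of the inference Maguire describes (accumulating constraints across sentences until only one meaning fits), each sentence can be modeled as a set of candidate meanings whose intersection identifies the unknown word. Only the first sentence below comes from the release; the other candidate sets are invented.

    ```python
    # Toy cross-sentence inference (not the study's materials): each sentence
    # constrains what "thuv" could refer to; intersecting the candidate sets
    # across all three sentences leaves a single meaning.

    candidates_per_sentence = [
        {"bed", "sofa", "floor", "chair"},   # "Mom piled the pillows on the thuv."
        {"bed", "crib", "sofa"},             # hypothetical second sentence
        {"bed", "crib"},                     # hypothetical third sentence
    ]

    meaning = set.intersection(*candidates_per_sentence)
    print(meaning)   # {'bed'}
    ```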

    Specifically, the study found that children of lower socioeconomic status are not as effective at using known vocabulary to build a robust picture or concept of the incoming language and use that to identify the meaning of an unknown word.

    Reading and working memory — also known to be problematic for children from low-income homes — were not found to be related to these differences in word learning.

    The study also provides potential strategies that may be effective for intervention. For children ages 8 to 15, schools may focus too much on reading and not enough on increasing vocabulary through oral methods, Maguire said.

    Maguire said parents and teachers can help children identify relationships between words in sentences, such as assigning a word like “bakery,” and having the child list as many related words as possible in one minute. Visualizing the sentences as they read also can help.

    “Instead of trying to fit more vocabulary in a child’s head, we might be able to work on their depth of knowledge of the individual words and linking known meanings together in a way that they can use to learn new information,” Maguire said.

    This study was funded by a three-year grant from the National Science Foundation, which was awarded in April 2016.

    Three co-authors of the paper are BBS doctoral students who work in Maguire’s Developmental Neurolinguistics Laboratory: Julie M. Schneider, Anna E. Middleton and Yvonne Ralph. Lab coordinator Michael Lopez and Dr. Robert Ackerman, an associate professor, also are co-authors, along with Dr. Alyson Abel, a recent Callier Center postdoctoral fellow who is an assistant professor at San Diego State University.


  3. Study suggests babies understand when words are related

    December 1, 2017 by Ashley

    From the Duke University press release:

    The meaning behind infants’ screeches, squeals and wails may frustrate and confound sleep-deprived new parents. But at an age when babies cannot yet speak to us in words, they are already avid students of language.

    “Even though there aren’t many overt signals of language knowledge in babies, language is definitely developing furiously under the surface,” said Elika Bergelson, assistant professor of psychology and neuroscience at Duke University.

    Bergelson is the author of a surprising 2012 study showing that six- to nine-month-olds already have a basic understanding of words for food and body parts. In a new report, her team used eye-tracking software to show that babies also recognize that the meanings of some words, like car and stroller, are more alike than others, like car and juice.

    By analyzing home recordings, the team found that babies’ word knowledge correlated with the proportion of time they heard people talking about objects in their immediate surroundings.

    “Even in the very early stages of comprehension, babies seem to know something about how words relate to each other,” Bergelson said. “And already by six months, measurable aspects of their home environment predict how much of this early level of knowledge they have. There are clear follow-ups for potential intervention work with children who might be at-risk for language delays or deficits.”

    The study appears the week of Nov. 20 in the Proceedings of the National Academy of Sciences.

    To gauge word comprehension, Bergelson invited babies and their caregivers into a lab equipped with a computer screen and few other infant distractions. The babies were shown pairs of images that were related, like a foot and a hand, or unrelated, like a foot and a carton of milk. For each pair, the caregiver (who couldn’t see the screen) was prompted to name one of the images while an eye-tracking device followed the baby’s gaze.

    Bergelson found that babies spent more time looking at the image that was named when the two images were unrelated than when they were related.
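
    The press release does not give the analysis details, but results like this are typically expressed as the proportion of looking time directed at the named image on each trial, averaged within condition. A minimal sketch with invented trial records and field names:

    ```python
    # Sketch of a "proportion of target looking" measure; the trial dictionaries
    # and their fields are illustrative assumptions, not the study's data format.
    from statistics import mean

    trials = [
        {"condition": "unrelated", "target_ms": 1900, "distractor_ms": 1100},
        {"condition": "related",   "target_ms": 1500, "distractor_ms": 1450},
        # ... one record per trial per infant
    ]

    by_condition = {}
    for t in trials:
        total = t["target_ms"] + t["distractor_ms"]
        by_condition.setdefault(t["condition"], []).append(t["target_ms"] / total)

    for condition, proportions in by_condition.items():
        print(condition, round(mean(proportions), 3))
    # The reported finding corresponds to a higher mean proportion on unrelated trials.
    ```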

    “They may not know the full-fledged adult meaning of a word, but they seem to recognize that there is something more similar about the meaning of these words than those words,” Bergelson said.

    Bergelson then wanted to investigate how babies’ performance in the lab might be linked to the speech they hear at home. To peek into the daily life of the infants, she sent each caregiver home with a colorful baby vest rigged with a small audio recorder and asked them to use the vest to record day-long audio of the infant. She also used tiny hats fitted with lipstick-sized video recorders to collect hour-long video of each baby interacting with his or her caregivers.

    Combing through the recordings, Bergelson and her team categorized and tabulated different aspects of speech the babies were exposed to, including the objects named, what kinds of phrases they occurred in, who said them, and whether or not objects named were present and attended to.

    “It turned out that the proportion of the time that parents talked about something when it was actually there to be seen and learned from correlated with the babies’ overall comprehension,” Bergelson said.
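
    A rough sketch of the kind of correlation described here, assuming per-family counts of object mentions with the referent present and a comprehension score from the lab session (all numbers and variable names below are invented for illustration):

    ```python
    # Hypothetical example: correlate the share of "object present" talk with
    # infants' comprehension scores. Requires scipy for the Pearson correlation.
    from scipy.stats import pearsonr

    families = [
        # (mentions with referent present, total object mentions, comprehension score)
        (42, 60, 0.61),
        (25, 58, 0.47),
        (33, 50, 0.55),
        (18, 45, 0.40),
    ]

    present_share = [present / total for present, total, _ in families]
    comprehension = [score for _, _, score in families]

    r, p = pearsonr(present_share, comprehension)
    print(f"r = {r:.2f}, p = {p:.3f}")
    ```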

    For instance, Bergelson said, if a parent says, “here is my favorite pen,” while holding up a pen, the baby might learn something about pens based on what they can see. In contrast, if a parent says, “tomorrow we are going to see the lions at the zoo,” the baby might not have any immediate clues to help them understand what lion means.

    “This study is an exciting first step in identifying how early infants learn words, how their initial lexicon is organized, and how it is shaped or influenced by the language that they hear in the world that surrounds them,” said Sandra Waxman, a professor of psychology at Northwestern University who was not involved in the study.

    But, Waxman cautions, it is too early in the research to draw any conclusions about how caregivers should be speaking to their infants.

    “Before anyone says ‘this is what parents need to be doing,’ we need further studies to tease apart how culture, context and the age of the infant can affect their learning,” Waxman said.

    “My take-home to parents always is, the more you can talk to your kid, the better,” Bergelson said. “Because they are listening and learning from what you say, even if it doesn’t appear to be so.”



  4. Study suggests punctuation in text messages helps replace cues found in face-to-face conversations

    November 25, 2017 by Ashley

    From the Binghamton University press release:

    Emoticons, irregular spellings and exclamation points in text messages aren’t sloppy or a sign that written language is going down the tubes — these “textisms” help convey meaning and intent in the absence of spoken conversation, according to newly published research from Binghamton University, State University of New York.

    “In contrast with face-to-face conversation, texters can’t rely on extra-linguistic cues such as tone of voice and pauses, or non-linguistic cues such as facial expressions and hand gestures,” said Binghamton University Professor of Psychology Celia Klin. “In a spoken conversation, the cues aren’t simply add-ons to our words; they convey critical information. A facial expression or a rise in the pitch of our voices can entirely change the meaning of our words.”

    “It’s been suggested that one way that texters add meaning to their words is by using ‘textisms’ — things like emoticons, irregular spellings (sooooo) and irregular use of punctuation (!!!).”

    A 2016 study led by Klin found that text messages ending with a period are seen as less sincere than those that do not. Klin pursued the subject further, conducting experiments to see whether people reading texts understand textisms, specifically how people’s understanding of a single-word response to an invitation (e.g., yeah, nope, maybe) is influenced by the inclusion or absence of a period.

    “In formal writing, such as what you’d find in a novel or an essay, the period is almost always used grammatically to indicate that a sentence is complete. With texts, we found that the period can also be used rhetorically to add meaning,” said Klin. “Specifically, when one texter asked a question (e.g., I got a new dog. Wanna come over?), and it was answered with a single word (e.g., yeah), readers understood the response somewhat differently depending if it ended with a period (yeah.) or did not end with a period (yeah). This was true if the response was positive (yeah, yup), negative (nope, nah) or more ambiguous (maybe, alright). We concluded that although periods no doubt can serve a grammatical function in texts just as they can with more formal writing — for example, when a period is at the end of a sentence — periods can also serve as textisms, changing the meaning of the text.”

    Klin said that this research is motivated by an interest in taking advantage of a unique moment in time when scientists can observe language evolving in real time.

    “What we are seeing with electronic communication is that, as with any unmet language need, new language constructions are emerging to fill the gap between what people want to express and what they are able to express with the tools they have available to them,” said Klin. “The findings indicate that our understanding of written language varies across contexts. We read text messages in a slightly different way than we read a novel or an essay. Further, all the elements of our texts — the punctuation we choose, the way that words are spelled, a smiley face — can change the meaning. The hope, of course, is that the meaning that is understood is the one we intended. Certainly, it’s not uncommon for those of us in the lab to take an extra second or two before we send texts. We wonder: How might this be interpreted? ‘Hmmm, period or no period? That sounds a little harsh; maybe I should soften it with a “lol” or a winky-face-tongue-out emoji.'”

    With trillions of text messages sent each year, we can expect the evolution of textisms, and of the language of texting more generally, to continue at a rapid rate, the researchers wrote. Texters are likely to continue to rely on current textisms, as well as to create new ones, to take the place of the extra-linguistic and nonverbal cues available in spoken conversations. The rate of change for “talk-writing” is likely to continue to outpace the changes in other forms of English.

    “The results of the current experiments reinforce the claim that the divergence from formal written English that is found in digital communication is neither arbitrary nor sloppy,” said Klin. “It wasn’t too long ago that people began using email, instant messaging and text messaging on a regular basis. Because these forms of communication provide limited ways to communicate nuanced meaning, especially compared to face-to-face conversations, people have found other tools.”

    Klin believes that this subject could be studied further.

    “An important extension would be to have a situation that more closely mimics actual texting, but in the lab where we can observe how different cues, such as punctuation, abbreviations and emojis, contribute to texters’ understanding,” she said. “We might also examine some of the factors that should influence texters’ understanding, such as the social relationship between the two texters or the topic of the text exchange, as their level of formality should influence the role of things like punctuation.”


  5. Chimp study reveals how brain’s structure shaped our evolution

    by Ashley

    From the University of Edinburgh press release:

    The pattern of asymmetry in human brains could be a unique feature of our species and may hold the key to explaining how we first developed language ability, experts say.

    Findings are based on brain scans of humans and previously collected data from chimpanzees. They could help scientists understand how our brains evolved and why asymmetry is vital to human development.

    The study explores the phenomenon of brain torque, in which the human brain shows slight twisting. Until now, this was also thought to be true of other primates.

    Researchers led by the University of Edinburgh studied images from an existing bank of chimpanzee brain scans held in the US.

    Comparisons were made with the brains of humans who were scanned using similar equipment — known as magnetic resonance imaging (MRI) — and identical experimental procedures.

    Chimpanzee brains were shown to be made up of equal halves, or hemispheres, whereas in human brains a subtle twist was present.

    Asymmetry was seen in humans — but not chimpanzees — with the left hemisphere longer than the right.

    Language ability has been linked to areas within the left hemisphere of the brain and has also been associated with asymmetry.

    The research sheds light on how humans developed language skills, the researchers suggest. A further study of the brain areas linked to language, using the same image bank, could aid understanding of this.

    Neil Roberts, Professor of Medical Physics and Imaging Science at the University of Edinburgh, said: “Our findings highlight a special, subtle feature of the human brain that distinguishes us from our closest primate cousins and may have evolved rapidly. Better understanding of how this came about in our evolution could help explain how humans developed language.”

    The study was published in the journal NeuroImage. It was carried out in collaboration with researchers at the University of Oxford, as well as in China and the US.


  6. Study suggests engaging children in math at home equals a boost in more than just math skills

    November 23, 2017 by Ashley

    From the Purdue University press release:

    Preschool children who engage in math activities at home with their parents not only improve their math skills, but also their general vocabulary, according to research from Purdue University.

    “Exposure to basic numbers and math concepts at home were predictive, even more so than storybook reading or other literacy-rich interactions, of improving preschool children’s general vocabulary,” said Amy Napoli, a doctoral student in the Department of Human Development and Family Studies who led the study. “And one of the reasons we think this could be is the dialogue that happens when parents are teaching their children about math and asking questions about values and comparisons, which helps these young children improve their oral language skills.”

    The findings are published online in the Journal of Experimental Child Psychology.

    “It’s never too early to talk about numbers and quantities. One of the first words young children learn is ‘more,'” said David Purpura, an assistant professor in the Department of Human Development and Family Studies, and senior author of the study.

    There are a number of ways parents can encourage math learning at home, such as talking about counting, connecting numbers to quantities and comparing values — more and less. It also helps to focus on counting as purposeful, such as “there are three cookies for a snack” rather than “there are cookies for a snack.”

    “This focus on math typically isn’t happening at home, but this shows that when parents do include math concepts it can make a difference,” said Napoli, who is working on tools to help parents improve math-related instruction at home. “When working with families, there is a math-related anxiety aspect and that is probably why more parents focus on literacy than on math. But, if you can count, then you can teach something to your child.”

    This study evaluated 116 preschool children, ages 3-5. The researchers assessed the children’s math and language skills in the fall and spring of the preschool year and examined how parent-reported math and literacy activities at home predicted the children’s improvement over time. Napoli and Purpura caution that these findings are only correlational and that future experimental work is needed to evaluate their causal nature. This research is ongoing work supported by Purdue’s Department of Human Development and Family Studies.


  7. Study suggests commonplace jokes may normalize experiences of sexual misconduct

    November 20, 2017 by Ashley

    From the Taylor & Francis press release:

    Commonplace suggestive jokes, such as “that’s what she said,” normalize and dismiss the horror of sexual misconduct experiences, experts suggest in a new essay published in Communication and Critical/Cultural Studies, a National Communication Association publication.

    The recent wave of sexual assault and harassment allegations against prominent actors, politicians, media figures, and others highlights the need to condemn inappropriate and misogynistic behavior, and to provide support and encouragement to victims.

    Communication scholars Matthew R. Meier of West Chester University of Pennsylvania and Christopher A. Medjesky of the University of Findlay argue that off-hand, common remarks such as the “that’s what she said” joke are deeply entrenched in modern society, and contribute to humorizing and legitimizing sexual misconduct.

    The first notable “that’s what she said” joke occurred during a scene in the 1992 film Wayne’s World; however, it became a running joke in the hit television show The Office, leading to “dozens of internet memes, video compilations, and even fansites dedicated to cataloguing occurrences and creating new versions of the joke.” After analyzing multiple examples of the joke used in the show, the authors argue that the “that’s what she said” joke serves as an analog to the rhetoric of rape culture.

    By discrediting and even silencing victims, this type of humor conditions audiences to ignore — and worse, to laugh at — inappropriate sexual behavior.

    Furthermore, the authors suggest that these types of comments contribute to dangerous societal and cultural norms by ultimately reinforcing the oppressive ideologies they represent, despite the intentions or naivete of the people making the jokes.

    The authors argue that the “that’s what she said” joke cycle is part of a larger discourse that not only becomes culturally commonplace, but also reinforces dangerous ideologies that are so entrenched in contemporary life that we end up laughing at something that isn’t funny at all.


  8. Study looks at language often used by people with ADHD on Twitter

    by Ashley

    From the University of Pennsylvania press release:

    What can Twitter reveal about people with attention-deficit/hyperactivity disorder, or ADHD? Quite a bit about what life is like for someone with the condition, according to findings published by University of Pennsylvania researchers Sharath Chandra Guntuku and Lyle Ungar in the Journal of Attention Disorders. Twitter data might also provide clues to help facilitate more effective treatments.

    “On social media, where you can post your mental state freely, you get a lot of insight into what these people are going through, which might be rare in a clinical setting,” said Guntuku, a postdoctoral researcher working with the World Well-Being Project in the School of Arts and Sciences and the Penn Medicine Center for Digital Health. “In brief 30- or 60-minute sessions with patients, clinicians might not get all manifestations of the condition, but on social media you have the full spectrum.”

    Guntuku and Ungar, a professor of computer and information science with appointments in the School of Engineering and Applied Science, the School of Arts and Sciences, the Wharton School and Penn Medicine, turned to Twitter to try to understand what people with ADHD spend their time talking about. The researchers collected 1.3 million publicly available tweets posted by almost 1,400 users who had self-reported diagnoses of ADHD, plus an equivalent control set that matched the original group in age, gender and duration of overall social-media activity. They then ran models looking at factors like personality and posting frequency.

    “Some of the findings are in line with what’s already known in the ADHD literature,” Guntuku said. For example, social-media posters in the experimental group often talked about using marijuana for medicinal purposes. “Our coauthor, Russell Ramsay, who treats people with ADHD, said this is something he’s observed in conversations with patients,” Guntuku added.

    The researchers also found that people with ADHD tended to post messages related to lack of focus, self-regulation, intention and failure, as well as expressions of mental, physical and emotional exhaustion. They often used words like “hate,” “disappointed,” “cry” and “sad” more frequently than the control group and often posted during hours of the day when the majority of people sleep, from midnight to 6 a.m.
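
    As an illustration (not the authors' pipeline), the two simple comparisons mentioned above, per-tweet rates of negative-emotion words and the share of tweets posted between midnight and 6 a.m., could be computed per user and averaged per group roughly as follows. The tweet format (lowercased text plus posting hour) is an assumption.

    ```python
    # Illustrative per-user features for the group comparison described above.
    from statistics import mean

    NEGATIVE_WORDS = {"hate", "disappointed", "cry", "sad"}

    def user_features(tweets):
        """tweets: list of (lowercased_text, hour_posted) pairs for one user."""
        negative_rate = mean(
            sum(word in NEGATIVE_WORDS for word in text.split())
            for text, _ in tweets
        )
        night_share = mean(0 <= hour < 6 for _, hour in tweets)
        return negative_rate, night_share

    def group_summary(users):
        features = [user_features(tweets) for tweets in users]
        return (mean(f[0] for f in features), mean(f[1] for f in features))

    # adhd_users and control_users would each be a list of per-user tweet lists:
    # print(group_summary(adhd_users), group_summary(control_users))
    ```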

    “People with ADHD are experiencing more mood swings and more negativity,” Ungar said. “They tend to have problems self-regulating.”

    This could partially explain why they enjoy social media’s quick feedback loop, he said. A well-timed or intriguing tweet could yield a positive response within minutes, propelling continued use of the online outlet.

    Using information gleaned from this study and others, Ungar and Guntuku said they plan to build condition-specific apps that offer insight into several conditions, including ADHD, stress, anxiety, depression and opioid addiction. They aim to factor in facets of individuals, their personality or how severe their ADHD is, for instance, as well as what triggers particular symptoms.

    The applications will also include mini-interventions. A recommendation for someone who can’t sleep might be to turn off the phone an hour before going to bed. If anxiety or stress is the major factor, the app might suggest an easy exercise like taking a deep breath, then counting to 10 and back to zero.

    “If you’re prone to certain problems, certain things set you off; the idea is to help set you back on track,” Ungar said.

    Better understanding ADHD has the potential to help clinicians treat such patients more successfully, but having this information also has a downside: It can reveal aspects of a person’s personality unintentionally, simply by analyzing words posted on Twitter. The researchers also acknowledge that the 50-50 split of ADHD to non-ADHD study participants isn’t true to life; only about 8 percent of adults in the U.S. have the disorder, according to the National Institute of Mental Health. In addition, people in this study self-reported an ADHD diagnosis rather than having such a determination come from a physician interaction or medical record.

    Despite these limitations, the researchers say the work has strong potential to help clinicians understand the varying manifestations of ADHD, and it could be used as a complementary feedback tool to give ADHD sufferers personal insights.

    “The facets of better-studied conditions like depression are pretty well understood,” Ungar said. “ADHD is less well studied. Understanding the components that some people have or don’t have, the range of coping mechanisms that people use — that all leads to a better understanding of the condition.”


  9. Study suggests dual-language learners outperform monolingual students once they gain English proficiency

    November 16, 2017 by Ashley

    From the Iowa State University press release:

    Not all dual-language learners are at risk academically, but as a group, these students are often labeled as such, despite differences in their English skills.

    A new Iowa State University study examined how variation in dual language status among Head Start students related to development in cognitive and academic areas. The research team led by Ji-Young Choi, an assistant professor of Human Development and Family Studies, found dual-language learners (DLLs) had significant growth, eventually outperforming students who only spoke English, once DLLs gained basic English proficiency. The results are published in the journal Early Childhood Research Quarterly.

    Choi, Christine Lippard, an assistant professor of human development and family studies at Iowa State; and Shinyoung Jeon, a postdoctoral research fellow at the University of Oklahoma-Tulsa, analyzed data measuring inhibitory control (the ability to pay attention and control natural, but unnecessary thoughts or behaviors) and math achievement for low-income students in Head Start through kindergarten. The data, collected through the Head Start Family and Child Experiences Survey (FACES) 2009, included 825 children — whose home language was English or Spanish — at 59 Head Start programs across the country.

    Instead of treating DLLs as a homogeneous group, the researchers created two categories — Spanish-English bilinguals, who can function in both languages; and DLLs with limited English skills — based on English ability entering Head Start. They identified stark differences between the DLL groups and English-only students over the course of the study. Entering Head Start, bilingual students had higher inhibitory control, but lower math scores, than English-only students did. DLLs with limited English skills lagged behind both groups. However, over the course of 18 months, bilingual students outperformed English-only students with higher scores in math and inhibitory control, despite having lower baseline scores for math at the beginning of the study.

    DLLs with limited English skills — students considered at risk when they entered Head Start — also made significant progress, the study found. These students outpaced bilingual and English-only students in the rate of gains for inhibitory control skills. While their scores had not caught up with the other two groups by the midpoint of kindergarten (the final point of analysis for the study), Choi expects with more time DLLs with limited English skills would eventually match or even outperform English-only peers as they learn more English and become bilingual.

    “Recognizing that dual-language learners can do better than we expected has huge implications. When these students do not have age-appropriate English skills they are more at risk, but once they achieve those skills they actually excel,” Choi said. “This study also confirms that there is a cognitive benefit for bilingual students.”

    Importance of inhibitory control

    The researchers say that bilingual children’s faster growth rate in inhibitory control over time helped explain the significant difference in kindergarten math skills between bilingual children and English-only students. Based on the FACES data, they could not provide a definitive explanation for the faster growth rate in inhibitory control. However, Choi says the research results lend support to the theory that bilingual students develop stronger inhibitory control skills because of their daily practice toggling between languages to fit the conversation, and inhibiting one language while speaking another.

    Inhibitory control encompasses everything from a child’s ability to suppress the impulse to grab a toy away from a friend to inhibiting the impulse to pronounce a “t” sound at the beginning of the word “the,” Lippard said. It is an important foundational skill for academic growth as well as behavior.

    Supporting students’ home language

    Recognizing skill-level differences is important given that DLLs are in more than 70 percent of Head Start classrooms. Lippard says all early childhood educators need to understand the developmental strengths of DLLs, and recognize there is no one-size-fits-all approach for teaching these students. The study makes the case for instructional support to help DLLs become proficient in English while learning or maintaining their home language. Lippard says one way to achieve that is by giving students the opportunity to engage with linguistically diverse teachers.

    “Preschool programs are so full of academic expectations that adding a Spanish lesson time may not be helpful or developmentally appropriate,” Lippard said. “Learning Spanish by interacting with a native Spanish speaker and experiencing typical preschool activities like singing songs or reading stories in Spanish holds potential benefits for all of the children in the classroom.”

    Choi would like to see instructional support for DLLs throughout their formal education. DLLs use their home language less and less as they are exposed to English in school and risk losing their home language, Choi said. While it is important for students to be proficient in English, she says DLLs would lose the potential bilingual benefits without support for their home language.


  10. How spatial navigation correlates with language

    November 15, 2017 by Ashley

    From the National Research University Higher School of Economics press release:

    Cognitive neuroscientists from the Higher School of Economics and Aarhus University have experimentally demonstrated how spatial navigation affects language comprehension. The results of the study have been published in NeuroImage.

    Language is a complex cognitive function performed not by local brain modules alone, but by a distributed network of cortical generators. Physical experiences such as movement and spatial motion play an important role in cognition and are thought to shape how an individual mentally constructs the meaning of a sentence.

    Nikola Vukovic and Yury Shtyrov carried out an experiment at the HSE Centre for Cognition & Decision Making that examines the relationship between the systems responsible for spatial navigation and for language. Using neurophysiological data, they describe brain mechanisms that support the use of navigation systems in both spatial and linguistic tasks.

    “When we read or hear stories about characters, we have to represent the inherently different perspectives people have on objects and events, and ‘put ourselves in their shoes’. Our study is the first to show that our brain mentally simulates sentence perspective by using non-linguistic areas typically in charge of visuo-spatial thought,” says Dr. Nikola Vukovic, the scientist who was chiefly responsible for devising and running the experiment.

    Previous studies have shown that humans have certain spatial preferences that are based either on one’s body (egocentric) or are independent from it (allocentric). Although not absolute and subject to change in various situations, these preferences define how an individual perceives the surrounding space and how they plan and understand navigation in this space.

    The participants of the experiment solved two types of tasks. The first was a computer-based spatial navigation task involving movement through a twisting virtual tunnel, at the end of which they had to indicate the beginning of the tunnel. The shape of the tunnel was designed so that people with egocentric and allocentric perspectives estimated the starting point differently. This difference in their subjective estimates helped the researchers split the participants according to their reference frame predispositions.
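
    The classification logic implied here can be sketched simply: for a tunnel with a known turn, the egocentric and allocentric reference frames predict different homing directions, so each participant is labelled by whichever prediction their pointing response falls closer to. The angles below are illustrative, not the experiment's actual values.

    ```python
    # Assign a reference-frame label from a pointing response (angles in degrees).
    def classify(response, egocentric_prediction, allocentric_prediction):
        def angular_distance(a, b):
            d = abs(a - b) % 360
            return min(d, 360 - d)
        ego = angular_distance(response, egocentric_prediction)
        allo = angular_distance(response, allocentric_prediction)
        return "egocentric" if ego < allo else "allocentric"

    # Hypothetical tunnel whose turn predicts a homing angle of 330 degrees under
    # an egocentric frame and 30 degrees under an allocentric frame:
    print(classify(340, egocentric_prediction=330, allocentric_prediction=30))
    # -> "egocentric"
    ```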

    The second task involved understanding simple sentences and matching them with pictures. The pictures differed in terms of their perspective, and the same story could be described using first (“I”) or second person pronouns (“You”). The participants had to choose which pictures best matched the situation described by the sentence.

    During the experiment, electrical brain activity was recorded from all participants using continuous electroencephalography (EEG). Spectral perturbations in the EEG showed that many areas responsible for navigation were active during both types of task. One of the most interesting findings for the researchers was that the activation of these areas while hearing the sentences also depended on the individual’s spatial preferences.
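
    As a generic illustration of what an event-related spectral perturbation involves (per-trial time-frequency power expressed as change from a pre-stimulus baseline, averaged over trials), here is a rough sketch. It is not the authors' pipeline; the epochs array, sampling rate, baseline window and spectrogram settings are all assumptions.

    ```python
    # Generic event-related spectral perturbation sketch for one EEG channel.
    # `epochs` is assumed to have shape (n_trials, n_samples), sampled at `fs` Hz,
    # with the first `baseline_s` seconds of each trial preceding the stimulus.
    import numpy as np
    from scipy.signal import spectrogram

    def ersp(epochs, fs, baseline_s=0.5):
        trial_powers = []
        for trial in epochs:
            freqs, times, power = spectrogram(trial, fs=fs, nperseg=int(fs * 0.25))
            baseline = power[:, times < baseline_s].mean(axis=1, keepdims=True)
            trial_powers.append(10 * np.log10(power / baseline))  # dB vs. baseline
        return freqs, times, np.mean(trial_powers, axis=0)

    # freqs, times, avg_db = ersp(epochs, fs=500)
    ```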

    “Brain activity when solving a language task is related to an individual’s egocentric or allocentric perspective, as well as to their brain activity in the navigation task. The correlation between navigation and linguistic activities proves that these phenomena are truly connected,” emphasized Yury Shtyrov, leading research fellow at the HSE Centre for Cognition & Decision Making and professor at Aarhus University, where he directs MEG/EEG research. “Furthermore, in the process of language comprehension we saw activation in well-known brain navigation systems, which were previously believed to make no contribution to speech comprehension.”

    These data may one day be used by neurobiologists and health professionals. For example, in some types of aphasia, comprehension of motion-related words suffers, and knowledge of the links between the navigation and language systems in the brain could help in the search for mechanisms to restore them.