1. Study provides insight into how infants learn to walk

    January 3, 2018 by Ashley

    From the Lancaster University press release:

    Ten-week-old babies can learn from practising walking months before they begin walking themselves, say researchers.

    They gave the infants experience of “reflex walking”, a primitive instinct in babies that disappears around 12 weeks of age.

    When held by an adult at a slightly forward angle, and with the soles of their feet touching a flat surface, the infants will reflexively walk by placing one foot in front of the other.

    Psychologists at Lancaster University gave this “reflex walking” experience to one half of a group of 10-week-old infants, who took an average of 23 steps in 3 minutes.

    The other half of the group did not share in the experience of walking.

    The researchers showed film of human figures walking and crawling to both groups of infants as they sat on their mothers’ laps in a dimly lit room.

    They then measured how the infants responded to this visual information by recording electrical activity in their brains.

    Only the brains of the infants who had experienced “reflex walking” were able to recognise the same movement in the film of figures walking.

    Their response was more similar to that of older children learning to walk than to that of younger babies.

    The group of infants who had not practised “reflex walking” did not show this more mature brain activity but they may have recognised filmed crawling movement.

    Psychologist Professor Vincent Reid said the research in Neuropsychologia showed a link between perceiving an action and carrying out that action even in early infancy.

    “This result strongly suggests that experience refines the perception of biological motion during early infancy.

    “The act of walking has therefore shifted the percept of biological motion for those infants who had experienced self-produced stepping behaviour.

    “This suggests that the limited period of experience … altered the infant’s perception of walking, indicating a link between action perception and action production in early infancy.”


  2. Study suggests certain books can increase infant learning during shared reading

    December 21, 2017 by Ashley

    From the University of Florida press release:

    Parents and pediatricians know that reading to infants is a good thing, but new research shows reading books that clearly name and label people and objects is even better.

    That’s because doing so helps infants retain information and pay closer attention.

    “When parents label people or characters with names, infants learn quite a bit,” said Lisa Scott, a University of Florida psychology professor and co-author of the study published Dec. 8 in the journal Child Development. “Books with individual-level names may lead parents to talk to infants more, which is particularly important for the first year of life.”

    Scott and colleagues from the University of Massachusetts-Amherst studied infants in Scott’s Brain, Cognition, and Development Lab. Babies came into the lab twice: once at 6 months old and again at age 9 months. While in the lab, eye-tracking and electroencephalogram, or EEG, methods were used to measure attention and learning at both ages.

    In between visits, parents were asked to read with their infants at home according to a schedule that included 10 minutes of parent-infant shared book reading every day for the first two weeks, every other day for the second two weeks and then continued to decrease until infants returned at 9 months. Twenty-three families were randomly assigned storybooks. One set contained individual-level names, and the other contained category-level labels. Both sets of books were identical except for the labeling. Each of the training books’ eight pages presented an individual image and a two-sentence story.

    The individual-level books clearly identified and labeled all eight individuals, with names such as “Jamar,” “Boris,” “Anice,” and “Fiona.” The category-level books included two made-up labels (“hitchel,” “wadgen”) for all images. The control group included 11 additional 9-month-old infants who did not receive books.

    The infants whose parents read the individual-level names spent more time focusing on and attending to the images, and their brain activity clearly differentiated the individual characters after book reading. This was not found at 6 months (before book reading), for the control group, or for the group of infants who were given books with category-level labels.

    Scott has been studying how the specificity of labels affects infant learning and brain development since 2006. This longitudinal study is the third in a series. The eye tracking and EEG results are consistent with her other studies showing that name specificity improves cognition in infants.

    “There are lots of recommendations about reading books to babies, but our work provides a scientific basis for these recommendations and suggests that the type of book is also important,” she said. “Shared reading is a good way to support development in the first year of life. It creates an enjoyable and comforting environment for both the parents and the infant and encourages parents to talk to their infants.”


  3. Study suggests infant brain responses predict reading speed in secondary school

    December 20, 2017 by Ashley

    From the University of Jyväskylä press release:

    A study conducted at the Department of Psychology at the University of Jyväskylä, Finland and Jyväskylä Centre for Interdisciplinary Brain Research (CIBR) has found that the brain responses of infants with an inherited risk for dyslexia, a specific reading disability, predict their future reading speed in secondary school.

    The longitudinal study looked at the electrical brain responses of six-month-old infants to speech and the correlation between those brain responses and the children’s pre-literacy skills at preschool age, as well as their literacy in the eighth grade, at 14 years of age.

    The study discovered that the brain response of the infants with an inherited dyslexia risk differed from the brain responses of the control infants and predicted their reading speed in secondary school. The larger brain responses were related to a more fluent naming speed of familiar objects, better phonological skills, and faster reading.

    “The predictive effect of the infant’s brain response on reading speed in secondary school is mediated by preschool-age naming speed for familiar objects, suggesting that if retrieval of words from the mental lexicon is hindered before school age, reading is still hampered in secondary school,” states researcher Kaisa Lohvansuu, PhD.

    Atypical brain activation to speech in infants with an inherited risk for dyslexia impedes the development of effective connections to the mental lexicon, and thus slows naming and reading performance. Efficient, effortless retrieval from the mental lexicon is therefore necessary for both fluent reading and naming.

    Developmental dyslexia, a specific reading disability, is the most common of the learning disabilities. It has a strong genetic basis: children of dyslexic parents are at great risk of encountering reading and/or writing difficulties at school.

    As speech stimuli, the current study used pseudo-words, i.e. words without meaning. The pseudo-words contained either a short or a long consonant (a double consonant). Phonemic length distinguishes word meanings in the Finnish language, so differentiating it correctly is essential. However, the correct reading and writing of these contrasts has previously been found to be particularly difficult for Finnish dyslexics.

    The research was carried out as part of the Jyväskylä Longitudinal Study of Dyslexia (JLD). Half of the children who participated in the project had an inherited risk of dyslexia.


  4. Holding infants – or not – can leave traces on their genes

    December 19, 2017 by Ashley

    From the University of British Columbia press release:

    The amount of close and comforting contact between infants and their caregivers can affect children at the molecular level, an effect detectable four years later, according to new research from the University of British Columbia and BC Children’s Hospital Research Institute.

    The study showed that children who had been more distressed as infants and had received less physical contact had a molecular profile in their cells that was underdeveloped for their age — pointing to the possibility that they were lagging biologically.

    “In children, we think slower epigenetic aging might indicate an inability to thrive,” said Michael Kobor, a Professor in the UBC Department of Medical Genetics who leads the “Healthy Starts” theme at BC Children’s Hospital Research Institute.

    Although the implications for childhood development and adult health have yet to be understood, this finding builds on similar work in rodents. This is the first study to show in humans that the simple act of touching, early in life, has deeply rooted and potentially lifelong consequences for gene expression.

    The study, published last month in Development and Psychopathology, involved 94 healthy children in British Columbia. Researchers from UBC and BC Children’s Hospital asked parents of 5-week-old babies to keep a diary of their infants’ behavior (such as sleeping, fussing, crying or feeding) as well as the duration of caregiving that involved bodily contact. When the children were about 4 1/2 years old, their DNA was sampled by swabbing the inside of their cheeks.

    The team examined a biochemical modification called DNA methylation, in which some parts of the chromosome are tagged with small molecules made of carbon and hydrogen. These molecules act as “dimmer switches” that help to control how active each gene is, and thus affect how cells function.

    The extent of methylation, and where on the DNA it specifically happens, can be influenced by external conditions, especially in childhood. These epigenetic patterns also change in predictable ways as we age.

    Scientists found consistent methylation differences between high-contact and low-contact children at five specific DNA sites. Two of these sites fall within genes: one plays a role in the immune system, and the other is involved in metabolism. However, the downstream effects of these epigenetic changes on child development and health aren’t known yet.

    The children who experienced higher distress and received relatively little contact had an “epigenetic age” that was lower than would be expected, given their actual age. Such a discrepancy has been linked to poor health in several recent studies.
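    At its core, the “epigenetic age” gap described above is a simple deviation score: an age predicted from methylation levels minus the child’s chronological age. The Python sketch below illustrates only that arithmetic; the site weights, intercept, and methylation values are invented for illustration and are not from the study, whose statistical model is far more involved.

```python
# Illustrative sketch of an "epigenetic age" deviation score.
# Weights, intercept, and methylation beta-values are hypothetical;
# real epigenetic clocks combine hundreds of CpG sites.

def predict_epigenetic_age(methylation, weights, intercept):
    """Linear combination of methylation beta-values (each 0..1)."""
    return intercept + sum(m * w for m, w in zip(methylation, weights))

def age_deviation(methylation, chronological_age, weights, intercept):
    """Negative values mean an epigenetic age younger than expected."""
    return predict_epigenetic_age(methylation, weights, intercept) - chronological_age

# Hypothetical 5-site profile for a child aged 4.5 years
weights = [6.0, -3.0, 4.0, 2.0, -1.5]
intercept = 1.0
profile = [0.40, 0.55, 0.30, 0.25, 0.60]  # invented beta-values

deviation = age_deviation(profile, 4.5, weights, intercept)
print(round(deviation, 2))  # prints -1.95: a "biologically younger" profile
```

    A negative deviation is the kind of “younger than expected” discrepancy the study links to higher infant distress and less contact.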

    “We plan on following up to see whether the ‘biological immaturity’ we saw in these children carries broad implications for their health, especially their psychological development,” says lead author Sarah Moore, a postdoctoral fellow. “If further research confirms this initial finding, it will underscore the importance of providing physical contact, especially for distressed infants.”


  5. Study suggests hearing different accents at home impacts language processing in infants

    December 12, 2017 by Ashley

    From the University at Buffalo press release:

    Infants raised in homes where they hear a single language, but spoken with different accents, recognize words dramatically differently at about 12 months of age than their age-matched peers exposed to little variation in accent, according to a recent study by a University at Buffalo expert in language development.

    The findings, published in the Journal of the Acoustical Society of America, point to the importance of considering the effects of multiple accents when studying speech development and suggest that monolingual infants should not be viewed as a single group.

    “This is important if you think about clinical settings where children are tested,” says Marieke van Heugten, an assistant professor in UB’s Department of Psychology and lead author of the study with Elizabeth K. Johnson, an associate professor of psychology at the University of Toronto. “Speech language pathologists [in most of North America] typically work with the local variant of English, but if you have a child growing up in an environment with more than one accent then they might recognize words differently than a child who hears only one accent.”

    Although extensive research exists on bilingualism, few studies have taken accents into account when looking at early word recognition in monolingualism, and van Heugten says none have explored the issue of accents in children younger than 18 months, the age when they traditionally develop the ability to recognize pronunciation differences that can occur across identical words.

    “Variability in children’s language input, what they hear and how they hear it, can have important consequences on word recognition in young, monolingual children,” she says. For instance, an American-English speaking parent might call the yellow vehicle that takes children to school a “bus,” while the pronunciation of the same word by an Irish-English speaking parent might sound more like “boss.” The parents are referencing the same object, but because the child hears the word pronounced two ways she needs to learn how to map those different pronunciations to the same object.

    For the study, researchers tested children seated on the lap of a parent. The children heard words typically known to 12-1/2-month-olds, like daddy, mommy, ball, dog and bath, along with nonsense words, such as dimma, mitty and guttle. Head turns by the children, a common measure in infant speech perception research that signals recognition, determined preference for particular words. A second experiment included children 14-1/2 and 18 months of age.

    Van Heugten says children show a preference for the known words when they recognize words that occur in their language. If they fail to recognize words there’s no reason for them to express a preference, as on the surface, the known and the nonsense words sound equally exciting with an infant-directed intonation.

    The results indicate that children who hear just a single accent prefer to listen to real words in the lab, although those who hear multiple accents don’t have that preference at 12-1/2 months. This difference between preference patterns was found even though the two groups were matched on socioeconomic status as well as the number of words children understand and produce. This suggests that both groups of children are learning words at about the same rate, but that hearing multiple accents at home might change how children recognize these words around their first birthday, at least in lab settings. Perhaps children raised in multi-accent environments need more contextual information to recognize words because they do not assume all words will be spoken in the regional accent.

    But by the time they are 18 months old, they no longer have difficulty recognizing these words in this challenging task, according to van Heugten.

    “What we’re concluding is that children who hear multiple accents process language differently than those who hear a single accent,” she says. “We should be aware of this difference, and keep it in mind as a factor predicting behavior in test settings, especially when testing children from diverse areas of the world.”

    “We are excited to carry out additional work in this area, comparing the development of young children growing up in more or less linguistically diverse environments,” adds Johnson. “[This] will help us better understand language acquisition in general, and perhaps help us better diagnose and treat language delays in children growing up in different types of environments.”


  6. Under stress, newborn babies show greater brain response to pain

    December 10, 2017 by Ashley

    From the Cell Press press release:

    When newborn babies are under stress, their brains show a heightened response to pain, a new study has found. However, you’d never know it from the way those infants act. The findings reported in Current Biology on November 30 show that stress leads to an apparent disconnect between babies’ brain activity and their behavior.

    “When newborn babies experience a painful procedure, there is a reasonably well coordinated increase in their brain activity and their behavioral responses, such as crying and grimacing,” says Laura Jones of University College London. “Babies who are stressed have a larger response in the brain following a painful procedure. But, for these babies, this greater brain activity is no longer matched by their behavior.”

    Because newborns cannot tell us when they’re in pain, doctors and researchers use indirect measures of pain based on behavior. Those measures are sometimes used in the hospital to assess whether a baby needs to be made more comfortable or to be given painkillers.

    Jones and her colleagues, led by Maria Fitzgerald, knew that adults under stress frequently report feeling more pain. In the new study, they wanted to find out whether that’s also true in infants.

    The researchers enrolled 56 healthy newborn boys and girls from the postnatal ward and special care baby unit at the Elizabeth Garrett Anderson Obstetric Wing, University College Hospital. They measured the babies’ stress levels based on salivary levels of the stress hormone cortisol and on heartbeat patterns, both before and after a clinically necessary heel lance. At the same time, they measured the babies’ pain response using EEG brain activity and facial expression.

    The data showed that babies with higher levels of background stress showed a bigger brain reaction to the heel lance procedure. However, that heightened activity in the brain didn’t correspond to a change in the babies’ behavior.

    Jones says that the effects of stress on the brain response didn’t come as a surprise. But they hadn’t expected that the babies’ behavior wouldn’t follow the same trend. In retrospect, however, the Fitzgerald lab had found before that behavior and brain activity in infants aren’t always related.

    “Now we have a greater understanding of what may cause this dissociation,” Jones says.

    Jones says that the findings offer yet another reason to treat and care for babies in ways that minimize both pain and stress. Stressed babies may not seem to respond to pain, even as their brain is still processing it.

    “This means that caregivers may underestimate a baby’s pain experience,” Jones says.

    In fact, neonatal doctors and nurses know that preterm babies sometimes “tune out” and become unresponsive when they are overwhelmed. The new findings seem to confirm those clinical observations in full-term infants.

    The researchers say they plan to explore in future studies how other environmental factors and previous experiences, such as interactions between mother and baby, influence the way newborns process and experience pain.


  7. Study suggests infants understand that more desirable rewards require more effort

    December 4, 2017 by Ashley

    From the American Association for the Advancement of Science press release:

    Infants who observe someone putting more effort into attaining a goal attribute more value to it, a new study finds. Past work has shed light on how infants come to recognize differences in the value of objects; for example, if a person consistently chooses one item over another, infants will attribute more value to the selected item. It remained to be determined, however, whether infants can grasp concepts of reward and associated “cost.”

    Here, Shari Liu and colleagues presented ten-month-old infants with animations in which a character faced varying costs to achieve a goal; in one scenario, for example, the character had to jump over either a low wall or a much higher one to get its reward. If the cost of acquiring the reward was too high, the character would refuse to expend the effort to retrieve it. The researchers monitored the infants as they watched these test events, measuring the lengths of their gazes. Two additional, similar experiments, in which the character had to climb a ramp or jump a gap to retrieve the reward, ensured that what mattered to infants’ perception of cost was not simply speed or height, but the effort exerted.

    Regardless of whether a character cleared higher barriers, climbed steeper ramps, or jumped wider gaps to reach one target over the other, infants gazed longer at the scenario in which the character went after the less desirable reward, indicating that they were surprised by the choice. They expected the character to prefer the goal it had attained through costlier actions. Thus, it appears that infants are able to assign greater value to rewards that are more costly to attain.
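    The looking-time measure here rests on a simple comparison: longer gazes are read as violated expectations. The toy Python sketch below illustrates that comparison of mean gaze durations; all numbers are invented for illustration and are not the study’s data or analysis.

```python
# Toy sketch of a violation-of-expectation looking-time comparison.
# Gaze durations (seconds) are invented; longer looking is read as surprise.
from statistics import mean

# Per-infant gaze durations when the character chose the low-cost (less
# desirable) target vs. the high-cost (more desirable) target.
chose_low_cost_target = [14.2, 12.8, 15.1, 13.5, 16.0]   # hypothetical
chose_high_cost_target = [8.9, 10.1, 9.4, 11.0, 8.2]     # hypothetical

# Infants look longer at the outcome that violates their expectation.
surprise_gap = mean(chose_low_cost_target) - mean(chose_high_cost_target)
print(round(surprise_gap, 2))  # prints 4.8
```

    A positive gap mirrors the reported pattern: infants expected the character to prefer the goal it worked harder to reach.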


  8. Study suggests babies understand when words are related

    December 1, 2017 by Ashley

    From the Duke University press release:

    The meaning behind infants’ screeches, squeals and wails may frustrate and confound sleep-deprived new parents. But at an age when babies cannot yet speak to us in words, they are already avid students of language.

    “Even though there aren’t many overt signals of language knowledge in babies, language is definitely developing furiously under the surface,” said Elika Bergelson, assistant professor of psychology and neuroscience at Duke University.

    Bergelson is the author of a surprising 2012 study showing that six- to nine-month-olds already have a basic understanding of words for food and body parts. In a new report, her team used eye-tracking software to show that babies also recognize that the meanings of some words, like car and stroller, are more alike than others, like car and juice.

    By analyzing home recordings, the team found that babies’ word knowledge correlated with the proportion of time they heard people talking about objects in their immediate surroundings.

    “Even in the very early stages of comprehension, babies seem to know something about how words relate to each other,” Bergelson said. “And already by six months, measurable aspects of their home environment predict how much of this early level of knowledge they have. There are clear follow-ups for potential intervention work with children who might be at-risk for language delays or deficits.”

    The study appears the week of Nov. 20 in the Proceedings of the National Academy of Sciences.

    To gauge word comprehension, Bergelson invited babies and their caregivers into a lab equipped with a computer screen and few other infant distractions. The babies were shown pairs of images that were related, like a foot and a hand, or unrelated, like a foot and a carton of milk. For each pair, the caregiver (who couldn’t see the screen) was prompted to name one of the images while an eye-tracking device followed the baby’s gaze.

    Bergelson found that babies spent more time looking at the image that was named when the two images were unrelated than when they were related.
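    That comparison amounts to averaging, for each pair type, the proportion of each trial spent looking at the named image. The toy Python sketch below illustrates the arithmetic with invented numbers; it is not the study’s analysis.

```python
# Toy sketch of the related- vs. unrelated-pair eye-tracking comparison.
# Each value is the proportion of a trial spent looking at the named image;
# all numbers are invented for illustration.
from statistics import mean

unrelated_pairs = [0.68, 0.72, 0.65, 0.70]  # e.g., "foot" named, milk as distractor
related_pairs = [0.55, 0.58, 0.52, 0.57]    # e.g., "foot" named, hand as distractor

# Babies settle on the named image more when the distractor is unrelated.
advantage = mean(unrelated_pairs) - mean(related_pairs)
print(round(advantage, 3))
```

    A positive advantage reflects the finding: a related distractor (hand, when “foot” is named) competes for the baby’s gaze in a way an unrelated one does not.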

    “They may not know the full-fledged adult meaning of a word, but they seem to recognize that there is something more similar about the meaning of these words than those words,” Bergelson said.

    Bergelson then wanted to investigate how babies’ performance in the lab might be linked to the speech they hear at home. To peek into the daily life of the infants, she sent each caregiver home with a colorful baby vest rigged with a small audio recorder and asked them to use the vest to record day-long audio of the infant. She also used tiny hats fitted with lipstick-sized video recorders to collect hour-long video of each baby interacting with his or her caregivers.

    Combing through the recordings, Bergelson and her team categorized and tabulated different aspects of speech the babies were exposed to, including the objects named, what kinds of phrases they occurred in, who said them, and whether or not objects named were present and attended to.

    “It turned out that the proportion of the time that parents talked about something when it was actually there to be seen and learned from correlated with the babies’ overall comprehension,” Bergelson said.

    For instance, Bergelson said, if a parent says, “here is my favorite pen,” while holding up a pen, the baby might learn something about pens based on what they can see. In contrast, if a parent says, “tomorrow we are going to see the lions at the zoo,” the baby might not have any immediate clues to help them understand what lion means.

    “This study is an exciting first step in identifying how early infants learn words, how their initial lexicon is organized, and how it is shaped or influenced by the language that they hear in the world that surrounds them,” said Sandra Waxman, a professor of psychology at Northwestern University who was not involved in the study.

    But, Waxman cautions, it is too early in the research to draw any conclusions about how caregivers should be speaking to their infants.

    “Before anyone says ‘this is what parents need to be doing,’ we need further studies to tease apart how culture, context and the age of the infant can affect their learning,” Waxman said.

    “My take-home to parents always is, the more you can talk to your kid, the better,” Bergelson said. “Because they are listening and learning from what you say, even if it doesn’t appear to be so.”



  9. Neuroscientists identify source of early brain activity

    November 15, 2017 by Ashley

    From the University of Maryland press release:

    Some expectant parents play classical music for their unborn babies, hoping to boost their children’s cognitive capacity later in life. While some research supported a link between prenatal sound exposure and improved brain function, scientists had not identified any structures responsible for this link in the developing brain.

    A new study led by University of Maryland neuroscientists is the first to identify a mechanism that could explain such an early link between sound input and cognitive function, often called the “Mozart effect.” Working with an animal model, the researchers found that a type of cell present in the brain’s primary processing area during early development, long thought to form structural scaffolding with no role in transmitting sensory information, may conduct such signals after all.

    The results, which could have implications for the early diagnosis of autism and other cognitive deficits, were published in the online early edition of the Proceedings of the National Academy of Sciences on November 6, 2017.

    “Previous research documented brain activity in response to sound during early developmental phases, but it was hard to determine where in the brain these signals were coming from,” said Patrick Kanold, a professor of biology at UMD and the senior author of the research paper. “Our study is the first to measure these signals in an important cell type in the brain, providing important new insights into early sensory development in mammals.”

    Working with young ferrets, Kanold and his team directly observed sound-induced nerve impulses in subplate neurons for the first time. During development, subplate neurons are among the first neurons to form in the cerebral cortex — the outer part of the mammalian brain that controls perception, memory and, in humans, higher functions such as language and abstract reasoning. Subplate neurons help guide the formation of neural circuits, in the same way that a temporary scaffolding helps a construction crew build walls and install windows on a new building.

    Much like construction scaffolding, the role of subplate neurons is thought to be temporary. Once the brain’s permanent neural circuits form, most of the subplate neurons die off and disappear. According to Kanold, researchers assumed that subplate neurons had no role in transmitting sensory information, given their temporary structural role.

    Conventional wisdom suggested that mammalian brains transmit their first sensory signals in response to sound after the thalamus fully connects to the cerebral cortex. In many mammals used for research, the connection of the thalamus and the cortex also coincides with the opening of the ear canals, which allows sounds to activate the inner ear. This coincident timing provided further support for the traditional model of when sound processing begins in the brain.

    However, researchers had struggled to reconcile this conventional model with observations of sound-induced brain activity much earlier in the developmental process. Until his group directly measured the response of subplate neurons to sound, Kanold said, the phenomenon had largely been overlooked.

    “Our work is the first to suggest that subplate neurons do more than bridge the gap between the thalamus and the cortex, forming the structure for future circuits,” Kanold said. “They form a functional scaffolding that actually processes and transmits information before other cortical circuits are activated. It is likely that subplate neurons help determine the early functional organization of the cortex in addition to structural organization.”

    By identifying a source of early sensory nerve signals, the current study could lead to new ways to diagnose autism and other cognitive deficits that emerge early in development. Early diagnosis is an important first step toward early intervention and treatment, Kanold noted.

    “Now that we know subplate neurons are transmitting sensory input, we can begin to study their functional role in development in more detail,” Kanold said. “What is the role of sensory experience at this early stage? How might defects in subplate neurons correlate with cognitive deficits and conditions like autism? There are so many new possibilities for future research.”

    Kanold’s findings are already drawing interest from researchers who study sensory development in humans. Rhodri Cusack, a professor of cognitive neuroscience at Trinity College Dublin, in Ireland, noted that the results could have implications for the care of premature infants.

    “This paper shows that our sensory systems are shaped by the environment from a very early age,” Cusack said. “In human infants, this includes the third trimester, when many preterm infants spend time in a neonatal intensive care unit. The findings are a call to action to identify enriching environments that can optimize sensory development in this vulnerable population.”


  10. Study suggests babies can use context to look for things

    November 7, 2017 by Ashley

    From the Brown University press release:

    Just six months into the world, babies already have the capacity to learn, remember and use contextual cues in a scene to guide their search for objects of interest, such as faces, a new Brown University study shows.

    “It was pretty surprising to find that 6-month-olds were capable of this memory-guided attention,” said lead author Kristen Tummeltshammer, a postdoctoral scholar at Brown. “We didn’t expect them to be so successful so young.”

    In the experiment described in Developmental Science, babies showed steady improvement in finding faces in repeated scenes, but didn’t get any quicker or more accurate in finding faces in new scenes. Senior author Dima Amso, an associate professor in Brown’s Department of Cognitive, Linguistic and Psychological Sciences, said the finding that infants can recognize and exploit patterns of context provides important new insights into typical and possibly atypical brain development.

    “What that means is that they are efficient in using the structure in their environment to maximize attentional resources on the one hand and to reduce uncertainty and distraction on the other,” Amso said. “A critical question in our lab has been whether infants at risk for neurodevelopmental disorders, especially autism spectrum disorders, have differences in the way that they process visual information, and whether this would impact future learning and attention. These data lay the developmental groundwork for asking whether there are differences in using previously learned visual information to guide future learning and attention across various neurodevelopmental populations.”

    Find the face

    For the experiment, Tummeltshammer and Amso invited 46 healthy, full-term infants, either 6 or 10 months old, to their lab to play a little game of finding faces. Seated on a parent’s lap, the babies simply had to watch a screen as they were presented with a series of arrangements of four colored shapes. In each arrangement, the shapes would turn over, with one revealing a face. An eye-tracking system measured where each baby looked.

    Eventually the babies would always look at the face, especially because after two seconds, the face would become animated and say words like “peekaboo.” In all, each baby saw 48 arrangements over eight minutes, with little breaks to watch clips of Elmo from “Sesame Street.” That, Tummeltshammer said, was to help keep them (and maybe their parents) engaged and happy.

    The trick of the experiment is that while half the time the shape arrangements were randomly scrambled and the face could be revealed anywhere, the other half of the time the same arrangements were repeated, meaning a baby could learn from that context to predict where to look for the face. In this way, the babies saw faces in both novel and repeated contexts. If babies could notice the repeated context pattern, remember it and put it to use, they should be quicker and more accurate in finding the face when it came up in that kind of scene again.

    By several measures reported in the study, the babies demonstrated that capacity clearly. For example, as they saw more scenes, babies consistently reduced the time it took to find the face in repeated-context scenes, but not in new-context scenes. They also became better at ignoring non-face shapes in repeated-context scenes as they went along, but didn’t show the same improvement in new-context scenes.
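    The repeated- versus new-context comparison boils down to tracking search times across blocks of trials in each condition. The Python sketch below illustrates the expected pattern with invented numbers; the values are not the study’s data.

```python
# Toy sketch of the repeated- vs. new-context search-time analysis.
# Seconds-to-find-the-face values are invented; the real study used eye
# tracking across 48 arrangements per infant.
from statistics import mean

# Search times per block, earliest block first (hypothetical).
repeated_context = [[1.9, 2.1], [1.5, 1.6], [1.1, 1.0]]
new_context = [[2.0, 1.8], [1.9, 2.1], [1.8, 2.0]]

repeated_trend = [round(mean(block), 2) for block in repeated_context]
new_trend = [round(mean(block), 2) for block in new_context]
print(repeated_trend)  # falls across blocks: the layout is being learned
print(new_trend)       # roughly flat: no repeated context to exploit
```

    A falling curve only in the repeated condition is the signature of memory-guided attention: the babies are exploiting the learned scene layout, not simply getting faster overall.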

    Babies even learned to anticipate where the faces would be on the screen based on their experiences in the experiment.

    Tummeltshammer said there was little difference between the 6-month-olds and the 10-month-olds, suggesting that the skill is already developed at the younger age.

    In new research, Tummeltshammer said, she and Amso plan to experiment with more realistic scenes. After all, babies rarely need to look for faces among cleanly defined abstract shapes. A more real-world challenge for a baby, for instance, might be finding a parent’s familiar and comforting face across a holiday dinner table.

    But even from this simpler experimental setting, the ability is clearly established.

    “We think of babies as being quite reactive in how they spread their attention,” Tummeltshammer said. “This helps us recognize that they are actually quite proactive. They are able to use recent memory and to extract what’s common in an environment as a shortcut to be able to locate things quickly.”

    A James S. McDonnell Scholar Award and the National Institutes of Health (1-F32-MH108278-01) funded the research.