1. Cognitive scientist suggests numerical cognition is not biologically endowed

    May 26, 2017 by Ashley

    From the University of California San Diego press release:

    Clocks and calendars, sports scores and stock-market tickers — our society is saturated with numbers. One of the first things we teach our children is to count, just as we teach them their ABCs. But is this evidence of a biological drive? No, says cognitive scientist Rafael Nunez of the University of California San Diego. It is evidence of our cultural preoccupations. “Numerical cognition,” he says, “is not biologically endowed.”

    Writing in the June 2017 issue of Trends in Cognitive Sciences, Nunez takes on the conventional wisdom in the field right now — a widely accepted view in cognitive neuroscience, child psychology and animal cognition that there is a biologically evolved capacity for number and arithmetic that we share with other species.

    For example, Alex the African grey parrot wowed millions with his mathematical genius, not only on YouTube and TV but in scientific journals as well. Respected researchers are publishing studies suggesting that not only Alex but an incredible array of other animals can deal with numbers, too, from our evolutionary cousins the chimpanzees to more distant relatives like newborn chicks, salamanders and even mosquitofish. Human babies have also been shown to discriminate between different quantities at ages so young that it would seem language and culture couldn’t have yet played much of a role.

    That all points to a primordial ability for math, right? We’re wired for it like we are for language? Not so fast, says Nunez, professor of cognitive science in the UC San Diego Division of Social Sciences and director of the Embodied Cognition Laboratory.

    He believes that part of the problem is muddled terminology. There is a difference, he says, between number and quantity, between doing math and perceiving relative amounts of things. We — human and nonhuman animals alike — do seem to have a shared ability, grounded in our biology and helped along by evolutionary pressures, to tell apart “some” and “many” or even small amounts of something. But numbers, more strictly speaking, he says, need a symbolic system and the scaffold of culture.

    In the same issue of the journal, neuroscientist Andreas Nieder writes a rebuttal to Nunez. And Nunez, in turn, rebuts the rebuttal. Is this an argument for the birds, though? A debate only a specialist could love? Nunez says the implications go far beyond the field. As society seeks to apply findings from neuroscience to solve problems in education, for example, we need a clearer view of where to look for solutions.

    To support his argument that we and other animals don’t have an evolved capacity for number per se, Nunez cites several different strands of research in the current literature, including experimental work with humans from non-industrialized cultures, which suggests an imprecise approach to quantity.

    Many of the world’s languages, he points out, don’t bother with exact terms for numbers larger than a few and rely on quantifiers like “several” or “many.” People can get along surprisingly well with just those kinds of words and occasional linguistic emphasis like “really” to distinguish between “a lot” and “really a lot.” A survey of 193 hunter-gatherer languages from different continents found that most of these languages stop at the number five or below: 61 percent in South America, 92 percent in Australia and 41 percent in Africa. Nunez suspects that until the need arose to make precise counts of commodities, most humans throughout history just worked with “natural quantifiers.”

    He points also to brain-imaging research that shows native speakers of Chinese and native speakers of English process the same Arabic numerals in different parts of their brains, suggesting that language and culture influence even which neurons are recruited to deal with numbers.

    According to Nunez, as much as we might be wowed by what some trained animals can do, we have to remember that doesn’t necessarily point back to an evolved capacity. They are trained over many hours and months, and they’re trained by humans. “A circus seal may jump through a burning ring but it doesn’t tell us anything about the animal’s ability to deal with fire in its natural environment,” he said.

    To drive home his point about humans, Nunez uses what he describes as the “absurd” analogy of snowboarding. To be able to snowboard, we need our biology: “we need our limbs and our vestibular system for balance, we need optic flow navigation, but those don’t give an account of snowboarding and no one argues that we evolved to do it.” Without a culture that allows for thermal suits and ski lifts, he said, we wouldn’t be on the slopes at all.

    Nunez calls on researchers to become more precise with their terms and suggests that it could be productive to investigate “what seems to be nearly universal in human cultures and in many nonhuman animals too: a ‘quantical’ ability and not a numerical one.”

    “Quantical skills,” he said, are good candidates for more intensive study and might even be informative for education. We study early ability to count and draw correlations with later achievement in school. Perhaps there are even stronger correlations with the ability to quantify, he said.


  2. Study reassesses Learning Styles

    May 24, 2017 by Ashley

    From the Frontiers press release:

    What is the best way for teachers to teach so students will really learn? That’s an age-old question.

    Since the 1970s, one theory that has been popular among schoolteachers and pervasive in education research literature in the United Kingdom and the United States is the idea of “Learning Styles,” the notion that people can be categorized into one or more ‘styles’ of learning (e.g., Visual, Auditory, Converger) and that teachers can and should tailor their curriculum to suit individual students. The idea is that students will learn more if they are exposed to material through approaches that specifically match their Learning Style.

    But in recent years, many academics have criticized Learning Styles, saying there is no evidence that the approach improves student understanding.

    Now comes a newly published study of 114 academics in higher education in the United Kingdom, led by education researchers Philip M. Newton, Ph.D., and Mahallad Miah, both of the Swansea University Medical School in Swansea, UK. Their study “Evidence-Based Higher Education — Is the Learning Styles ‘Myth’ Important?” was published March 27, 2017, in Frontiers in Psychology.

    Their findings are telling: Newton and Miah found that while 58% of the academics surveyed believed Learning Styles to be beneficial, only 33% actually used the approach in their teaching.

    In other words, there is something about the idea of individualized education that appeals, but actually administering a Learning Styles questionnaire to students and then tailoring the class curriculum to suit individual students’ personal learning styles is only done by a handful of faculty.

    “There is a mismatch between the empirical evidence and the belief in Learning Styles,” said Newton. “Among those who participated in our study far more reported using a number of techniques that are demonstrably evidence-based.”

    These techniques include: assigning formative assessments (i.e., practice tests), peer teaching (i.e., having students teach each other), working problems and examples aloud, and microteaching (i.e., taking video footage of teachers in training so they can reflect on and adjust how they explain material and interact with students).

    Furthermore, 90% of the faculty surveyed said that Learning Styles as an approach is fundamentally flawed.

    “Learning Styles does not account for the complexity of ‘understanding,’” said Newton. “It is not possible to teach complex concepts such as mathematics or languages by presenting these subjects in only one style. This would be like trying to teach medical students to recognize different heart sounds using visual methods, or teaching them how to recognize different skin rashes using auditory methods.”

    Newton and Miah say that faculty who use Learning Styles may in fact be concentrated in certain disciplines or subject areas, and that truly evaluating the usefulness of this teaching method would require demographic studies of faculty. But that may not be worth the investment, they say.

    Part of the issue seems to lie in the fact that many respondents embrace a “looser definition” of Learning Styles, preferring to think of it as an overarching theme or general trend rather than a pedagogical tool. In other words: they operate from the standpoint that individual students have different ‘styles of learning’ — lowercase — but don’t formally change their teaching techniques. This philosophical leaning may also explain why some dedicated faculty continue to ‘believe in’ Learning Styles even when presented with the evidence that it doesn’t work.

    It appears Learning Styles has become more a point of awareness or point of view rather than a teaching tool. Thus, say Newton and Miah, rather than debunking Learning Styles — capital letters — a far better focus for education research would be to promote those evidence-based techniques that survey participants indicated they actually use and that are demonstrably effective.


  3. Three-year-olds understand, value obligations of joint commitment

    May 20, 2017 by Ashley

    From the Society for Research in Child Development press release:

    The ability to engage in joint actions is a critical step toward becoming a cooperative human being. In particular, forming a commitment with a partner to achieve a goal that one cannot achieve alone is important for functioning in society. Previous research has shown that children begin collaborating with others between ages 2 and 3 years. However, it’s less clear whether they understand the concept of joint commitments with binding obligations. A new study looked at this phenomenon and suggests that children as young as 3 understand and value the obligations that accompany a joint commitment – and that they resent a partner who breaks a commitment for selfish reasons.

    The study, by researchers at the Max Planck Institute for Evolutionary Anthropology (in Germany) and Duke University, appears in the journal Child Development.

    “Collaborating with joint goals that structure individuals’ roles in pursuit of shared rewards is a uniquely human form of social interaction,” explains Margarita Svetlova, visiting assistant professor of psychology at Duke University, who contributed to the study. “In this study, we wanted to see what children understand about the specific roles each of them must play to achieve joint success in an interdependent task, and how they react to partners who fail to do their part for various reasons.”

    Researchers presented 72 pairs of 3-year-olds (for a total of 144 children) with a collaborative task – pulling on a rope together to move a block toward a set of marbles to get a reward; the children were tested in same-sex pairs and agreed verbally to play the game together. The children were predominantly middle-class and White, and they lived in a medium-sized German city.

    During the task, one of the partners – who had been trained before engaging in the task – stopped cooperating with the other, either leaving the collaborative task to receive an individual reward (the selfish condition), stopping because he or she operated the toy inefficiently (the ignorant condition), or quitting because the toy broke (the accidental condition). Researchers then observed children’s reactions to their partner’s defection and what they did next: Some tattled, others became emotionally aroused, and still others taught their partner how to engage in the task.

    The study found that 3-year-olds could coordinate their actions with others in joint tasks, and that they understood these tasks were collaborative commitments and protested when a partner defected selfishly and knowingly. Children were naturally frustrated when the task was interrupted for any reason, but they reacted more strongly and with more resentment, tattling on their partner and becoming more emotionally aroused, when they thought their partner had broken the joint commitment (the selfish condition). “This illustrates children’s growing understanding of norms of cooperative co-existence and the obligations they entail,” explains Svetlova.

    When a partner’s defection happened because of a reason outside his or her control (the accidental condition) or because the partner didn’t know how to do the task (the ignorant condition), the children were less upset, the researchers found. In the latter condition, children were more likely to teach their partner how to do the task and less likely to protest the interruption.

    “The outcome of an unsuccessful collaboration was the same in all three conditions, but the children’s reactions were drastically different,” adds Svetlova. “That tells us that the children in our study correctly inferred and addressed the underlying intentions of their partners.”

    “Our findings have important implications for parents and preschool teachers,” suggests Ulrike Kachel, Ph.D. student at the Max Planck Institute for Evolutionary Anthropology, who led the study. “Young children have a stronger foundation in terms of their understanding of joint commitments than has previously been understood. Those working with children in the home and in classrooms can build on this foundation by incorporating joint activities, such as cooperative, interdependent tasks, into children’s interactions.”


  4. How Pokémon GO can help students build stronger communication skills

    May 17, 2017 by Ashley

    From the Iowa State University press release:

    Technology continues to change the way students learn and engage with their peers, parents and community. That is why Emily Howell, an assistant professor in Iowa State University’s School of Education, is working with teachers to develop new ways to incorporate digital tools in the classroom, including playing games such as Pokémon GO.

    The focus of Howell’s work is two-fold: to give students equitable access to technology and to help them build multimodal communication skills. That means not only using technology to consume information or replace traditional classroom tools, but experimenting with new forms of communication, she said. Rather than simply reading a book on a tablet or typing an assignment on a computer, students need to learn how to create and upload videos or build graphics and maps to convey their message.

    Howell’s suggestion of having students play Pokémon GO to build these skills may seem a bit unconventional. However, after playing the smartphone game with her own children, she saw how it could help students with writing and research in ways that align with Common Core standards. Howell says engaging students through Pokémon GO, a game many are already playing outside the classroom, also generates interest and connects students to their work.

    “It is important to give students authentic choices that really have meaning in their lives,” Howell said. “We need to encourage them to develop questions, research the answers and then share that information in writing.”

    For example, a common assignment is to have elementary students write an essay on how to make a peanut butter and jelly sandwich — a task students can easily explain, but not a genuine question many have, Howell said. Pokémon GO, like many video games, provides players with limited information or what Howell describes as “just in time learning.” As a result, players have questions about how to use certain tools or advance to the next level.

    Playing the game with her own children, Howell watched their enthusiasm in researching and finding the answers to these questions. They were even more excited to share their knowledge with her and their grandmother, who was also playing the game. In a paper published in the journal The Reading Teacher, Howell explains how teachers can have students identify questions about Pokémon GO, find the answers and present their findings in different formats.

    Using different modes of communication

    Pokémon GO incorporates different modes of communication — gestures, visuals and directions — which makes it a good fit for the classroom, Howell said. Players see the character on their phone, the character is integrated into a map and the player controls catching the character. Pokémon GO illustrates the need to understand multimodal text, which reflects how we communicate with others, she said.

    “We don’t just send a text or email; we have a live chat or video conferences. Anytime teachers can find something that students are already doing, and comes in multimodal form, they can harness that interest and teach students about the tool’s potential,” Howell said.

    Even more than with conventional tools such as pen and paper, teachers must provide a framework for using digital tools. Howell says students need to understand conventional literacy skills, but also learn how to upload files or design elements on a page that are not in a linear progression.

    “It’s not just giving students the technology and letting them play, it’s really guiding that interaction so they can express meaning,” Howell said.

    Providing a safe, online forum

    To make the assignment even more authentic, Howell suggests giving students an outlet to share their work with people outside of the classroom. Many school districts create secure, online platforms where students can share work with family and friends and receive feedback. Knowing that others will view their work helps students develop writing styles for different audiences, not just their teacher, Howell said.

    “It makes the assignment more authentic and helps with motivation and understanding the purpose for writing,” she said. “It has academic as well as social benefits.”

    Howell received a grant from the Center for Educational Transformation at the University of Northern Iowa to help elementary teachers in Iowa integrate technology into their writing lessons. The goal is to engage students in writing so that they are using digital tools to create content, rather than strictly consume information.


  5. Study suggests languages we speak may influence factors related to ability to read

    May 15, 2017 by Ashley

    From the FECYT – Spanish Foundation for Science and Technology press release:

    The way bilingual people read is conditioned by the languages they speak. This is the main conclusion reached by researchers at the Basque Center on Cognition, Brain and Language (BCBL) after reviewing the existing scientific literature and comparing this information to the findings of studies at their own centre.

    The scientists found that the languages spoken by bilingual people (when they learned to read in two languages at the same time) affect their reading strategies and even the cognitive foundations that form the basis for the capacity to read. This discovery could have implications for clinical and education practice.

    “Monolingual speakers of transparent languages — where letters are pronounced the same independently of the word they are included in, such as Basque or Spanish — have a greater tendency to use analytical reading strategies, where they read words in parts,” according to Marie Lallier, one of the authors of the article.

    On the other hand, speakers of opaque languages, where the sounds of letters differ depending on the word (for example English or French) are more likely to use a global reading strategy. In other words, they tend to read whole words to understand their meaning.

    Nonetheless, the BCBL researchers have observed that bilingual people who learn to read two languages at the same time do not read the same way as monolingual speakers; rather, they follow a different pattern which had not previously been described.

    According to the literature review, recently published in Psychonomic Bulletin and Review, a contamination effect takes place between the two reading strategies in speakers of two languages. Therefore, a person learning to read in Spanish and in English will have a greater tendency towards a global strategy, even when reading in Spanish, than a monolingual Spanish speaker. This effect is caused by the influence of the second language.

    When reading in English, on the contrary, they will tend towards a more analytical strategy (reading by parts) than monolingual English speakers, due to “contagion” from Spanish. “The brains of bilingual people adjust in accordance with what they learn, applying the strategies needed to read in one language to their reading in the other language,” Lallier adds.

    Associated cognitive processes

    The researchers had previously described the principal strategies used by monolingual speakers of various languages. However, the way reading strategies are modified in bilingual people when they learn to read in two languages had never before been identified.

    The scientists at the San Sebastian centre believe that learning to read in two languages with different characteristics from the mother tongue also causes a change in the cognitive processes that are the basis of reading acquisition, such as visual attention or auditory phonological processing.

    In other words, learning to read in an opaque language (such as English or French) reinforces our capacity to rapidly process many visual elements, because whole words must be deciphered to achieve fluent reading in these languages.

    As transparent languages place a much greater focus on letter-sound correspondence, learning to read in these languages is thought to improve our sensitivity in perceiving the sounds of the language.

    Application in diagnosing dyslexia

    The paper’s authors consider their findings to have implications at different levels. From an educational standpoint, they allow better understanding of how bilingual populations learn to read and what type of strategies are most advisable to help pupils learn based on the languages they know.

    The new discovery could also help in the diagnosis and assessment of dyslexia and other reading problems. “Language learning cannot provoke more cases of dyslexia, as the disorder is usually caused by neurogenetic factors. Our theory suggests that more language learning could make the symptoms more visible, and vice versa. This depends on the combination of languages being learned,” Lallier explains.

    The languages a child knows are therefore important when identifying potential disorders, as this information can explain certain mistakes made when reading.

    “Our experience with languages modulates our capacity to read. This should be taken into account when teaching bilingual children to read, and if any reading problems, such as dyslexia, should appear. We need to establish specific scales for the diagnosis of dyslexia in bilingual people, because their circumstances are different,” the expert concludes.


  6. How do toddlers learn best from touchscreens?

    May 14, 2017 by Ashley

    From the Frontiers press release:

    Educational apps for kids can be valuable learning tools, but there is still a lot left to understand about how best to design them, according to a report in Frontiers in Psychology.

    “Our experiments are a reminder that just because touchscreens allow for physical interaction, it doesn’t mean that it’s always beneficial,” says Dr. Colleen Russo-Johnson, lead author of the study, who completed this work as a graduate student at Vanderbilt University.

    Smartphones and tablets have become so pervasive that, even in lower-income households, 90% of American children have used a touchscreen by the age of 2. Eighty percent of educational apps in the iTunes store are designed for children — especially toddlers and preschoolers. But recent research has shown that sometimes all those chimes and animations hinder learning, prompting the question, how well do we understand what it takes to make a truly beneficial learning app?

    “Children interact with touch screens and the embedded media content in vastly different ways and this impacts their ability to learn from the content,” says Russo-Johnson. “Our experiment focused on how children interacted with touchscreen devices — on a more basic level — by stripping away fancy design features that vary from app to app and that are not always beneficial.”

    Using a custom-made, streamlined learning app, Russo-Johnson and her colleagues showed that children as young as 2 could use the app to learn new words such as the fictional names of a variety of newly-introduced toys (designed specifically for the study). Unsurprisingly, slightly older children (age 4 to 5) were able to learn more than the younger ones (age 2 to 3) and they were also able to follow directions better — such as only tapping when instructed to do so.

    The researchers went on to show that the excessive tapping by younger children seemed to go hand-in-hand with lower scores on a trait called self-regulation. As in this study, self-regulation is commonly measured by seeing how long children can keep themselves from eating a cracker that is placed in front of them, after they have been told to wait until they hear a signal that it is okay to eat it.

    To complement this first study (which included 77 children), Russo-Johnson and her colleagues designed a second app to see which interactions — tapping, dragging, or simply watching — were better for learning new words.

    Somewhat surprisingly, across this next group of 170 2- to 4-year-olds, no single type of interaction consistently proved to be the best. But there were differences depending on age, gender, and the extent of prior exposure to touchscreens at home. Boys appeared to benefit more from watching, whereas dragging seemed best for girls and for children with the most touchscreen experience.

    These results complement the growing body of research on identifying effective interactive features and provide insight into how apps might be tailored to fit the learning needs of different children.

    “I hope that this research will inform academics and app developers alike,” says Russo-Johnson. “Educational app developers should be mindful of utilizing interactivity in meaningful ways that don’t distract from the intended educational benefits, and, when possible, allow for customization so parents and educators can determine the best settings for their children.”


  7. Parents’ motivation influences students’ academic outcomes

    May 13, 2017 by Ashley

    From the University of Tübingen press release:

    Whether parental help has positive or negative effects on students’ academic outcomes depends on the motivation and involvement of their parents. Results of a study conducted by the Hector Research Institute of Education Sciences and Psychology suggest that students whose parents are interested in math and perceive their own math competencies to be high perform better than students with parents who show a low interest in math and regard their competencies in the domain as equally low — regardless of the intensity of the help students receive at home. The results have now been published in Child Development.

    Family background plays a crucial role in the development of students’ academic motivation and achievement. Previous research suggested that parents’ academic involvement is, on average, associated with better academic outcomes, but the pattern of results was far from unequivocal, and it remained unclear what kind of help is actually helpful and what kind is harmful. For example, excessive parental involvement may be perceived by students as controlling behavior, which can have a detrimental effect on their academic confidence and, correspondingly, on achievement. Researchers at the University of Tübingen therefore set out to investigate which family characteristics have a positive effect on academic outcomes and which can be more of a hindrance. To this end, they collected data from more than 1,500 ninth-grade students and their parents.

    Parents answered questions on the degree of their academic involvement in math such as homework help, family math interest, their math competencies, their child’s need for support in math, and the time and energy they invest in their child’s academic life. Students filled out questionnaires at the beginning as well as five months later, in which they reported on their own competencies, their effort, and their interest in math. In addition, their math grades and their achievement in standardized achievement tests were assessed.

    The results confirmed the researchers’ assumption that parental involvement per se does not result in higher academic outcomes. Instead, there are very specific family characteristics that promote high achievement. “A favorable pattern of students’ academic outcomes was found when families were interested in math and perceived their own math competence to be high, regardless of their amount of academic involvement,” says Isabelle Häfner, lead author of the study. Thus, it would be problematic to attribute high or low achievement solely to whether parents help students with their homework or not.

    The most unfavorable conditions for academic achievement were found for students from deeply involved families who believed that their child needed support in math, showed low levels of family math interest, and perceived their own math competencies as low. Students from these ‘involved but unmotivated’ families not only performed poorly in math, but also showed low levels of motivation. “Helicopter moms can impair their child’s performance if they are not themselves interested in the subject they want to support their child in,” explains Häfner. This complex interplay of favorable and unfavorable factors in students’ academic achievement will be investigated in further studies.


  8. Surprise communication found between brain regions involved in infant motor control

    May 12, 2017 by Ashley

    From the University of Iowa press release:

    A newborn’s brain is abuzz with activity.

    Day and night, it’s processing signals from all over the body, from the wriggles of the child’s own fingers and toes to the sound of mommy’s or daddy’s voice.

    Though much of how the infant brain works and develops remains a mystery, University of Iowa researchers say they have uncovered a new mode of communication between two relatively distant regions. And, it turns out that sleep is key to this communication.

    When two areas of the brain communicate, their rhythms will often synchronize. One well-known brain rhythm, the theta rhythm, is most closely associated with the hippocampus, a region in the forebrain important for consolidating memories and navigation, among other functions. In experiments with infant rats, the researchers showed for the first time that the hippocampus oscillates in lockstep with the red nucleus, a brain-stem structure that plays a major role in motor control. Importantly, the hippocampus and red nucleus synchronize almost exclusively during REM (active) sleep.

    Rats and humans both spend much of their early lives in REM sleep. In human newborns, eight hours of every day is occupied by REM sleep alone. And because rat brains and human brains have the same basic structure, UI researchers believe the same communication, between the same regions, is likely occurring in human infants. They also suspect disruptions to that linkage may contribute to the motor-control problems that often accompany disorders such as autism and schizophrenia.

    “Our findings provide a possible route to understanding the early emergence of motor problems in human infants. Because we found that communication between the hippocampus and red nucleus occurred primarily during REM sleep, disrupting normal sleep in early infancy could interfere with the strengthening of the communication links among forebrain and brainstem structures,” says Mark Blumberg, a professor in the UI Department of Psychological and Brain Sciences and corresponding author on the study, published in the journal Current Biology.

    “We feel this work opens new doors to a host of important questions that have been largely overlooked,” Blumberg says.

    Carlos Del Rio-Bermudez, a Fulbright Scholar who joined Blumberg’s lab for his doctoral studies, says he was surprised by the findings.

    “Although a lot is known about the theta rhythm in the hippocampus and other forebrain structures, no one seems to have suspected that it might also be involved in communication between the hippocampus and a brain-stem structure like the red nucleus,” Del Rio-Bermudez says.

    According to Blumberg, this discovery supports the idea that REM sleep is important for early brain development and that brain rhythms play a significant role in this process.

    The researchers point out that an infant’s red nucleus and other similar structures contribute heavily to motor control at a time in development when other brain structures, including the motor cortex, are still developing.

    Considering the many similarities in the brain and behavior of infant rats and humans, “it would be extraordinary if similar events are not also happening in us,” Blumberg says.


  9. Handheld screen time linked with speech delays in young children

    May 11, 2017 by Ashley

    From the American Academy of Pediatrics press release:

    As the number of smart phones, tablets, electronic games and other handheld screens in U.S. homes continues to grow, some children begin using these devices before beginning to talk. New research being presented at the 2017 Pediatric Academic Societies Meeting suggests these children may be at higher risk for speech delays.

    Researchers will present the abstract, “Is handheld screen time use associated with language delay in infants?” on Saturday, May 6 at the Moscone West Convention Center in San Francisco. The study included 894 children between the ages of 6 months and 2 years who participated in TARGet Kids!, a practice-based research network in Toronto, between 2011 and 2015.

    By their 18-month check-ups, 20 percent of the children had daily average handheld device use of 28 minutes, according to their parents. Based on a screening tool for language delay, researchers found that the more handheld screen time a child’s parent reported, the more likely the child was to have delays in expressive speech. For each 30-minute increase in handheld screen time, researchers found a 49 percent increased risk of expressive speech delay. There was no apparent link between handheld device screen time and other communications delays, such as social interactions, body language or gestures.
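    If the reported figure behaves like a multiplicative risk (or odds) ratio of roughly 1.49 per 30 minutes, which is an assumption on our part since the abstract does not spell out the underlying statistical model, then the increase compounds across larger increments of daily screen time. A minimal sketch of that arithmetic, in Python:

        # Hedged sketch: how a "49 percent increased risk per 30 minutes" would
        # compound, ASSUMING it acts as a multiplicative risk/odds ratio of
        # ~1.49 per 30-minute increment (the abstract does not specify the model).
        risk_ratio_per_30_min = 1.49

        for extra_minutes in (30, 60, 90):
            increments = extra_minutes / 30
            relative_risk = risk_ratio_per_30_min ** increments
            print(f"{extra_minutes} extra min/day -> ~{relative_risk:.2f}x baseline risk")

    Under that assumption, an extra hour of daily handheld screen time would correspond to roughly 2.2 times the baseline risk rather than simply 49 percent more, which is why the per-30-minute framing matters.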

    “Handheld devices are everywhere these days,” said Dr. Catherine Birken, MD, MSc, FRCPC, the study’s principal investigator and a staff pediatrician and scientist at The Hospital for Sick Children (SickKids). “While new pediatric guidelines suggest limiting screen time for babies and toddlers, we believe that the use of smartphones and tablets with young children has become quite common. This is the first study to report an association between handheld screen time and increased risk of expressive language delay.”

    Dr. Birken said the results support a recent policy recommendation by the American Academy of Pediatrics to discourage any type of screen media in children younger than 18 months. More research is needed, she said, to understand the type and content of the screen activities infants engage in, to explore mechanisms behind the apparent link between handheld screen time and speech delay (such as time spent together with parents on handheld devices), and to understand the impact on in-depth and longer-term communication outcomes in early childhood.


  10. Reading with children starting in infancy gives lasting literacy boost

    May 10, 2017 by Ashley

    From the American Academy of Pediatrics press release:

    New research at the 2017 Pediatric Academic Societies Meeting shows that reading books with a child beginning in early infancy can boost vocabulary and reading skills four years later, before the start of elementary school.

    The abstract, “Early Reading Matters: Long-term Impacts of Shared Bookreading with Infants and Toddlers on Language and Literacy Outcomes,” will be presented on Monday, May 8, at the Moscone West Convention Center in San Francisco.

    “These findings are exciting because they suggest that reading to young children, beginning even in early infancy, has a lasting effect on language, literacy and early reading skills,” said Carolyn Cates, PhD, lead author and research assistant professor in the department of pediatrics at New York University (NYU) School of Medicine. “What they’re learning when you read with them as infants,” she said, “still has an effect four years later when they’re about to begin elementary school.”

    Mothers and their babies were recruited from the newborn nursery of an urban public hospital, with more than 250 pairs monitored between the ages of 6 months and 4 and a half years (54 months) for how well the children could understand words and for early literacy and reading skills. The study was funded by the Eunice Kennedy Shriver National Institute of Child Health and Human Development.

    These outcomes were compared with the quantity of shared book-reading, measured by factors such as the number of books in the home and the days per week spent reading together. Quality of shared book-reading was gauged by asking whether parents had conversations with their child about the book while reading, whether they talked about or labeled the pictures and the emotions of the characters in the book, and whether the stories were age-appropriate.

    Adjusting for socioeconomic differences, the researchers found that reading quality and quantity of shared book-reading in early infancy and toddlerhood predicted child vocabulary up to four years later, prior to school entry. Book-reading quality during early infancy, in particular, predicted early reading skills while book-reading quantity and quality during toddler years appeared strongly tied to later emergent literacy skills, such as name-writing at age 4.

    The results highlight the importance of parenting programs used in pediatric primary care that promote shared book-reading soon after birth, Dr. Cates said, such as Reach Out and Read and the Video Interaction Project.

    Dr. Cates will present the abstract at 8:15 a.m. on Monday, May 8.