1. Now or later: How taste and sound affect when you buy

    July 16, 2017 by Ashley

    From the Brigham Young University press release:

    There’s a reason marketers make appeals to our senses; the “snap, crackle and pop” of Rice Krispies makes us want to buy the cereal and eat it. But as savvy as marketers are, they may be missing a key ingredient in their campaigns.

    New research finds the type of sensory experience an advertisement conjures up in our mind — taste and touch vs. sight and sound — has a fascinating effect on when we make purchases.

    The study, led by marketing professors at Brigham Young University and the University of Washington, finds that advertisements highlighting more distal sensory experiences (sight/sound) lead people to delay purchasing, while those highlighting more proximal sensory experiences (touch/taste) prompt earlier purchases.

    “Advertisers are increasingly aware of the influence sensory cues can play,” said lead author Ryan Elder, associate professor of marketing at BYU. “Our research dives into which specific sensory experiences will be most effective in an advertisement, and why.”

    Elder, with fellow lead author Ann Schlosser, a professor of marketing at the University of Washington, Morgan Poor, assistant professor of marketing at San Diego State University, and Lidan Xu, a doctoral student at the University of Illinois, carried out four lab studies and a pilot study involving more than 1,100 study subjects for the research, published in the Journal of Consumer Research.

    Time and time again, their experiments found that people caught up in the taste or touch of a product or event were more likely to want it sooner.

    In one experiment, subjects read one of two reviews for a fictional restaurant: One focused on taste/touch, the other emphasized sound/vision. Participants were then asked to make a reservation at the restaurant on a six-month interactive calendar. Those who read the review focusing on the more proximal senses (taste and touch) were significantly more likely to make a reservation closer to the present date.

    In another experiment, study subjects read ad copy for a summer festival taking place either this weekend or next year. Two versions of the ad copy existed: one emphasizing taste (“You will taste the amazing flavors…”) and one emphasizing sound (“You will listen to the amazing sounds…”).

    When subjects were asked when they would like to attend, those who read the ad copy about taste had a higher interest in attending a festival this weekend. Those who read ads emphasizing sounds were more likely to have interest in attending the festival next year.

    “If an advertised event is coming up soon, it would be better to highlight the more proximal senses of taste or touch — such as the food served at the event — than the more distal senses of sound and sight,” Schlosser said. “This finding has important implications for marketers, especially those of products that are multi-sensory.”

    As part of the study, the researchers also gained an interesting insight into what makes restaurant reviews more helpful. In their field study, the authors analyzed 31,889 Yelp reviews to see if they could find connections between the sensory elements of a reviewer’s experience and the usefulness of a review.

    They found reviews from people who emphasized a more distal sense (such as sight) were rated more useful when the review used the past tense (“We ate here last week and…”), while people emphasizing a proximal sense (touch) had more useful reviews when they used the present tense (“I’m eating this right now and it is so good!”).
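
    The release does not describe how the sensory emphasis of each of those 31,889 reviews was coded. As a purely illustrative sketch, a simple word-count approach could flag whether a review leans on proximal or distal sensory language; the word lists and function below are hypothetical, not the authors' method.

        # Hypothetical word lists; the paper's actual coding scheme is not given in the release.
        PROXIMAL = {"taste", "tasted", "flavor", "texture", "touch", "crispy", "creamy"}
        DISTAL = {"see", "saw", "look", "looked", "view", "hear", "heard", "sounds"}

        def dominant_sense(review_text):
            """Classify a review as leaning proximal (taste/touch) or distal (sight/sound)."""
            words = [w.strip(".,!?") for w in review_text.lower().split()]
            prox = sum(w in PROXIMAL for w in words)
            dist = sum(w in DISTAL for w in words)
            if prox > dist:
                return "proximal"
            if dist > prox:
                return "distal"
            return "mixed"

        print(dominant_sense("I'm eating this right now and the texture is so creamy!"))  # proximal
        print(dominant_sense("We ate here last week and the view looked amazing."))       # distal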

    “Sensory marketing is increasingly important in today’s competitive landscape. Our research suggests new ways for marketers to differentiate their products and services, and ultimately influence consumer behavior,” Elder said. “Marketers need to pay closer attention to which sensory experiences, both imagined and actual, are being used.”


  2. Taking photos of experiences boosts visual memory, impairs auditory memory

    July 12, 2017 by Ashley

    From the Association for Psychological Science press release:

    A quick glance at any social media platform will tell you that people love taking photos of their experiences — whether they’re lying on the beach, touring a museum, or just waiting in line at the grocery store. New research shows that choosing to take photos may actually help us remember the visual details of our encounters.

    The findings are published in Psychological Science, a journal of the Association for Psychological Science.

    “Our research is novel because it shows that photo-taking itself improves memory for visual aspects of an experience but can hurt memory for nonvisual aspects, like auditory details,” the authors say.

    This research was conducted by Alixandra Barasch (New York University Stern School of Business), Kristin Diehl (USC Marshall School of Business), Jackie Silverman (The Wharton School of the University of Pennsylvania), and Gal Zauberman (Yale School of Management).

    Previous research has suggested that being able to take photographs or consult the Internet may allow us to outsource our memory, freeing up cognitive resources but potentially impairing our ability to remember.

    Barasch, Diehl, Silverman, and Zauberman hypothesized that this offloading effect may hold for factual information, but might not apply when it comes to the experiences we deliberately choose to photograph.

    “People take photos specifically to remember these experiences, whether it’s a fun dinner with friends, a sightseeing tour, or something else,” they argue.

    Of course, the reality is that most of the photos we take will probably never get a second glance. The researchers wondered: How well do we remember the experiences we photograph if we never revisit the photos? Furthermore, does taking photos affect memory for what we saw differently than for what we heard?

    In one experiment, the researchers had 294 participants tour a real-life museum exhibit of Etruscan artifacts. The participants stashed their belongings before starting the tour but some were allowed to keep a camera on them. Those with a camera could photograph anything they wanted in the exhibit and were told to take at least 10 photos. As the participants toured the exhibit, they listened to an accompanying audio guide.

    At the end of the tour, they answered multiple-choice questions asking them to identify objects they had seen or complete factual statements from the audio guide.

    The results showed that those who took photos visually recognized more of the objects compared with those who didn’t have a camera. But they also remembered less auditory information than their camera-less peers.

    These findings provided evidence that taking pictures can enhance visual memory. To test their hypotheses in a more controlled environment, the researchers designed a virtual art-gallery tour. Participants navigated through the gallery on screen as they would in real life and some were able to take pictures of what they saw on screen by clicking an on-screen button.

    Again, participants who were able to take pictures were better at recognizing what they saw and worse at remembering what they heard, compared to those who couldn’t take pictures.

    When the researchers examined visual memory for specific objects, they found that participants who were able to take pictures performed better on visual memory tasks regardless of whether the objects in question were the most or least photographed. Photo-takers even had better visual memory for aspects of the exhibit they didn’t photograph, compared with participants who weren’t able to take pictures.

    “These findings suggest that having a camera changes how people approach an experience in a fundamental way,” the authors say. “Even when people don’t take a photo of a particular object, like a sculpture, but have a camera with them and the intention to take photos, they remember that sculpture better than people who did not have a camera with them.”

    Pooling findings from all four studies, the researchers found that taking photos had a reliably positive effect on visual memory and a smaller but reliable negative effect on auditory memory.

    Even participants who thought their photos would be deleted and those who were instructed to “mentally take a photo” showed enhanced visual memory and impaired auditory memory relative to participants who couldn’t take pictures.

    Together, these experiments suggest that photographing our experiences doesn’t outsource our memory so much as it focuses it, funneling our attention toward visual aspects of our experiences and away from others.


  3. When lovers touch, their breathing, heartbeat syncs, pain wanes

    July 10, 2017 by Ashley

    From the University of Colorado at Boulder press release:

    Fathers-to-be, take note: You may be more useful in the labor and delivery room than you realize.

    That’s one takeaway from a study released last week that found that when an empathetic partner holds the hand of a woman in pain, their heart and respiratory rates sync and her pain dissipates.

    “The more empathic the partner and the stronger the analgesic effect, the higher the synchronization between the two when they are touching,” said lead author Pavel Goldstein, a postdoctoral pain researcher in the Cognitive and Affective Neuroscience Lab at CU Boulder.

    The study of 22 couples, published in the journal Scientific Reports last week, is the latest in a growing body of research on “interpersonal synchronization,” the phenomenon in which individuals begin to physiologically mirror the people they’re with.

    Scientists have long known that people subconsciously sync their footsteps with the person they’re walking with or adjust their posture to mirror a friend’s during conversation. Recent studies also show that when people watch an emotional movie or sing together, their heart rates and respiratory rhythms synchronize. When leaders and followers have a good rapport, their brainwaves fall into a similar pattern. And when romantic couples are simply in each other’s presence, their cardiorespiratory and brainwave patterns sync up, research has shown.
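
    The release does not specify how heart-rate and breathing synchrony were computed. One common way such coupling is quantified in this literature, offered here only as a sketch and not as the authors' method, is to correlate the two partners' equally sampled heart-rate series while allowing for a small lag:

        import numpy as np

        def max_lagged_correlation(hr_a, hr_b, max_lag=10):
            """Peak Pearson correlation between two equally sampled heart-rate
            series, searched over a small range of lags (in samples)."""
            best = -1.0
            for lag in range(-max_lag, max_lag + 1):
                if lag < 0:
                    a, b = hr_a[:lag], hr_b[-lag:]
                elif lag > 0:
                    a, b = hr_a[lag:], hr_b[:-lag]
                else:
                    a, b = hr_a, hr_b
                best = max(best, np.corrcoef(a, b)[0, 1])
            return best

        # Hypothetical beats-per-minute samples for two partners:
        partner_a = np.array([72, 74, 75, 77, 76, 75, 74, 73, 72, 71, 70, 69.0])
        partner_b = np.array([70, 71, 73, 75, 76, 75, 74, 74, 73, 72, 71, 70.0])
        print(max_lagged_correlation(partner_a, partner_b, max_lag=3))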

    The new study, co-written with University of Haifa Professor Simone Shamay-Tsoory and Assistant Professor Irit Weissman-Fogel, is the first to explore interpersonal synchronization in the context of pain and touch. The authors hope it can inform the discussion as health care providers seek opioid-free pain relief options.

    Goldstein came up with the idea after witnessing the birth of his daughter, now 4.

    “My wife was in pain, and all I could think was, ‘What can I do to help her?’ I reached for her hand and it seemed to help,” he recalls. “I wanted to test it out in the lab: Can one really decrease pain with touch, and if so, how?”

    Goldstein recruited 22 long-term heterosexual couples, age 23 to 32, and put them through a series of tests aimed at mimicking that delivery-room scenario.

    Men were assigned the role of observer; women the pain target. As instruments measured their heart and breathing rates, they: sat together, not touching; sat together holding hands; or sat in separate rooms. Then they repeated all three scenarios as the woman was subjected to a mild heat pain on her forearm for 2 minutes.

    As in previous trials, the study showed couples synced physiologically to some degree just sitting together. But when the woman was subjected to pain and her partner couldn’t touch her, that synchronization was severed. When he was allowed to hold her hand, their rates fell into sync again and her pain decreased.

    “It appears that pain totally interrupts this interpersonal synchronization between couples,” Goldstein said. “Touch brings it back.”

    Goldstein’s previous research found that the more empathy the man showed for the woman (as measured in other tests), the more her pain subsided during touch. The more physiologically synchronized they were, the less pain she felt.

    It’s not clear yet whether decreased pain is causing increased synchronicity, or vice versa.

    “It could be that touch is a tool for communicating empathy, resulting in an analgesic, or pain-killing, effect,” said Goldstein.

    Further research is necessary to figure out how a partner’s touch eases pain. Goldstein suspects interpersonal synchronization may play a role, possibly by affecting an area of the brain called the anterior cingulate cortex, which is associated with pain perception, empathy, and heart and respiratory function.

    The study did not explore whether the same effect would occur with same-sex couples, or what happens when the man is the subject of pain. Goldstein did measure brainwave activity and plans to present those results in a future study.

    He hopes the research will help lend scientific credence to the notion that touch can ease pain.

    For now, he has some advice for partners in the delivery room: Be ready and available to hold your partner’s hand.


  4. Detecting social signals may have affected how we see colors

    June 29, 2017 by Ashley

    From the New York University press release:

    The arrangement of the photoreceptors in our eyes allows us to detect socially significant color variation better than other types of color vision, a team of researchers has found. Specifically, our color vision is superior at spotting “social signaling,” such as blushing or other facial color changes — even when compared to the type of color vision that we design for digital cameras and other photographic devices.

    “Our color vision is very strange,” says James Higham, an assistant professor in NYU’s Department of Anthropology and one of the study’s co-authors. “Our green receptor and our red receptor detect very similar colors. One would think that the ideal type of color vision would look different from ours, and when we design color detection, such as for digital cameras, we construct a different type of color vision. However, we’ve now shown that when it comes to spotting changes in color linked to social cues, humans outshine the type of color vision we’ve designed for our technologies.”

    The study, which appears in the journal Proceedings of the Royal Society B: Biological Sciences, focuses on trichromatic color vision — that is, how we process the colors we see, based on comparisons among how red, green, and blue they are.

    One particularly interesting thing about how our visual system is structured is how significantly it differs from that of cameras. Notably, the green and red photoreceptors we use for color vision are placed very close together; by contrast, the equivalent components in cameras are situated with ample (and even) spacing among them. Given that cameras are designed to optimally capture color, many have concluded that their ability to detect an array of colors should be superior to that of humans and other primates — and wondered why our vision is the way it is.
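
    To make the spacing point concrete: the human L ("red") and M ("green") cones have peak sensitivities only a few tens of nanometers apart, whereas an evenly spaced three-channel sensor would place its middle channel much farther from the ends. The peak wavelengths below are approximate textbook values, not figures from this study.

        # Approximate peak sensitivities of human cone photoreceptors, in nanometers
        # (rounded textbook values, not taken from the paper).
        S_PEAK, M_PEAK, L_PEAK = 420, 534, 564

        print("L-M spacing:", L_PEAK - M_PEAK, "nm")   # ~30 nm: red and green overlap heavily
        print("M-S spacing:", M_PEAK - S_PEAK, "nm")   # ~114 nm
        print("evenly spaced middle channel:", (S_PEAK + L_PEAK) // 2, "nm")  # 492 nm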

    One idea that has been well-studied is related to foraging. It hypothesizes that primate color vision allows us to distinguish between subtle shades of green and red, which is useful, for example, when fruit are ripening against green leaves in a tree. An alternative hypothesis relates to the fact that both humans and other primates must be able to spot subtle changes in facial color in social interactions. For instance, some monkey species display red signals on their faces and genitals that change color during mating and in social interactions. Similarly, humans exhibit facial color changes such as blushing, which are socially informative signals.

    In their study, the researchers had 60 human subjects view a series of digital photographs of female rhesus macaque monkeys. These primates’ facial color has been known to change with their reproductive status, with female faces becoming redder when they are ready to mate. This process, captured in the series of photographs, provides a good model for testing the ability to not only detect colors, but also to spot those linked to social cues — albeit across two species.

    The scientists developed software that replicated, in different sets of photographs, how colors look under different types of color vision, including different forms of color blindness and the type of trichromatic vision seen in many artificial systems, with even spacing of the green and red photoreceptors. Some of the study’s subjects viewed photos of the transformation of the monkeys’ faces as a human or other primate would see them, while others saw the pictures as a color-blind person would, and still others as a camera would. While viewing, the subjects had to discriminate between the different colors exhibited by the monkeys in the photos.

    Overall, the subjects viewing the images using the human/primate visual system more accurately and more quickly identified changes in the monkeys’ face coloring.

    “Humans and many other primates have an unusual type of color vision, and no one is sure why,” first author Chihiro Hiramatsu of Japan’s Kyushu University notes. “Here, we provide one of the first experimental tests of the idea that our unusual vision might be related to detecting social signals in the faces of others.”

    “But, perhaps more importantly, these results support a rarely tested idea that social signaling itself, such as the need to detect blushing and facial color changes, might have had a role in the evolution or maintenance of the unusual type of color vision shown in primates, especially those with conspicuous patches of bare skin, including humans, macaques, and many others,” concludes co-author Amanda Melin of the University of Calgary.


  5. Study examines reasons for eye contact difficulties in autism

    June 28, 2017 by Ashley

    From the Massachusetts General Hospital press release:

    Individuals with autism spectrum disorder (ASD) often find it difficult to look others in the eyes. This avoidance has typically been interpreted as a sign of social and personal indifference, but reports from people with autism suggest otherwise. Many say that looking others in the eye is uncomfortable or stressful for them — some will even say that “it burns” — all of which points to a neurological cause. Now, a team of investigators based at the Athinoula A. Martinos Center for Biomedical Imaging at Massachusetts General Hospital has shed light on the brain mechanisms involved in this behavior. They reported their findings in a Scientific Reports paper published online this month.

    “The findings demonstrate that, contrary to what has been thought, the apparent lack of interpersonal interest among people with autism is not due to a lack of concern,” says Nouchine Hadjikhani, MD, PhD, director of neurolimbic research in the Martinos Center and corresponding author of the new study. “Rather, our results show that this behavior is a way to decrease an unpleasant excessive arousal stemming from overactivation in a particular part of the brain.”

    The key to this research lies in the brain’s subcortical system, which is responsible for the natural orientation toward faces seen in newborns and is important later for emotion perception. The subcortical system can be specifically activated by eye contact, and previous work by Hadjikhani and colleagues revealed that, among those with autism, it was oversensitive to effects elicited by direct gaze and emotional expression. In the present study, she took that observation further, asking what happens when those with autism are compelled to look in the eyes of faces conveying different emotions.

    Using functional magnetic resonance imaging (fMRI), Hadjikhani and colleagues measured differences in activation within the face-processing components of the subcortical system in people with autism and in control participants as they viewed faces either freely or when constrained to viewing the eye region. While the two groups exhibited similar activation of these structures during free viewing, overactivation was observed in participants with autism when they were constrained to concentrating on the eye region. This was especially true with fearful faces, though similar effects were observed when they viewed happy, angry and neutral faces.

    The findings of the study support the hypothesis of an imbalance between the brain’s excitatory and inhibitory signaling networks in autism — excitatory refers to neurotransmitters that stimulate the brain, while inhibitory refers to those that calm it and provide equilibrium. Such an imbalance, likely the result of diverse genetic and environmental causes, can strengthen excitatory signaling in the subcortical circuitry involved in face perception. This in turn can result in an abnormal reaction to eye contact, an aversion to direct gaze and consequently abnormal development of the social brain.

    In revealing the underlying reasons for eye-avoidance, the study also suggests more effective ways of engaging individuals with autism. “The findings indicate that forcing children with autism to look into someone’s eyes in behavioral therapy may create a lot of anxiety for them,” says Hadjikhani, an associate professor of Radiology at Harvard Medical School. “An approach involving slow habituation to eye contact may help them overcome this overreaction and be able to handle eye contact in the long run, thereby avoiding the cascading effects that this eye-avoidance has on the development of the social brain.”

    The researchers are already planning to follow up the research. Hadjikhani is now seeking funding for a study that will use magnetoencephalography (MEG) together with eye-tracking and other behavioral tests to probe more deeply the relationship between the subcortical system and eye contact avoidance in autism.


  6. How we can tell where a sound is coming from

    by Ashley

    From the University College London press release:

    A new UCL and University of Nottingham study has found that most neurons in the brain’s auditory cortex detect where a sound is coming from relative to the head, but some are tuned to a sound source’s actual position in the world.

    The study, published in PLOS Biology, looked at whether head movements change the responses of neurons that track sound location.

    “Our brains can represent sound location in either an egocentric manner — for example, when I can tell that a phone is ringing to my left — or in an allocentric manner — hearing that the phone is on the table. If I move my head, neurons with an egocentric focus will respond differently, as the phone’s position relative to my ears has changed, while the allocentric neurons will maintain their response,” said the study’s first author, Dr Stephen Town (UCL Ear Institute).
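
    As a toy illustration of the distinction (not taken from the paper; the names and numbers are made up), the head-relative, egocentric angle of a fixed speaker changes whenever the head turns, while its world-based, allocentric position does not:

        import math

        def egocentric_angle(sound_xy, head_xy, head_heading_rad):
            """Angle of a sound source relative to the head (egocentric frame)."""
            dx = sound_xy[0] - head_xy[0]
            dy = sound_xy[1] - head_xy[1]
            world_angle = math.atan2(dy, dx)       # allocentric: fixed in the world
            return world_angle - head_heading_rad  # egocentric: shifts as the head turns

        speaker, head = (1.0, 0.0), (0.0, 0.0)     # one stationary speaker
        print(egocentric_angle(speaker, head, 0.0))           # 0.0: straight ahead
        print(egocentric_angle(speaker, head, math.pi / 2))   # -1.57: now off to one side

    A neuron with egocentric tuning would change its response between the two calls above; a neuron with allocentric tuning would not.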

    The researchers monitored ferrets while they moved around a small arena surrounded by speakers that emitted clicking sounds. Electrodes monitored the firing rates of neurons in the ferrets’ auditory cortex, while LEDs were used to track the animals’ movement.

    Among the neurons under investigation that picked up sound location, most displayed egocentric tuning, tracking where a sound source was relative to the animal’s head. But approximately 20% of the spatially tuned neurons instead tracked a sound source’s actual location in the world, independent of the ferret’s head movements.

    The researchers also found that neurons were more sensitive to sound location when the ferret’s head was moving quickly.

    “Most previous research into how we determine where a sound is coming from used participants with fixed head positions, which failed to differentiate between egocentric and allocentric tuning. Here we found that both types coexist in the auditory cortex,” said the study’s senior author, Dr Jennifer Bizley (UCL Ear Institute).

    The researchers say their findings could be helpful in the design of technologies involving augmented or virtual reality.

    “We often hear sounds presented through earphones as being inside our heads, but our findings suggest sound sources could be created to appear externally, in the world, if designers incorporate information about body and head movements,” Dr Town said.

    The study was funded by the Medical Research Council, Human Frontiers Science Foundation, Wellcome and the Biotechnology and Biological Sciences Research Council.


  7. Study suggests damage to the amygdala can make sideways faces more memorable and cause ‘emotion blindness’

    June 21, 2017 by Ashley

    From the University of Bath press release:

    People with damage to a crucial part of the brain fail to recognise facial emotions, but they unexpectedly find faces looking sideways more memorable, researchers have found.

    The findings are further evidence that damage to the amygdala affects how facial recognition and gaze perception work in unpredictable ways. Perceiving and understanding the facial cues of others is essential in human societies.

    Patients with amygdala damage, which is common in epilepsy for example, struggle to understand social signals and with everyday communication, which can lead to problems in their interactions with friends and family, in finding life partners, and in progressing in their professional careers. They often feel misunderstood, which contributes to lower levels of life satisfaction.

    Normally we tend to remember faces showing emotions such as fear or anger more readily than neutral expressions. When trying to predict others’ actions, we decipher their facial expressions and follow their gaze to understand the focus of their attention and, eventually, of their emotion. This process is important for understanding the implications of a situation for our own well-being — known as self-relevance — and for interpreting social situations and cues.

    The amygdala is particularly responsible for the processing of emotion and self-relevance. Individuals with damage to the amygdala have been observed to have emotion recognition deficits while keeping the perception of others’ eye gaze direction intact.

    But now researchers from the University of Bath, working with neurosurgeons and psychologists in Warsaw, Poland, have shown that individuals with amygdala damage remembered faces looking to the side more than those looking towards them — in contrast with previous studies.

    However, in line with previous research they didn’t remember emotive faces any better than neutral faces.

    Sylwia Hyniewska from the University of Bath said: “Surprisingly we found that individuals with amygdala damage remembered faces looking to the side more than those looking towards them. This effect was independent of the emotional content of the face. This was unexpected given that all research so far focusing on other populations showed either an interaction effect between emotion and gaze, or an improved memory for faces looking towards the observer.

    “We expected our patients to remember faces better when they were looking at them — presented with the direct gaze. However for some reason patients seem to remember faces looking away better. This means that the interaction between the processing of emotions and gaze is more complex than we thought, and not only emotions but also gaze should be studied further in this specific population to develop treatments improving these patients’ well-being.”

    The research is published in the journal Epilepsy & Behavior.

    The team showed 40 patients with mesial temporal lobe epilepsy (MTLE) and 20 healthy controls a series of faces with neutral or emotional expressions. Half of the faces were looking straight ahead, and half sideways.

    As expected, healthy participants had better recognition of emotional faces. The epilepsy patients did not remember emotional faces any better than neutral ones, but they did find faces gazing away more memorable than those looking straight ahead.


  8. How our eyes process visual cues

    June 17, 2017 by Ashley

    From the University of Queensland press release:

    The mystery of how human eyes compute the direction of moving light has been made clearer by scientists at The University of Queensland.

    Using advanced electrical recording techniques, researchers from UQ’s Queensland Brain Institute (QBI) discovered how nerve cells in the eye’s retina were integral to the process.

    Professor Stephen Williams said that dendrites — the branching processes of a neuron that conduct electrical signals toward the cell body — played a critical role in decoding images.

    “The retina is not a simple camera, but actively processes visual information in a neuronal network, to compute abstractions that are relayed to the higher brain,” Professor Williams said.

    “Previously, dendrites of neurons were thought to be passive input areas.

    “Our research has found that dendrites also have powerful processing capabilities.”

    Co-author Dr Simon Kalita-de Croft said dendritic processing enabled the retina to convert and refine visual cues into electrical signals.

    “We now know that movement of light — say, a flying bird, or a passing car — gets converted into an electrical signal by dendritic processing in the retina,” Dr Kalita-de Croft said.

    “The discovery bridges the gap between our understanding of the anatomy and physiology of neuronal circuits in the retina.”

    Professor Williams said the ability of dendrites in the retina to process visual information depended on the release of two neurotransmitters — chemical messengers — from a single class of cell.

    “These signals are integrated by the output neurons of the retina,” Professor Williams said.

    “Determining how the neural circuits in the retina process information can help us understand computational principles operational throughout the brain.

    “Excitingly, our discovery provides a new template for how neuronal computations may be implemented in brain circuits.”


  9. Auditory perception: Where microseconds matter

    June 15, 2017 by Ashley

    From the Ludwig-Maximilians-Universität München press release:

    In the mammalian auditory system, sound waves impinging on the tympanic membrane of the ear are transduced into electrical signals by sensory hair cells and transmitted via the auditory nerve to the brainstem. The spatial localization of sound sources, especially low-frequency sounds, presents the neuronal processing system with a daunting challenge, for it depends on resolving the difference between the arrival times of the acoustic stimulus at the two ears. The ear that is closer to the source receives the signal before the contralateral ear. But since this interval — referred to as the interaural timing difference (ITD) — is on the order of a few microseconds, its neuronal processing requires exceptional temporal precision. Members of the research group led by LMU neurobiologists Professor Benedikt Grothe and Dr. Michael Pecka have now uncovered a specific combination of mechanisms, which plays a crucial role in ensuring that auditory neurons can measure ITDs with the required accuracy. Their findings appear in the journal PNAS.
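
    A back-of-envelope path-length estimate (a standard textbook approximation, not a calculation from the PNAS paper; the head widths below are rough guesses) shows why the timing budget is so tight: even for a source directly to one side, the interaural difference is well under a millisecond, so discriminating neighboring directions means resolving changes of only a few microseconds.

        import math

        SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 C

        def itd_seconds(head_width_m, azimuth_deg):
            """Simple path-length estimate of the interaural timing difference."""
            return head_width_m * math.sin(math.radians(azimuth_deg)) / SPEED_OF_SOUND

        print(itd_seconds(0.03, 90) * 1e6, "us")  # gerbil-sized head: ~87 us at most
        print(itd_seconds(0.18, 90) * 1e6, "us")  # human-sized head: ~525 us at most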

    Before cells in the auditory brainstem can determine the ITD, the signals from both ears must first be transmitted to them via chemical synapses that connect them with the sensory neurons. Depending on the signal intensity, synapses themselves can introduce varying degrees of delay in signal transmission. The LMU team, however, has identified a pathway in which the synapses involved respond with a minimal and constant delay. “Indeed, the duration of the delay remains constant even when rates of activation are altered, and that is vital for the precise processing of interaural timing differences,” Benedikt Grothe explains.

    In addition, Grothe and his colleagues demonstrate that a particular structural feature of the wrapping of the signal-transmitting fibers (“axons”) by discontinuous membrane sheaths, which they first described in the journal Nature Communications in 2015, correlates with the constancy of synaptic delay in the pathway. In that study, they had found that these axons are particularly thick and that their wrapping exhibits a highly unusual pattern to enable rapid signal transmission — which is an important prerequisite for accurate measurement of minimal timing differences. Both of these features are found in mammals such as gerbils, which use ITDs for the localization of low-frequency sounds, but not in mice, which only hear high frequencies and don’t use ITDs. “Our work underlines the fact that nerve cells and neuronal circuits are anatomically and physiologically adapted for the specific nature of their biological function,” says Dr. Michael Pecka. “We assume that all mammals that are capable of perceiving low-frequency sounds make use of these structural adaptations.”


  10. Vision keeps maturing until mid-life

    by Ashley

    From the McMaster University press release:

    The visual cortex, the human brain’s vision-processing centre that was previously thought to mature and stabilize in the first few years of life, actually continues to develop until sometime in the late 30s or early 40s, a McMaster neuroscientist and her colleagues have found. Kathryn Murphy, a professor in McMaster’s department of Psychology, Neuroscience and Behaviour, led the study using post-mortem brain-tissue samples from 30 people ranging in age from 20 days to 80 years.

    Her analysis of proteins that drive the actions of neurons in the visual cortex at the back of the brain recasts previous understanding of when that part of the brain reaches maturity, extending the timeline until about age 36, plus or minus 4.5 years.

    The finding was a surprise to Murphy and her colleagues, who had expected to find that the cortex reached its mature stage by 5 to 6 years, consistent with previous results from animal samples and with prevailing scientific and medical belief.

    “There’s a big gap in our understanding of how our brains function,” says Murphy. “Our idea of sensory areas developing in childhood and then being static is part of the challenge. It’s not correct.”

    The research appears May 29 in The Journal of Neuroscience.

    Murphy says treatments for conditions such as amblyopia, or “lazy eye,” have been based on the idea that only children could benefit from corrective therapies, since it was thought that treating young adults would be pointless because they had passed the age at which their brains could respond.

    Though the research is isolated to the visual cortex, it suggests that other areas of the brain may also be much more plastic for much longer than previously thought, Murphy says.