1. Neuroscientists find differences in brain activity depending on whether people are outdoors or in a lab

    February 19, 2018 by Ashley

    From the University of Alberta press release:

    The brain acts much differently when we’re outdoors compared to when we’re inside the lab, a new study has found.

    “It happens when we’re doing normal, everyday activities, like riding a bike,” explained Kyle Mathewson, a neuroscientist in UAlberta’s Department of Psychology.

    Mathewson and his research team put EEG equipment into backpacks and had subjects perform a standard neuroscience task while riding a bike outside. The task involved identifying changes in an otherwise consistent set of stimuli, such as a higher pitch in a series of beeps. They had previously performed the same experiment on stationary bikes inside their lab, but in the new study the scientists were able to record laboratory-quality measurements of brain activity outdoors, using portable equipment.
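The task described above is a classic auditory "oddball" paradigm. A minimal sketch of how such a stimulus sequence might be generated is below; the tone frequencies, deviant probability, and trial count are illustrative assumptions, not the study's actual parameters.

```python
import random

def make_oddball_sequence(n_trials=200, standard_hz=1000, deviant_hz=1200,
                          deviant_prob=0.2, seed=0):
    """Generate a sequence of beep frequencies for an auditory oddball task.

    Most beeps are the 'standard' tone; a random minority are the
    higher-pitched 'deviant' tone the participant must detect.
    """
    rng = random.Random(seed)
    return [deviant_hz if rng.random() < deviant_prob else standard_hz
            for _ in range(n_trials)]

sequence = make_oddball_sequence()
n_deviants = sum(1 for hz in sequence if hz == 1200)
```

EEG responses to the rare deviant tones can then be averaged separately from responses to the standard tones, which is how attention-related brain signals are typically isolated in tasks like this.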

    “Something about being outdoors changes brain activity,” said Joanna Scanlon, graduate student and lead author on the study. “In addition to dividing attention between the task and riding a bike, we noticed that brain activity associated with sensing and perceiving information was different when outdoors, which may indicate that the brain is compensating for environmental distractions.”

    The great outdoors

    The study showed that our brains process stimuli, like sounds and sights, differently when we perform the same task outdoors compared to inside a lab.

    “If we can understand how and what humans are paying attention to in the real world, we can learn more about how our minds work,” said Scanlon. “We can use that information to make places more safe, like roadways.”

    “If we want to apply these findings to solve issues in our society, we need to ensure that we understand how the brain works out in the world where humans actually live, work, and play,” said Mathewson, who added that almost everything we know about the human brain is learned from studies in very tightly controlled environments.

    Next, the researchers will explore how this effect differs in outdoor environments with varying degrees of distraction, such as a quiet path or a busy roadway.


  2. Study looks at what makes children with autism less social than their peers

    February 12, 2018 by Ashley

    From the University of California – Riverside press release:

    Pick a hand, any hand. That familiar refrain, repeated in schoolyards the world over, is the basis of a simple guessing game that was recently adapted to study how and why kids with autism spectrum disorder (ASD) interact with the people around them.

    The game is the brainchild of Katherine Stavropoulos, an assistant professor of special education in the Graduate School of Education at the University of California, Riverside. As a licensed clinical psychologist with a background in neuroscience, Stavropoulos looks closely at electrical activity in the brains of children with ASD and typical development, or TD, to discern differences in the respective groups’ reward systems.

    Historically, clinicians and scientists have proposed a variety of theories to explain why kids with ASD tend to be less socially communicative than their TD peers. One popular theory, the social motivation hypothesis, suggests that kids with ASD aren’t intrinsically motivated to interact with other people because they aren’t neurologically “rewarded” by social interactions the same way TD kids are.

    “Most of us get a hit of dopamine when we interact with other people, whether it’s through making eye contact or sharing something good that’s happened to us — it feels good to be social,” Stavropoulos said. “The social motivation hypothesis says kids with autism don’t get that same reward from social interaction, so they don’t go out of their way to engage with people because it’s not rewarding for them.”

    A second theory, sensory over-responsivity — also known as the overly intense world hypothesis — posits that because kids with ASD interpret sensory cues more intensely than their TD peers, those with ASD tend to shy away from interactions they perceive as overwhelming or aversive.

    “Kids with autism often find noises too loud or lights too bright, or they find them not intense enough,” Stavropoulos said. “Most of us wouldn’t want to talk to someone whom we perceive as screaming, especially in a room that was already too bright, with ambient noise that was already too loud.” Instead, sensory over-responsivity argues, such interactions compel many individuals with ASD to withdraw from socialization as a self-soothing behavior.

    But according to Stavropoulos, who also serves as assistant director of UCR’s SEARCH Family Autism Resource Center, it may be possible for these seemingly competing theories to exist in tandem.

    Stavropoulos and UC San Diego’s Leslie Carver, her research colleague and former graduate advisor, used electrophysiology to study the neural activity of 43 children between the ages of 7 and 10 — 23 of whom were TD and 20 of whom had ASD — during a guessing game-style simulation that provided participants with both social and nonsocial rewards. Their results, published this week in the journal Molecular Autism, provide a glimpse at the brain mechanisms behind autism.

    Wearing a cap outfitted with 33 electrodes, each child sat before a computer screen showing pairs of boxes containing question marks. Much like the format of the “pick a hand” guessing game, the child then chose the box he or she thought was the “right” one (in reality, the answers were randomized).

    Stavropoulos said it was crucial to design a simulation that would allow the researchers to study participants’ neural reactions to social and nonsocial rewards during two stages: reward anticipation, or the period before the child knew whether he or she had chosen the correct answer, and reward processing, or the period immediately after.

    “We structured the game so that the kids would pick an answer, and then there would be a brief pause,” Stavropoulos said. “It was during that pause that the kids would begin to wonder, ‘Did I get it?’ and we could observe them getting excited; the more rewarding something is to a person, the more that anticipation builds.”

    Each participant played the game in two blocks. During the social block, kids who chose the right box saw a smiling face and kids who chose the wrong box saw a sad, frowning face. During the nonsocial block, meanwhile, the faces were scrambled and reformed in the shapes of arrows pointing up to denote correct answers and down to denote incorrect ones.
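The structure of the game, as described above, can be sketched as a simulation; the trial counts and feedback labels here are illustrative assumptions, but the key design point from the article is preserved: the "correct" box is randomized, so feedback does not depend on the child's actual choice.

```python
import random

def run_block(n_trials, block_type, seed=0):
    """Simulate one block of the guessing game.

    The 'correct' outcome is random, so feedback is independent of the
    child's choice. Feedback stimuli follow the block type: faces for the
    social block, arrows for the nonsocial block.
    """
    rng = random.Random(seed)
    feedback_map = {
        "social":    {True: "smiling face", False: "frowning face"},
        "nonsocial": {True: "up arrow",     False: "down arrow"},
    }
    trials = []
    for _ in range(n_trials):
        choice = rng.choice(["left", "right"])  # stand-in for the child's pick
        correct = rng.random() < 0.5            # outcome is randomized
        trials.append((choice, correct, feedback_map[block_type][correct]))
    return trials

social_trials = run_block(20, "social")
```

Randomizing the outcome ensures that every participant receives a comparable mix of rewarding and non-rewarding feedback, so differences in brain activity reflect how the reward is processed rather than how well the game is played.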

    “After the kids saw whether they were right or wrong, we were then able to observe the post-stimulus reward-related activity,” Stavropoulos said of the process, which involved comparing participants’ neural oscillation patterns. The researchers gleaned several key findings from the simulation:

    • TD kids anticipated social rewards — in this case, the pictures of faces — more strongly than kids with ASD.
    • Not only did children with ASD anticipate social rewards less than their TD peers, but within the ASD group, the researchers found that kids with more severe ASD were anticipating the nonsocial rewards, or the arrows, the most.
    • During reward processing, or the period after participants learned whether they had chosen the right or wrong box, the researchers observed more reward-related brain activity in TD children but more attention-related brain activity among children with ASD, which Stavropoulos said may be related to feelings of sensory overload in kids with ASD.
    • Among the autism group, meanwhile, kids with more severe ASD also showed heightened responsiveness to positive social feedback, which Stavropoulos said may indicate hyperactivity, or the state of being overwhelmed by “correct” social feedback that is commonly associated with sensory over-responsivity.

    Stavropoulos said the duo’s results provide support for both the social motivation hypothesis and the overly intense world hypothesis.

    “Kids with autism might not be as rewarded by social interactions as typically developing kids are, but that doesn’t mean their reward systems are entirely broken,” she added. “This research makes the case for developing clinical interventions that help children with autism better understand the reward value of other people — to slowly teach these kids that interacting with others can be rewarding.

    “But, it is critical to do this while being sensitive to these kids’ sensory experiences,” she continued. “We don’t want to overwhelm them, or make them feel sensory overload. It’s a delicate balance between making social interactions rewarding while being aware of how loudly we speak, how excited our voices sound, and how bright the lights are.”


  3. Electrical stimulation in brain bypasses senses, instructs movement

    December 19, 2017 by Ashley

    From the University of Rochester Medical Center press release:

    The brain’s complex network of neurons enables us to interpret and effortlessly navigate and interact with the world around us. But when these links are damaged due to injury or stroke, critical tasks like perception and movement can be disrupted. New research is helping scientists figure out how to harness the brain’s plasticity to rewire these lost connections, an advance that could accelerate the development of neuro-prosthetics.

    A new study authored by Marc Schieber, M.D., Ph.D., and Kevin Mazurek, Ph.D. with the University of Rochester Medical Center Department of Neurology and the Del Monte Institute for Neuroscience, which appears in the journal Neuron, shows that very low levels of electrical stimulation delivered directly to an area of the brain responsible for motor function can instruct an appropriate response or action, essentially replacing the signals we would normally receive from the parts of the brain that process what we hear, see, and feel.

    “The analogy is what happens when we approach a red light,” said Schieber. “The light itself does not cause us to step on the brake; rather, our brain has been trained to process this visual cue and send signals to other parts of the brain that control movement. In this study, what we describe is akin to replacing the red light with electrical stimulation, which the brain has learned to associate with the need to take an action that stops the car.”

    The findings could have significant implications for the development of brain-computer interfaces and neuro-prosthetics, which would allow a person to control a prosthetic device by tapping into the electrical activity of their brain.

    To be effective, these technologies must not only receive output from the brain but also deliver input. For example, can a mechanical arm tell the user that the object they are holding is hot or cold? However, delivering this information to the part of the brain responsible for processing sensory inputs does not work if this part of the brain is injured or the connections between it and the motor cortex are lost. In these instances, some form of input needs to be generated that replaces the signals that combine sensory perception with motor control and the brain needs to “learn” what these new signals mean.

    “Researchers have been interested primarily in stimulating the primary sensory cortices to input information into the brain,” said Schieber. “What we have shown in this study is that you don’t have to be in a sensory-receiving area in order for the subject to have an experience they can identify.”

    A similar approach is employed with cochlear implants for hearing loss which translate sounds into electrical stimulation of the inner ear and, over time, the brain learns to interpret these inputs as sound.

    In the new study, the researchers detail a set of experiments in which monkeys were trained to perform a task when presented with a visual cue, either turning, pushing, or pulling specific objects when prompted by different lights. While this occurred, the animals simultaneously received a very mild electrical stimulus called a micro-stimulation in different areas of the premotor cortex — the part of the brain that initiates movement — depending upon the task and light combination.

    The researchers then replicated the experiments, but this time omitted the visual cue of the lights and instead only delivered the micro-stimulation. The animals were able to successfully identify and perform the tasks they had learned to associate with the different electrical inputs. When the pairing of micro-stimulation with a particular action was reshuffled, the animals were able to adjust, indicating that the association between stimulation and a specific movement was learned and not fixed.

    “Most work on the development of inputs to the brain for use with brain-computer interfaces has focused primarily on the sensory areas of the brain,” said Mazurek. “In this study, we show you can expand the neural real estate that can be targeted for therapies. This could be very important for people who have lost function in areas of their brain due to stroke, injury, or other diseases. We can potentially bypass the damaged part of the brain where connections have been lost and deliver information to an intact part of the brain.”


  4. Study suggests odors that carry social cues seem to affect volunteers on the autism spectrum differently

    December 8, 2017 by Ashley

    From the Weizmann Institute of Science press release:

    Autism typically involves the inability to read social cues. We most often associate this with visual difficulty in interpreting facial expression, but new research at the Weizmann Institute of Science suggests that the sense of smell may also play a central role in autism. As reported in Nature Neuroscience, Weizmann Institute of Science researchers show that people on the autism spectrum have different — and even opposite — reactions to odors produced by the human body. These odors are ones that we are unaware of smelling, but which are, nonetheless, a part of the nonverbal communication that takes place between people, and which have been shown to affect our moods and behavior. Their findings may provide a unique window on autism, including, possibly, on the underlying developmental malfunctions in the disorder.

    Researchers in the lab of Prof. Noam Sobel in the Institute’s Neurobiology Department investigate, among other things, the smells that announce such emotions as happiness, fear or aggression to others. Although this sense is not our primary sense, as it is in many other mammals, we still subliminally read and react to certain odors. For example, “smelling fear,” even if we cannot consciously detect its odor, is something we may do without thinking. Since this is a form of social communication, Sobel and members of his lab wondered whether it might be disrupted in a social disorder like autism.

    Sobel and lab members Yaara Endevelt-Shapira and Ofer Perl, together with others in the lab, devised a series of experiments with a group of volunteers on the high-functioning end of the autism spectrum. To begin with, the researchers tested the ability of both autistic and control volunteers to identify smells that can be consciously detected, including human smells like sweat. There was no significant difference between the groups at this stage, meaning the sense of smell in the autistic participants was not significantly different from that of the controls.

    The two groups were then exposed either to the “smell of fear” or to a control odor. The smell of fear was sweat collected from people taking skydiving classes, and the control odor was sweat from the same people, collected when they were just exercising, without feeling fear.

    This is where differences emerged: Although neither group reported detecting dissimilarities between the two smells, their bodies reacted to each in a different way. In the control group, smelling the fear-induced sweat produced measurable increases in the fear response, for example in skin conductivity, while the everyday sweat did not. In the autistic men, fear-induced sweat lowered their fear responses, while the odor of “calm sweat” did the opposite: It raised their measurable anxiety levels.

    Next, the group created talking robotic mannequins that emitted different odors through their nostrils. These mannequins gave the volunteers, who were unaware of the olfactory aspect of the experiment, different tasks to conduct. Using mannequins enabled the researchers to have complete control over the social cues — odor-based or other — that the subjects received. The tasks were designed to evaluate the level of trust that the volunteers placed in the mannequins — and here, too, the behavior of autistic volunteers was the opposite of the control group: They displayed more trust in the mannequin that emitted the fear-induced odor and less in the one that smelled “calmer.”

    In continuing experiments, the researchers asked whether other subliminal “social odors” have a different impact in autism than in control groups. In one, the volunteers were exposed to sudden loud noises during their sessions while at the same time they were also exposed to a potentially calming component of body-odor named hexadecanal. Another automatic fear response — blinking — was recorded using electrodes above the muscles of the eye. Indeed, the blink response in the control group was weaker when they were exposed to hexadecanal, while for those in the autistic group this response was stronger with hexadecanal.

    In other words, the autistic volunteers in the experiment did not display an inability to read the olfactory social cues in smell, but rather they misread them. Sobel and his group think that this unconscious difference may point to a deeper connection between our sense of smell and early development. Research in recent years has turned up smell receptors like those in our nasal passages in all sorts of other places in our bodies — from our brains to our uteri. It has been suggested that these play a role in development, among other things. In other words, it is possible that the sensing of subtle chemical signals may go awry at crucial stages in the brain’s development in autism. “We are still speculating, at this point,” says Sobel, “but we are hoping that further research in our lab and others will clarify both the function of these unconscious olfactory social cues and their roots in such social disorders as autism.”



  5. Study suggests mammal brains identify type of scent faster than once thought

    November 23, 2017 by Ashley

    From the NYU Langone Health / NYU School of Medicine press release:

    It takes less than one-tenth of a second — a fraction of the time previously thought — for the sense of smell to distinguish between one odor and another, new experiments in mice show.

    In a study to be published in the journal Nature Communications online Nov. 14, researchers at NYU School of Medicine found that odorants — chemical particles that trigger the sense of smell — need only reach a few signaling proteins on the inside lining of the nose for the mice to identify a familiar aroma. Just as significantly, researchers say they also found that the animals’ ability to tell odors apart was the same no matter how strong the scent (regardless of odorant concentration).

    “Our study lays the groundwork for a new theory about how mammals, including humans, smell: one that is more streamlined than previously thought,” says senior study investigator and neurobiologist Dmitry Rinberg, PhD. His team is planning further animal experiments to look for patterns of brain cell activation linked to smell detection and interpretation that could also apply to people.

    “Much like human brains only need a few musical notes to name a particular song once a memory of it is formed, our findings demonstrate that a mouse’s sense of smell needs only a few nerve signals to determine the kind of scent,” says Rinberg, an associate professor at NYU Langone Health and its Neuroscience Institute.

    When an odorant initially docks into its olfactory receptor protein on a nerve cell in the nose, the cell sends a signal to the part of the brain that assigns the odor, identifying the smell, says Rinberg.

    Key among his team’s latest findings was that mice recognize a scent right after activation of the first few olfactory brain receptors, and typically within the first 100 milliseconds of inhaling any odorant.

    Previous research in animals had shown that it takes as long as 600 milliseconds for almost all olfactory brain receptors involved in their sense of smell to become fully activated, says Rinberg. However, earlier experiments in mice, which inhale through the nose faster than humans and have a faster sense of smell, showed that the number of activated receptors in their brains peaks after approximately 300 milliseconds.

    Earlier scientific investigations had also shown that highly concentrated scents activated more receptors. But Rinberg says that until his team’s latest experiments, researchers had not yet outlined the role of concentration in the odor identification process.

    For the new study, mice were trained to lick a straw to get a water reward based on whether they smelled orange- or pine-like scents.

    Using light-activated fibers inserted into the mouse nose, researchers could turn on individual brain receptors or groups of receptors involved in olfaction to control and track how many receptors were available to smell at any time. The optical technique was developed at NYU Langone.

    The team then tested how well the mice performed on water rewards when challenged by different concentrations of each smell, and with more or fewer receptors available for activation. Early activation of too many receptors, the researchers found, impaired odor identification, increasing the number of errors made by trained mice in getting their reward.

    Researchers found that early interruptions in sensing smell, less than 50 milliseconds from inhalation, reduced odor identification scores nearly to chance. By contrast, reward scores greatly improved when the mouse sense of smell was interrupted at any point after 50 milliseconds, but these gains fell off after 100 milliseconds.


  6. Study suggests reasons why head and face pain causes more suffering

    November 22, 2017 by Ashley

    From the Duke University press release:

    Hate headaches? The distress you feel is not all in your — well, head. People consistently rate pain of the head, face, eyeballs, ears and teeth as more disruptive, and more emotionally draining, than pain elsewhere in the body.

    Duke University scientists have discovered how the brain’s wiring makes us suffer more from head and face pain. The answer may lie not just in what is reported to us by the five senses, but in how that sensation makes us feel emotionally.

    The team found that sensory neurons that serve the head and face are wired directly into one of the brain’s principal emotional signaling hubs. Sensory neurons elsewhere in the body are also connected to this hub, but only indirectly.

    The results may pave the way toward more effective treatments for pain mediated by the craniofacial nerve, such as chronic headaches and neuropathic face pain.

    “Usually doctors focus on treating the sensation of pain, but this shows that we really need to treat the emotional aspects of pain as well,” said Fan Wang, a professor of neurobiology and cell biology at Duke, and senior author of the study. The results appear online Nov. 13 in Nature Neuroscience.

    Pain signals from the head versus those from the body are carried to the brain through two different groups of sensory neurons, and it is possible that neurons from the head are simply more sensitive to pain than neurons from the body.

    But differences in sensitivity would not explain the greater fear and emotional suffering that patients experience in response to head-face pain than body pain, Wang said.

    Personal accounts of greater fear and suffering are backed up by functional Magnetic Resonance Imaging (fMRI), which shows greater activity in the amygdala — a region of the brain involved in emotional experiences — in response to head pain than in response to body pain.

    “There has been this observation in human studies that pain in the head and face seems to activate the emotional system more extensively,” Wang said. “But the underlying mechanisms remained unclear.”

    To examine the neural circuitry underlying the two types of pain, Wang and her team tracked brain activity in mice after irritating either a paw or the face. They found that irritating the face led to higher activity in the brain’s parabrachial nucleus (PBL), a region that is directly wired into the brain’s instinctive and emotional centers.

    Then they used methods based on a novel technology recently pioneered by Wang’s group, called CANE, to pinpoint the sources of neurons that caused this elevated PBL activity.

    “It was a eureka moment because the body neurons only have this indirect pathway to the PBL, whereas the head and face neurons, in addition to this indirect pathway, also have a direct input,” Wang said. “This could explain why you have stronger activation in the amygdala and the brain’s emotional centers from head and face pain.”

    Further experiments showed that activating this pathway prompted face pain, while silencing the pathway reduced it.

    “We have the first biological explanation for why this type of pain can be so much more emotionally taxing than others,” said Wolfgang Liedtke, a professor of neurology at Duke University Medical Center and a co-author on Wang’s paper, who is also treating patients with head- and face-pain. “This will open the door toward not only a more profound understanding of chronic head and face pain, but also toward translating this insight into treatments that will benefit people.”

    Chronic head-face pain such as cluster headaches and trigeminal neuralgia can become so severe that patients seek surgical solutions, including severing the known neural pathways that carry pain signals from the head and face to the hindbrain. But a substantial number of patients continue to suffer, even after these invasive measures.

    “Some of the most debilitating forms of pain occur in the head regions, such as migraine,” said Qiufu Ma, a professor of neurobiology at Harvard Medical School, who was not involved in the study. “The discovery of this direct pain pathway might provide an explanation why facial pain is more severe and more unpleasant.”

    Liedtke said targeting the neural pathway identified here can be a new approach toward developing innovative treatments for this devastating head and face pain.


  7. Study suggests blue lighting helps us to relax faster after an argument than white lighting

    by Ashley

    From the University of Granada press release:

    Researchers from the University of Granada (UGR), in collaboration with the School for Special Education San Rafael (Hospitaller Order of Saint John of God, Granada, Spain) have proven, by means of an objective evaluation using electrophysiological measurements, that blue lighting accelerates the relaxation process after acute psychosocial stress in comparison with conventional white lighting.

    Said stress is a kind of short-term stress (acute stress) that occurs during social or interpersonal relationships, for example while arguing with a friend or when someone pressures you to finish a certain task as soon as possible.

    The researchers, who belong to the BCI Lab (Brain-Computer Interface Lab) at the University of Granada, note that psychosocial stress produces physiological responses that can be measured by means of bio-signals. That stress is very common and negatively affects people’s health and quality of life.

    For their work, whose results have been published in the journal PLOS ONE, the researchers had twelve volunteers undergo acute stress and then perform a relaxation session in the multisensory stimulation room at the School for Special Education San Rafael.

    In that room the participants lay down with no stimulus other than blue (group 1) or white (group 2) lighting. Various bio-signals, such as heart rate and brain activity, were measured throughout the session (by means of an electrocardiogram and an electroencephalogram, respectively).

    The results showed that blue lighting accelerates the relaxation process, in comparison with conventional white lighting.


  8. Study suggests visual intelligence is not the same as IQ

    November 17, 2017 by Ashley

    From the Vanderbilt University press release:

    Just because someone is smart and well-motivated doesn’t mean he or she can learn the visual skills needed to excel at tasks like matching fingerprints, interpreting medical X-rays, keeping track of aircraft on radar displays or forensic face matching.

    That is the implication of a new study which shows for the first time that there is a broad range of differences in people’s visual ability and that these variations are not associated with individuals’ general intelligence, or IQ. The research is reported in a paper titled “Domain-specific and domain-general individual differences in visual object recognition” published in the September issue of the journal Cognition and the implications are discussed in a review article in press at Current Directions in Psychological Science.

    “People may think they can tell how good they are at identifying objects visually,” said Isabel Gauthier, David K. Wilson Professor of Psychology at Vanderbilt University, who headed the study. “But it turns out that they are not very good at evaluating their own skills relative to others.”

    In the past, research in visual object recognition has focused largely on what people have in common, but Gauthier became interested in the question of how much visual ability varies among individuals. To answer this question, she and her colleagues had to develop a new test, which they call the Novel Object Memory Test (NOMT), to measure people’s ability to identify unfamiliar objects.

    Gauthier first wanted to gauge public opinions about visual skills. She did so by surveying 100 laypeople using the Amazon Mechanical Turk crowdsourcing service. She found that respondents generally consider visual tasks as fairly different from other tasks related to general intelligence. She also discovered that they feel there is less variation in people’s visual skills than there is in non-visual skills such as verbal and math ability.

    The main problem that Gauthier and colleagues had to address in assessing individuals’ innate visual recognition ability was familiarity. The more time a person spends learning about specific types of objects, such as faces, cars or birds, the better they get at identifying them. As a result, performance on visual recognition tests that use images of common objects is a complex mixture of people’s visual ability and their experience with those objects. Importantly, such tests have proven to be a poor predictor of how well someone can learn to identify objects in a new domain.

    Gauthier addressed this problem by using novel computer-generated creatures called greebles, sheinbugs and ziggerins to study visual recognition. The basic test consists of studying six target creatures, followed by a number of test trials displaying creatures in sets of three. Each set contains a creature from the target group along with two unfamiliar creatures, and the participant is asked to pick out the creature that is familiar.
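The trial structure just described can be sketched as follows. The creature names and trial count are illustrative placeholders, not the actual NOMT stimuli; what the sketch preserves is the design: each trial shows one studied target alongside two unfamiliar distractors.

```python
import random

def make_nomt_trials(targets, distractors, n_trials=12, seed=0):
    """Build three-alternative trials for a NOMT-style test.

    Each trial shows one studied target creature plus two unfamiliar
    distractors, presented in random order.
    """
    rng = random.Random(seed)
    trials = []
    for _ in range(n_trials):
        options = [rng.choice(targets)] + rng.sample(distractors, 2)
        rng.shuffle(options)
        trials.append(options)
    return trials

targets = [f"creature_{i}" for i in range(6)]        # six studied creatures
distractors = [f"novel_{i}" for i in range(30)]      # unfamiliar creatures
trials = make_nomt_trials(targets, distractors)
```

Because the creatures are computer-generated and novel to every participant, scores reflect visual learning ability rather than prior experience with the object category.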

    Analyzing the results from more than 2,000 subjects, Gauthier and colleagues discovered that the ability to recognize one kind of creature was well predicted by how well subjects could recognize the other kinds, even though the creatures were visually quite different. This confirmed that the new test can predict the ability to learn new categories.

    The psychologists also compared NOMT scores with performance on several IQ-related tests and determined that the visual ability measured by the NOMT is distinct from and independent of general intelligence.

    “This is quite exciting because performance on cognitive skills is almost always associated with general intelligence,” Gauthier said. “It suggests that we really can learn something new about people using these tests, over and beyond all the abilities we already know how to measure.”

    Although the study confirms the popular intuition that visual skill is different from general intelligence, it found that individual variations in visual ability are much larger than most people think. For instance, on one metric, called the coefficient of variation, the spread of people was wider on the NOMT than on a nonverbal IQ test.
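
    The coefficient of variation mentioned above is simply the standard deviation divided by the mean, which makes spreads comparable across tests scored on different scales. A quick sketch with made-up numbers (not the study’s data):

```python
import statistics

def coefficient_of_variation(scores):
    """CV = standard deviation / mean; a larger CV means a wider relative spread."""
    return statistics.stdev(scores) / statistics.mean(scores)

# Illustrative numbers only, not the study's data: two score sets with the
# same mean (70) but very different spreads.
wide_spread = [40, 55, 70, 85, 100]
narrow_spread = [62, 66, 70, 74, 78]

print(coefficient_of_variation(wide_spread))    # about 0.34
print(coefficient_of_variation(narrow_spread))  # about 0.09
```

    Dividing by the mean is what lets the researchers compare variability on the NOMT against variability on a nonverbal IQ test, even though the two are scored differently.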

    “A lot of jobs and hobbies depend on visual skills,” Gauthier said. “Because they are independent of general intelligence, the next step is to explore how we can use these tests in real-world applications where performance could not be well predicted before.”


  9. Engineer finds how brain encodes sounds

    November 16, 2017 by Ashley

    From the Washington University in St. Louis press release:

    When you are out in the woods and hear a cracking sound, your brain needs to process quickly whether the sound is coming from, say, a bear or a chipmunk. In new research published in PLoS Biology, a biomedical engineer at Washington University in St. Louis has a new interpretation for an old observation, debunking an established theory in the process.

    Dennis Barbour, MD, PhD, associate professor of biomedical engineering in the School of Engineering & Applied Science who studies neurophysiology, found in an animal model that auditory cortex neurons may be encoding sounds differently than previously thought. Sensory neurons, such as those in auditory cortex, on average respond relatively indiscriminately at the beginning of a new stimulus, but rapidly become much more selective. The few neurons responding for the duration of a stimulus were generally thought to encode the identity of a stimulus, while the many neurons responding at the beginning were thought to encode only its presence. This theory makes a prediction that had never been tested: that the indiscriminate initial responses would encode stimulus identity less accurately than the selective responses that persist for the duration of the sound.

    “At the beginning of a sound transition, things are diffusely encoded across the neuron population, but sound identity turns out to be more accurately encoded,” Barbour said. “As a result, you can more rapidly identify sounds and act on that information. If you get about the same amount of information for each action potential spike of neural activity, as we found, then the more spikes you can put toward a problem, the faster you can decide what to do. Neural populations spike most and encode most accurately at the beginning of stimuli.”
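
    Barbour’s point about spikes and decision speed can be illustrated with simple arithmetic. This is a toy calculation, not the study’s model: the firing rates, bits per spike and evidence threshold below are invented for the example.

```python
# Toy illustration (not the study's analysis): if each spike carries roughly
# the same amount of information, the time needed to accumulate enough
# evidence for a decision scales inversely with the population spike rate.

def time_to_decide(spike_rate_hz, bits_per_spike, bits_needed):
    """Seconds until enough information has accumulated to make a decision."""
    bits_per_second = spike_rate_hz * bits_per_spike
    return bits_needed / bits_per_second

# Hypothetical numbers: onset firing is denser than sustained firing.
onset = time_to_decide(spike_rate_hz=200, bits_per_spike=0.1, bits_needed=2.0)
sustained = time_to_decide(spike_rate_hz=50, bits_per_spike=0.1, bits_needed=2.0)
print(onset, sustained)  # 0.1 0.4
```

    With information per spike held constant, quadrupling the spike rate quarters the decision time, which is the intuition behind Barbour’s remark that dense onset activity supports faster identification.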

    Barbour’s study involved recording individual neurons. To make similar kinds of measurements of brain activity in humans, researchers must use noninvasive techniques that average many neurons together. Event-related potential (ERP) techniques record brain signals through electrodes on the scalp and reflect neural activity synchronized to the onset of a stimulus. Functional MRI (fMRI), on the other hand, reflects activity averaged over several seconds. If the brain were using fundamentally different encoding schemes for onsets versus sustained stimulus presence, these two methods might be expected to diverge in their findings. Both reveal the neural encoding of stimulus identity, however.

    “There has been a lot of debate for a very long time, but especially in the past couple of decades, about whether information representation in the brain is distributed or local,” Barbour said.

    “If function is localized, with small numbers of neurons bunched together doing similar things, that’s consistent with sparse coding, high selectivity, and low population spiking rates. But if you have distributed activity, or lots of neurons contributing all over the place, that’s consistent with dense coding, low selectivity and high population spiking rates. Depending on how the experiment is conducted, neuroscientists see both. Our evidence suggests that it might just be both, depending on which data you look at and how you analyze it.”

    Barbour said the research is the most fundamental work to build a theory for how information might be encoded for sound processing, yet it implies a novel sensory encoding principle potentially applicable to other sensory systems, such as how smells are processed and encoded.

    Earlier this year, Barbour worked with Barani Raman, associate professor of biomedical engineering, to investigate how the presence and absence of an odor or a sound is processed. While the response times between the olfactory and auditory systems are different, the neurons are responding in the same ways. The results of that research also gave strong evidence that there may exist a stored set of signal processing motifs that is potentially shared by different sensory systems and even different species.


  10. Study proposes more consumer-focused approach to wine

    November 15, 2017 by Ashley

    From the Michigan State University press release:

    The traditional pairing of wine and food too often misses the mark — leaving people confused and intimidated — and should be scrapped in favor of a more consumer-focused approach, a new study indicates.

    The research by Michigan State University hospitality scholars suggests people generally fit into certain wine-drinking categories, or “vinotypes,” and that servers and sommeliers should consider these preferences when suggesting a wine.

    Ordering a beef roast for dinner? A traditional wine recommendation would be Cabernet Sauvignon. But why would a server suggest a bold red wine if the customer hates it? Let the patron drink his or her beloved Riesling with the meal, said Carl Borchgrevink, a former chef and restaurant manager and lead author on the study.

    “The palate rules — not someone else’s idea of which wine we should drink with our food,” said Borchgrevink, associate professor and interim director of MSU’s School of Hospitality Business. “They shouldn’t try to intimidate you into buying a certain wine. Instead, they should be asking you what you like.”

    Borchgrevink and culinary expert Allan Sherwin conducted the first scientific study examining the premise of Tim Hanni’s vinotype theory. Hanni, in a play on “phenotype,” proposed that vinotypes are determined by both genetics and environment and that such tastes change over time based on experiences.

    Hanni, a chef and one of the first Americans to earn the designation “Master of Wine,” has proposed four primary vinotypes: “sweet,” “hypersensitive,” “sensitive” and “tolerant.” The categories range from those who like sweet, fruity whites (sweet vinotype) to those who enjoy bold, strong reds (tolerant), and everyone in between.

    Hanni also created a series of criteria to determine vinotypes. If you like sweet beverages such as soda and salt your food liberally, for example, you trend toward the sweet vinotype. But if strong black coffee and intense flavors are your thing, you’re more apt to fall in the tolerant camp.
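
    Criteria like these amount to a simple classification rule. The sketch below is purely illustrative: Hanni’s published criteria are more involved, and the questions, scoring and thresholds here are invented for the example.

```python
# Illustrative sketch only: these questions and thresholds are invented,
# not Hanni's actual criteria. Sweet-leaning answers push the score up;
# intense-flavor answers push it down.

def guess_vinotype(likes_sweet_drinks, salts_food_heavily,
                   drinks_black_coffee, likes_intense_flavors):
    """Map a few taste-preference answers to one of the four vinotypes."""
    score = 0
    score += 1 if likes_sweet_drinks else 0
    score += 1 if salts_food_heavily else 0
    score -= 1 if drinks_black_coffee else 0
    score -= 1 if likes_intense_flavors else 0
    if score >= 2:
        return "sweet"
    if score == 1:
        return "hypersensitive"
    if score == 0:
        return "sensitive"
    return "tolerant"

print(guess_vinotype(True, True, False, False))   # sweet
print(guess_vinotype(False, False, True, True))   # tolerant
```

    A server asking two or three questions of this kind could place a guest on the sweet-to-tolerant spectrum before suggesting a wine, which is the consumer-focused approach the study advocates.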

    For the study, MSU researchers surveyed a group of adults on food and beverage preferences and consumption patterns. They also held a reception with 12 stations where the participants rated the food and wine presented at each station individually, and then together.

    The results were conclusive: Hanni’s premise has merit. The researchers were able to predict wine preferences based on consumption patterns and preferences.

    Next, MSU researchers will test the vinotype theory outright by working with scholars globally. Borchgrevink said separate studies are being planned with partners from around the United States, as well as from Hong Kong, France and other areas.

    The work has implications for both restaurants and wine stores, which should train their staff members on the vinotype approach and find questions to ask consumers that can reveal their wine preferences, the study says.

    But the main focus is the wine-drinker, who should learn to trust their own palate and not necessarily depend on the so-called experts, said Sherwin, the Dr. Lewis J. and Mrs. Ruth E. Minor Chef-Professor of Culinary Management at MSU.

    “At the end of the day it’s going to be the consumer that has the final say,” Sherwin said. “They’re going to be the arbiter.”

    The study is published in the International Journal of Wine Business Research.