1. Illusion reveals that the brain fills in peripheral vision

    December 13, 2016 by Ashley

    From the Association for Psychological Science media release:

    What we see in the periphery, just outside the direct focus of the eye, may sometimes be a visual illusion, according to new findings published in Psychological Science, a journal of the Association for Psychological Science.

    The findings suggest that even though our peripheral vision is less accurate and detailed than what we see in the center of the visual field, we may not notice a qualitative difference because our visual processing system actually fills in some of what we “see” in the periphery.

    “Our findings show that, under the right circumstances, a large part of the periphery may become a visual illusion,” says psychology researcher Marte Otten from the University of Amsterdam, lead author on the new research. “This effect seems to hold for many basic visual features, indicating that this ‘filling in’ is a general, and fundamental, perceptual mechanism.”

    As we go about daily life, we generally operate under the assumption that our perception of the world directly and accurately represents the outside world. But visual illusions of various kinds show us that this isn’t always the case: as the brain processes incoming information about an external stimulus, it creates a representation of the outside world that can diverge from reality in noticeable ways.

    Otten and colleagues wondered whether this same process might explain why we usually feel as though our peripheral vision is detailed and robust when it isn’t.

    “Perhaps our brain fills in what we see when the physical stimulus is not rich enough,” she explains. “The brain represents peripheral vision with less detail, and these representations degrade faster than central vision. Therefore, we expected that peripheral vision should be very susceptible to illusory visual experiences, for many stimuli and large parts of the visual field.”

    Over a series of experiments, the researchers presented a total of 20 participants with a series of images. The participants focused on the center of the screen — a central image appeared and then a different peripheral image gradually faded in. Participants were supposed to click the mouse as soon as the difference between the central patch and the periphery disappeared and the entire screen appeared to be uniform.

    Otten and colleagues changed the defining characteristic of the central image in different experiments, varying its shape, orientation, luminance, shade, or motion.

    The results showed that all of these characteristics were vulnerable to a uniformity illusion — that is, participants incorrectly reported seeing a uniform image when the center and periphery were actually different.

    The illusion was less likely to occur when the difference between the center and periphery was large; when the illusion did occur on these trials, it took longer to emerge.
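
    For readers who want a concrete picture of the trial structure, the sketch below simulates the logic of a single fade-in trial in Python. It is not the authors’ code: the feature values, fade duration, time step, and the “illusion threshold” at which the simulated observer stops seeing a difference are all hypothetical, and the real experiment presented actual visual stimuli rather than numbers.

        def run_trial(center_value, periphery_start, fade_seconds=30.0, dt=0.1,
                      illusion_threshold=0.25):
            """Simulate the timing of one fade-in trial.

            The peripheral feature value (orientation, luminance, etc., on an
            arbitrary 0-1 scale) fades linearly toward the central value. The
            simulated observer reports "uniform" once the remaining physical
            difference drops below a hypothetical illusion threshold, i.e.
            before the screen is truly uniform.
            """
            t = 0.0
            while t < fade_seconds:
                progress = t / fade_seconds                      # 0 -> 1 over the fade
                periphery = periphery_start + (center_value - periphery_start) * progress
                difference = abs(center_value - periphery)
                if difference < illusion_threshold:
                    return {"report_time_s": round(t, 2),
                            "real_difference_at_report": round(difference, 3)}
                t += dt
            return {"report_time_s": None, "real_difference_at_report": 0.0}

        # Larger center/periphery differences should produce later "uniform" reports.
        for start in (0.4, 0.7, 1.0):
            print(start, run_trial(center_value=0.0, periphery_start=start))

    Run as-is, the sketch reproduces the qualitative pattern above: the larger the starting difference, the later the premature “uniform” report arrives, and the report always occurs while a real difference remains.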

    Participants indicated that they felt roughly equally sure about their experience of uniformity when it actually did exist as when it was illusory. This suggests that the illusory experiences are similar to a visual experience based on a physical visual stimulus.

    “The fun thing about this illusion is that you can test this out for yourself,” Otten says. “If you look up the illusion on http://www.uniformillusion.com you can find out just how real the illusory experience feels for you.”

    “The most surprising is that we found a new class of visual illusions with such a wide breadth, affecting many different types of stimuli and large parts of the visual field,” Otten adds. “We hope to use this illusion as a tool to uncover why peripheral vision seems so rich and detailed, and more generally, to understand how the brain creates our visual perceptual experiences.”


  2. Increased UVB exposure associated with reduced risk of nearsightedness, particularly in teens, young adults

    by Ashley

    From The JAMA Network Journals media release:

    Higher ultraviolet B (UVB) radiation exposure, directly related to time outdoors and sunlight exposure, was associated with reduced odds of myopia (nearsightedness), and exposure to UVB between ages 14 and 29 years was associated with the highest reduction in odds of adult myopia, according to a study published online by JAMA Ophthalmology.

    Myopia is a complex trait influenced by numerous environmental and genetic factors and is becoming more common worldwide, most dramatically in urban Asia, but rises in prevalence have also been identified in the United States and Europe. This has major implications, both visually and financially, for the global burden from this potentially sight-threatening condition.

    Astrid E. Fletcher, Ph.D., of the London School of Hygiene and Tropical Medicine, and colleagues examined the association of myopia with UVB radiation, serum vitamin D concentrations and vitamin D pathway genetic variants, adjusting for years in education. The study included a random sample of participants 65 years and older from 6 study centers in the European Eye Study. Of 4,187 participants, 4,166 attended an eye examination including refraction, gave a blood sample, and were interviewed by trained fieldworkers using a structured questionnaire. After exclusion for various factors, the final study group included 371 participants with myopia and 2,797 without.

    The researchers found that an increase in UVB exposure at age 14 to 19 years and 20 to 39 years was associated with reduced odds of myopia; those in the highest tertile (group) of years of education had twice the odds of myopia. No independent associations between myopia and serum vitamin D3 concentrations or variants in genes associated with vitamin D metabolism were found. An unexpected finding was that the highest quintile (group) of plasma lutein concentrations was associated with reduced odds of myopia.
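
    Because the findings are reported as odds rather than probabilities, “twice the odds of myopia” does not mean twice the risk. A minimal worked example in Python, using a purely illustrative baseline probability that is not taken from the paper, shows the difference:

        def odds(p):
            """Convert a probability to odds."""
            return p / (1 - p)

        def prob(o):
            """Convert odds back to a probability."""
            return o / (1 + o)

        baseline_p = 0.15                    # illustrative baseline probability of myopia
        baseline_odds = odds(baseline_p)     # 0.15 / 0.85, roughly 0.18

        doubled_odds = 2 * baseline_odds     # "twice the odds", as in the highest education tertile
        print(round(prob(doubled_odds), 2))  # roughly 0.26, i.e. about 26%, not 30%

    The point is simply that doubling the odds raises an illustrative 15% risk to roughly 26%, not to 30%.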

    “The association between UVB, education, and myopia remained even after respective adjustment. This suggests that the high rate of myopia associated with educational attainment is not solely mediated by lack of time outdoors,” the authors write.

    “As the protective effect of time spent outdoors is increasingly used in clinical interventions, a greater understanding of the mechanisms and life stages at which benefit is conferred is warranted.”


  3. How human brains do language: One system, two channels

    November 8, 2016 by Ashley

    From the Northeastern University College of Science media release:

    Contrary to popular belief, language is not limited to speech. In a recent study published in the journal PNAS, Northeastern University Prof. Iris Berent reveals that people also apply the rules of their spoken language to sign language.

    Language is not simply about hearing sounds or moving our mouths. When our brain is “doing language,” it projects abstract structure. The modality (speech or sign) is secondary. “There is a misconception in the general public that sign language is not really a language,” said Berent. “Part of our mandate, through the support of the NSF, is to reveal the complex structure of sign language, and in so doing, disabuse the public of this notion.”

    The Experiment

    To come to this conclusion, Berent’s lab studied words (and signs) that shared the same general structure. She found that people reacted to this structure in the same way, irrespective of whether they were presented with speech or signs.

    In the study, Berent studied words and signs with doubling (e.g., slaflaf) — ones that show full or partial repetition. She found that responses to these forms shift, depending on their linguistic context.

    When a word is presented by itself (or as a name for just one object), people avoid doubling. For example, they rate slaflaf (with doubling) worse than slafmak (with no doubling). But when doubling signaled a systematic change in meaning (e.g., slaf=singular, slaflaf=plural), participants now preferred it.

    Next, Berent asked what happens when people see doubling in signs (signs with two identical syllables). The subjects were English speakers who had no knowledge of a sign language. To Berent’s surprise, these subjects responded to signs in the same way they responded to the words. They disliked doubling for singular objects, but they systematically preferred it if (and only if) doubling signaled plurality. Hebrew speakers showed this preference when doubling signaled a diminutive, in line with the structure of their language. “It’s not about the stimulus, it’s really about the mind, and specifically about the language system,” said Berent. “These results suggest that our knowledge of language is abstract and amodal. Human brains can grasp the structure of language regardless of whether it is presented in speech or in sign.”

    Sign Language is Language

    Currently there is a debate as to what role sign language has played in language evolution, and whether the structure of sign language shares similarities with spoken language. Berent’s lab shows that our brain detects some deep similarities between speech and sign language. This allows English speakers, for example, to extend their knowledge of language to sign language. “Sign language has a structure, and even if you examine it at the phonological level, where you would expect it to be completely different from spoken language, you can still find similarities. What’s even more remarkable is that our brain can extract some of this structure even when we have no knowledge of sign language. We can apply some of the rules of our spoken language phonology to signs,” said Berent.

    Berent says these findings show that our brains are built to deal with very different types of linguistic inputs. The results from this paper confirm something some scientists have long thought but the general public hasn’t truly grasped: language is language, no matter what format it takes. “This is a significant finding for the deaf community because sign language is their legacy. It defines their identity, and we should all recognize its value. It’s also significant to our human identity, generally, because language is what defines us as a species.”

    To help further support these findings, Berent and her lab intend to examine how these rules apply to other languages. The present study focused on English and Hebrew.


  4. Getting into the flow: Sexual pleasure is a kind of trance

    November 2, 2016 by Ashley

    From the Northwestern University media release:

    Many people have speculated on the evolutionary functions of the human orgasm, but the underlying mechanisms have remained mysterious. In a new paper, a Northwestern University researcher seeks to shed light on how orgasm works in the brain.

    Adam Safron, a neuroscientist and Ph.D. candidate in the psychology department’s Brain Behavior Cognition program in the Weinberg College of Arts and Sciences at Northwestern, reviewed related studies and literature over many years to come up with a model in which rhythmic sexual activity likely influences brain rhythms.

    Safron describes how rhythmic stimulation can enhance neural oscillations at corresponding frequencies, somewhat like pushing someone on a swing. Through this process, called neural entrainment, if sexual stimulation is intense enough and goes on long enough, synchronized activity could spread throughout the brain.

    This synchrony may produce such intensely focused attention that sexual activity outcompetes usual self-awareness for access to consciousness, thus producing a state of sensory absorption and trance. This may be crucial for allowing a sufficient intensity of experience to trigger the mechanisms of climax.

    “Synchronization is important for signal propagation in the brain, because neurons are more likely to fire if they are stimulated multiple times within a narrow window of time,” Safron said. “Otherwise, the signals decay as part of a general resetting mechanism, rather than sum together. This then caused me to hypothesize that rhythmic entrainment is the primary mechanism by which orgasmic thresholds are surpassed.”
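
    The temporal-summation idea in Safron’s quote can be illustrated with a toy leaky-integrator simulation (not from the paper; the time constant, input weight, and threshold below are arbitrary): inputs that arrive within a narrow window stack up and cross threshold, while the same inputs spread out in time decay away between arrivals.

        import math

        def membrane_response(input_times, tau=20.0, weight=0.45, threshold=1.0,
                              dt=1.0, duration=250.0):
            """Toy leaky integrator: each input adds `weight` to the membrane
            potential, which otherwise decays with time constant `tau` (ms).
            Returns the time (ms) of the first threshold crossing, or None."""
            v = 0.0
            t = 0.0
            pending = sorted(input_times)
            while t <= duration:
                v *= math.exp(-dt / tau)            # passive decay over one time step
                while pending and pending[0] <= t:  # deliver any inputs due by now
                    v += weight
                    pending.pop(0)
                if v >= threshold:
                    return t
                t += dt
            return None

        # Three inputs packed into a 10 ms window sum and cross threshold...
        print(membrane_response([50, 55, 60]))    # fires at ~60 ms
        # ...whereas the same number of inputs spread over 150 ms never does.
        print(membrane_response([50, 125, 200]))  # None

    In Safron’s account, rhythmic sexual stimulation plays the role of the tightly spaced inputs, driving entrained activity past the threshold.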

    Safron said this research could be relevant for improving sexual functioning, encouraging people to focus more on the rhythmic aspects of sexuality.

    “The idea that sexual experiences can be like trance states is in some ways ancient. Turns out this idea is supported by modern understandings of neuroscience,” Safron said. “In theory, this could change the way people view their sexuality. Sex is a source of pleasurable sensations and emotional connection, but beyond that, it’s actually an altered state of consciousness.”

    Safron found parallels between sexual climax and seizures as well as with music and dance — something he wasn’t expecting.

    In both orgasm and reflex seizures, rhythmic inputs into high-bandwidth sensory channels resulted in an explosive process after certain stimulation thresholds were surpassed.

    “And although obvious in retrospect, I wasn’t expecting to find that sexual activity was so similar to music and dance, not just in the nature of the experiences, but also in that evolutionarily, rhythm-keeping ability may serve as a test of fitness for potential mates.”

    He said this is consistent with the fact that rhythmic song and dance are nearly universal parts of mating, going back hundreds of millions of years to our common ancestors with pre-vertebrate animals such as insects.

    Safron’s previous research has focused on the neural bases of sexual preferences. He said orgasm is related to this work because it is one of the most powerful rewards available, and therefore, may have an important role in shaping preferences.

    “Before this paper, we knew what lit up in the brain when people had orgasms, and we knew a lot about the hormonal and neurochemical factors in non-human animals, but we didn’t really know why sex and orgasm feel the way they do,” Safron said. “This paper provides a level of mechanistic detail that was previously lacking.”


  5. ‘Sixth sense’ may be more than just a feeling

    September 29, 2016 by Ashley

    From the National Institutes of Health media release:

    With the help of two young patients with a unique neurological disorder, an initial study by scientists at the National Institutes of Health suggests that a gene called PIEZO2 controls specific aspects of human touch and proprioception, a “sixth sense” describing awareness of one’s body in space. Mutations in the gene caused the two to have movement and balance problems and the loss of some forms of touch. Despite their difficulties, they both appeared to cope with these challenges by relying heavily on vision and other senses.

    “Our study highlights the critical importance of PIEZO2 and the senses it controls in our daily lives,” said Carsten G. Bönnemann, M.D., senior investigator at the NIH’s National Institute of Neurological Disorders and Stroke (NINDS) and a co-leader of the study published in the New England Journal of Medicine. “The results establish that PIEZO2 is a touch and proprioception gene in humans. Understanding its role in these senses may provide clues to a variety of neurological disorders.”

    Dr. Bönnemann’s team uses cutting-edge genetic techniques to help diagnose children around the world who have disorders that are difficult to characterize. The two patients in this study are unrelated, one 9 and the other 19 years old. They have difficulties walking; hip, finger and foot deformities; and abnormally curved spines diagnosed as progressive scoliosis.

    Working with the laboratory of Alexander T. Chesler, Ph.D., investigator at NIH’s National Center for Complementary and Integrative Health (NCCIH), the researchers discovered that the patients have mutations in the PIEZO2 gene that appear to block the normal production or activity of Piezo2 proteins in their cells. Piezo2 is what scientists call a mechanosensitive protein because it generates electrical nerve signals in response to changes in cell shape, such as when skin cells and neurons of the hand are pressed against a table. Studies in mice suggest that Piezo2 is found in the neurons that control touch and proprioception.

    “As someone who studies Piezo2 in mice, working with these patients was humbling,” said Dr. Chesler. “Our results suggest they are touch-blind. The patients’ versions of Piezo2 may not work, so their neurons cannot detect touch or limb movements.”

    Further examinations at the NIH Clinical Center suggested the young patients lack body awareness. Blindfolding them made walking extremely difficult, causing them to stagger and stumble from side to side while assistants prevented them from falling. When the researchers compared the two patients with unaffected volunteers, they found that blindfolding the young patients made it harder for them to reliably reach for an object in front of their faces than it was for the volunteers. Without looking, the patients could not guess the direction their joints were being moved as well as the control subjects could.

    The patients were also less sensitive to certain forms of touch. They could not feel vibrations from a buzzing tuning fork as well as the control subjects could. Nor could they tell the difference between one or two small ends of a caliper pressed firmly against their palms. Brain scans of one patient showed no response when the palm of her hand was brushed.

    Nevertheless, the patients could feel other forms of touch. Stroking or brushing hairy skin is normally perceived as pleasant. Although they both felt the brushing of hairy skin, one claimed it felt prickly instead of the pleasant sensation reported by unaffected volunteers. Brain scans showed different activity patterns in response to brushing between unaffected volunteers and the patient who felt prickliness.

    Despite these differences, the patients’ nervous systems appeared to be developing normally. They were able to feel pain, itch, and temperature normally; the nerves in their limbs conducted electricity rapidly; and their brains and cognitive abilities were similar to the control subjects of their age.

    “What’s remarkable about these patients is how much their nervous systems compensate for their lack of touch and body awareness,” said Dr. Bönnemann. “It suggests the nervous system may have several alternate pathways that we can tap into when designing new therapies.”

    Previous studies found that mutations in PIEZO2 may have various effects on the Piezo2 protein that may result in genetic musculoskeletal disorders, including distal arthrogryposis type 5, Gordon Syndrome, and Marden-Walker Syndrome. Drs. Bönnemann and Chesler concluded that the scoliosis and joint problems of the patients in this study suggest that Piezo2 is either directly required for the normal growth and alignment of the skeletal system or that touch and proprioception indirectly guide skeletal development.

    “Our study demonstrates that bench and bedside research are connected by a two-way street,” said Dr. Chesler. “Results from basic laboratory research guided our examination of the children. Now we can take that knowledge back to the lab and use it to design future experiments investigating the role of PIEZO2 in nervous system and musculoskeletal development.”

    This work was supported by the NCCIH and NINDS intramural research programs.


  6. Study uncovers new molecular signaling mechanism for correcting childhood visual disorders

    September 17, 2016 by Ashley

    From the University of California, Irvine media release:

    Neuroscientists at the University of California, Irvine have discovered a molecular signaling mechanism that translates visual impairments into functional changes in brain circuit connections. The discovery may help to develop novel therapeutic drugs to treat the childhood visual disorder amblyopia and other neurodevelopment disorders. Xiangmin Xu, Todd Holmes and Sunil Gandhi conducted the study, which appears online Sept. 15 in Neuron.

    Amblyopia is the most common cause of permanent visual defects among children and is often a result of improper brain development due to deprivation during the “critical period” of vision development.

    In a previous study, Xu helped discover that a specific class of inhibitory neurons (parvalbumin-expressing neurons, or PV neurons) control the critical period of vision development.

    In this study, Xu and colleagues found that neuregulin-1 (NRG1) molecules modulate the activities of these neurons, thus outlining a new path for treatments that can restore normal vision in children who have had early deficits. As neurodevelopmental disorders such as schizophrenia appear to result from brain developmental defects during defined postnatal windows, the linkage of NRG1 signaling to critical growth periods provides important new insights. Xu said he hopes that therapeutic interventions targeting NRG1 may be exploited to treat cortical neurodevelopmental disorders.


  7. Evidence of ‘hidden hearing loss’ in college-age human subjects

    September 13, 2016 by Ashley

    From the Massachusetts Eye and Ear Infirmary media release:

    Researchers from Massachusetts Eye and Ear have, for the first time, linked symptoms of difficulty understanding speech in noisy environments with evidence of cochlear synaptopathy, a condition known as “hidden hearing loss,” in college-age human subjects with normal hearing sensitivity.

    In a study of young adults who may regularly overexpose their ears to loud sounds, a research team led by Stéphane Maison, Ph.D., showed a significant correlation between performance on a speech-in-noise test and an electrophysiological measure of the health of the auditory nerve. The team also saw significantly better scores on both tests among subjects who regularly wore hearing protection when exposed to loud sounds. Their findings were published online today in PLOS ONE.

    “While hearing sensitivity and the ability to understand speech in quiet environments were the same across all subjects, we saw reduced responses from the auditory nerve in participants exposed to noise on a regular basis and, as expected, that loss was matched with difficulties understanding speech in noisy and reverberating environments,” said Dr. Maison, an investigator in the Eaton-Peabody Laboratories at Mass. Eye and Ear and Assistant Professor of Otolaryngology at Harvard Medical School.

    Hearing loss, which affects an estimated 48 million Americans, can be caused by noise or aging and typically arises from damage to the sensory cells of the inner ear (or cochlea), which convert sounds into electrical signals, and/or the auditory nerve fibers that transmit those signals to the brain. It is traditionally diagnosed by elevation in the sound level required to hear a brief tone, as revealed on an audiogram, the gold standard test of hearing sensitivity.

    “Hidden hearing loss,” on the other hand, refers to synaptopathy, or damage to the connections between the auditory nerve fibers and the sensory cells, a type of damage which happens well before the loss of the sensory cells themselves. Loss of these connections likely contributes to difficulties understanding speech in challenging listening environments, and may also be important in the generation of tinnitus (ringing in the ears) and/or hyperacusis (increased sensitivity to sound). Hidden hearing loss cannot be measured using the standard audiogram; thus, the Mass. Eye and Ear researchers set out to develop more sensitive measures that can also test for cochlear synaptopathy.

    Diagnostic measures for hidden hearing loss are important because they help us see the full extent of noise-induced damage to the inner ear. Better measurement tools will also be important in the assessment of future therapies to repair the nerve damage in the inner ear. Mass. Eye and Ear researchers have shown in animal models that, under some conditions, connections between the sensory cells and the auditory nerve can be successfully restored using growth factors, such as neurotrophins.

    “Establishing a reliable diagnosis of hidden hearing loss is key to progress in understanding inner ear disease,” said Dr. Maison. “Not only may this change the way patients are tested in clinic, but it also opens the door to new research, including understanding the mechanisms underlying a number of hearing impairments such as tinnitus and hyperacusis.”

    Authors on the PLOS ONE paper include Dr. Maison, M. Charles Liberman, Ph.D. of the Eaton-Peabody Laboratories of Mass. Eye and Ear, and Michael J. Epstein, Ph.D., and Sandra S. Cleveland, Au.D., CCC-A, of Northeastern University. Research supported by NIDCD RO1 DC00188 and NIDCD P30 DC 05209.


  8. Witnesses can catch criminals by smell: Human nose-witnesses identify criminals in a lineup of body odor

    June 10, 2016 by Ashley

    From the Frontiers media release:

    Move over, sniffer dogs: people who witnessed a crime are able to identify criminals by their smell. Police lineups normally rely on sight, but nose-witnesses can be just as reliable as eye-witnesses, new research published in Frontiers in Psychology has found.

    “Police often use human eye-witnesses, and even ear-witnesses, in lineups but, to date, there have not been any human nose-witnesses,” explained Professor Mats Olsson, experimental psychologist at the Karolinska Institutet in Sweden. “We wanted to see if humans can identify criminals by their body odor.”

    Dogs have been used to identify criminals through body odor identification in court, but it is commonly thought that the human sense of smell is inferior to that of other mammals. However, research shows that humans have the ability to distinguish individuals by their unique body odor. Our olfactory sense is often associated with emotional processing and is directly linked to the areas of the brain associated with emotion and memory: the hippocampus and the amygdala.

    To find out more about human odor memory following stressful events, Olsson and his team investigated how well we identify body odor in a forensic setup. In their first study, participants watched video clips of people committing violent crimes, accompanied by a body odor that they were told belonged to the perpetrator.

    They also watched neutral videos, with a similar setup. Then they identified the criminal’s body odor from a lineup of five different men’s odors, showing correct identification in almost 70% of cases. “It worked beyond my expectation,” explained Olsson. “Most interestingly, participants were far better at remembering and identifying the body odor involved in the emotional setting.”

    Olsson has tested the limits of our nose-witness ability. The team conducted the same experiment but varied the lineup size (three, five, and eight body odors) and the time between observing the videos and undertaking the lineup (15 minutes up to one week). In lineups of up to eight body odors, participants were still able to distinguish the criminal.

    The accuracy of their identification did reduce with the larger lineup size, which is in line with studies on eye- and ear-witnesses. The results also show that the ability to distinguish the criminal’s body odor is significantly impaired if the lineup is conducted one week after having smelt the offender’s body odor.
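
    To put those identification rates in context: chance performance in a forced-choice lineup is simply one divided by the number of odors. The short Python sketch below computes those chance levels and, using a hypothetical sample size (the release reports percentages but not how many participants they are based on), shows how unlikely a roughly 70% hit rate in a five-odor lineup would be under pure guessing.

        from math import comb

        def p_at_least(k, n, p):
            """Exact binomial tail: probability of at least k successes in n
            independent trials with per-trial success probability p."""
            return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

        # Chance level for a forced-choice lineup is 1 / lineup size.
        for lineup_size in (3, 5, 8):
            print(lineup_size, "odors -> chance level:", round(1 / lineup_size, 3))

        # Hypothetical illustration: if 14 of 20 nose-witnesses (70%) picked the
        # correct odor from a 5-odor lineup, guessing alone would almost never do so.
        print(p_at_least(14, 20, 1 / 5))   # roughly 1.8e-06

    None of these numbers come from the study itself; they only show why roughly 70% correct in a five-alternative lineup sits far above the 20% expected by chance.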

    There is ongoing research into how the memory of a crime scene can be affected by emotion. This is largely focused on visual memory as visual lineups are the common method of criminal identification.

    “Our work shows that we can distinguish a culprit’s body odor with some certainty,” concluded Olsson. “This could be useful in criminal cases where the victim was in close contact with the assailant but did not see them and so cannot visually identify them.”


  9. Emotions in the age of Botox

    May 31, 2016 by Ashley

    From the Sissa Medialab media release:

    Botulin injections in the facial muscles, which relax expression lines and make one’s skin appear younger as a result of a mild paralysis, have another, not easily predictable effect: they undermine the ability to understand the facial expressions of other people.

    This consequence, as SISSA scientists explain in a new research study, depends on a temporary block of proprioceptive feedback, a process that helps us understand other people’s emotions by reproducing them on our own bodies.

    By now we are all used to seeing its more or less successful results on Italian and international celebrities, but in fact the market of Botox-based procedures (cosmetic treatments that exploit the effects of type A botulin toxin) involves a large number of individuals. Just to give an idea, about 250,000 procedures were done in Italy in 2014. It is therefore natural to wonder about the possible side effects of this practice. One fairly unpredictable consequence concerns the emotional domain, and in particular the perception of emotional information and facial expressions. “The thankfully temporary paralysis of facial muscles that this toxin causes impairs our ability to capture the meaning of other people’s facial expressions,” explains Jenny Baumeister, research scientist at the International School for Advanced Studies (SISSA) in Trieste and first author of a study just published in the journal Toxicon (and carried out with the collaboration of Cattinara Hospital in Trieste).

    Baumeister’s intuition stems from a very well-known scientific theory, called embodiment. The idea is that the processing of emotional information, such as facial expressions, in part involves reproducing the same emotions on our own bodies. In other words, when we observe a smile, our face too tends to smile (often in an imperceptible and automatic fashion) as we try to make sense of that expression. However, if our facial muscles are paralyzed by Botox, then the process of understanding someone else’s emotional expression may turn out to be more difficult.

    Jenny Baumeister had a sample of subjects carry out a series of different tests assessing their understanding of emotions, immediately before and two weeks after they had had a Botox-based aesthetic procedure, and compared the measurements with those of a similar sample of subjects that had no treatment. Regardless of the type of measurement (judgements or reaction times), the effect of the paralysis was obvious.

    “The negative effect is very clear when the expressions observed are subtle. Instead, when the smile is wide and overt, the subjects were still able to recognize it, even if they had had the treatment,” explains Francesco Foroni, the SISSA researcher who coordinated the study. “For very intense stimuli, although there was a definite tendency to perform worse, the difference was not significant. On the other hand, for ‘equivocal’ stimuli that are more difficult to pick up, the effect of the paralysis was very strong.”

    The finding confirms the assumption that, to some extent at least, “embodied” processes help us understand emotions. It also suggests that the negative influence of Botox may be manifest precisely in those situations in which this help could prove most useful. For instance, think of a normal conversation between two individuals, where mutual understanding is vital to ensure proper social interaction: failure to pick up on emotional nuances or sudden changes in the other person’s mood can make the difference between successful communication and communication breakdown.

    “Our study was devised to investigate embodied cognition. At the same time, we think that awareness of this consequence will be of use to those involved in aesthetic medicine, not least to adequately inform people seeking to undergo these treatments,” commented Foroni.


  10. Critical to screen patients with rheumatoid arthritis for hearing impairment

    April 25, 2016 by Ashley

    From the Bentham Science Publishers media release:

    Rheumatoid arthritis (RA) is the most common autoimmune arthritis, affecting 1% of the general population. Although its main manifestations are articular, RA can involve extra-articular organs, including the auditory system.

    Hearing impairment (HI) in RA is multifactorial. The mechanism of injury and the predisposing factors are not clearly understood. Sensorineural hearing loss is the most common type in RA patients, with a prevalence of 25-72%. Possible pathologies include synovial destruction of the incudostapedial and incudomalleolar joints, rheumatoid nodules, auditory neuropathy, destruction of the cochlear hair cells, and drug-induced ototoxicity. “Elderly patients and those with long disease duration, active disease, seropositivity, elevated acute phase reactants and rheumatoid nodules are more likely to have HI,” demonstrated a recent study by Amir Emamifar, Kristine Bjørndal and Inger Marie Jensen Hansen.

    Environmental factors, for instance smoking, alcohol, and noise, can worsen the condition. Passive smokers are also at risk of HI. Long-term exposure to alcohol affects hearing in RA, with harmful effects on cochlear function.

    Results of pure tone audiometry revealed that RA patients have a high prevalence of HI across all frequencies. The Transiently Evoked Otoacoustic Emissions (TEOAEs) test has been used widely to evaluate cochlear function and is capable of detecting decreases in cochlear function in RA patients at an early stage of the disease.

    Treatment of HI in RA is empirical. Oral steroids and intensified disease-modifying antirheumatic drugs might be an option. Antioxidants (e.g., vitamin E) may play a protective role for the inner ear. Regular audiometric tests and TEOAEs should be performed. Patients will also benefit from the cessation of smoking and alcohol. Like other causes of HI in healthy individuals, HI in RA can also be managed with different types of hearing aids and implantable devices.