1. Study suggests entitled people don’t follow instructions because they see them as ‘unfair’

    January 13, 2018 by Ashley

    From the Society for Personality and Social Psychology press release:

    From job applications to waiting in line at the DMV, instructions, and the expectation that we follow them, are everywhere. Recent research found that people with a greater sense of entitlement are less likely to follow instructions than less entitled people, because they view the instructions as an unfair imposition on them. The results appear in the journal Social Psychological and Personality Science.

    Scientists already know entitled people — technically, individuals with a higher sense of entitlement — are more likely to believe they deserve preferences and resources that others don’t and that they are less concerned about what is socially acceptable or beneficial. For authors Emily Zitek (Cornell University) and Alexander Jordan (Harvard Medical School), understanding the reasons for their behavior could lead to solutions as well.

    “The fact that there are a lot of complaints these days about having to deal with entitled students and entitled employees,” says Zitek, “suggests the need for a solution.”

    Zitek and Jordan conducted a series of studies, first to see who was more likely to avoid following instructions in a word search. After establishing that people who scored high on measures of entitled personality were less likely to follow instructions, they provided a set of scenarios to try to understand why the entitled individuals ignore the instructions: selfishness, control, or punishment. But none of these affected the outcomes; entitled people still wouldn’t follow instructions.

    The researchers were surprised that it was so hard to get entitled individuals to follow instructions.

    “We thought that everyone would follow instructions when we told people that they would definitely get punished for not doing so, but entitled individuals still were less likely to follow instructions than less entitled individuals,” said Zitek.

    A final set of experiments, exploring fairness, finally got to the reason: “Entitled people do not follow instructions because they would rather take a loss themselves than agree to something unfair,” wrote the authors.

    “A challenge for managers, professors, and anyone else who needs to get people with a sense of entitlement to follow instructions is to think about how to frame the instructions to make them seem fairer or more legitimate,” said Zitek.

    Zitek and Jordan write that organizations and societies run more smoothly when people are willing to follow instructions.


  2. Study suggests mirror neuron activity predicts people’s decision-making in moral dilemmas

    by Ashley

    From the University of California – Los Angeles press release:

    It is wartime. You and your fellow refugees are hiding from enemy soldiers, when a baby begins to cry. You cover her mouth to block the sound. If you remove your hand, her crying will draw the attention of the soldiers, who will kill everyone. If you smother the child, you’ll save yourself and the others.

    If you were in that situation, which was dramatized in the final episode of the ’70s and ’80s TV series “M*A*S*H,” what would you do?

    The results of a new UCLA study suggest that scientists could make a good guess based on how the brain responds when people watch someone else experience pain. The study found that those responses predict whether people will be inclined to avoid causing harm to others when facing moral dilemmas.

    “The findings give us a glimpse into what is the nature of morality,” said Dr. Marco Iacoboni, director of the Neuromodulation Lab at UCLA’s Ahmanson-Lovelace Brain Mapping Center and the study’s senior author. “This is a foundational question to understand ourselves, and to understand how the brain shapes our own nature.”

    In the study, which was published in Frontiers in Integrative Neuroscience, Iacoboni and colleagues analyzed mirror neurons, brain cells that respond equally when someone performs an action or simply watches someone else perform the same action. Mirror neurons play a vital role in how people learn through mimicry and feel empathy for others.

    When you wince while seeing someone experience pain — a phenomenon called “neural resonance” — mirror neurons are responsible.

    Iacoboni wondered if neural resonance might play a role in how people navigate complicated problems that require both conscious deliberation and consideration of another’s feelings.

    To find out, researchers showed 19 volunteers two videos: one of a hypodermic needle piercing a hand, and another of a hand being gently touched by a cotton swab. During both, the scientists used a functional MRI machine to measure activity in the volunteers’ brains.

    Researchers later asked the participants how they would behave in a variety of moral dilemmas, including the scenario involving the crying baby during wartime, the prospect of torturing another person to prevent a bomb from killing several other people and whether to harm research animals in order to cure AIDS.

    Participants also responded to scenarios in which causing harm would make the world worse — inflicting harm on another person in order to avoid two weeks of hard labor, for example — to gauge their willingness to cause harm for moral reasons and for less-noble motives.

    Iacoboni and his colleagues hypothesized that people who had greater neural resonance than the other participants while watching the hand-piercing video would also be less likely to choose to silence the baby in the hypothetical dilemma, and that proved to be true. Indeed, people with stronger activity in the inferior frontal cortex, a part of the brain essential for empathy and imitation, were less willing to cause direct harm, such as silencing the baby.

    But the researchers found no correlation between people’s brain activity and their willingness to hypothetically harm one person in the interest of the greater good — such as silencing the baby to save more lives. Those decisions are thought to stem from more cognitive, deliberative processes.

    The study confirms that genuine concern for others’ pain plays a causal role in moral dilemma judgments, Iacoboni said. In other words, a person’s refusal to silence the baby is due to concern for the baby, not just the person’s own discomfort in taking that action.

    Iacoboni’s next project will explore whether a person’s decision-making in moral dilemmas can be influenced by decreasing or enhancing activity in the areas of the brain that were targeted in the current study.

    “It would be fascinating to see if we can use brain stimulation to change complex moral decisions through impacting the amount of concern people experience for others’ pain,” Iacoboni said. “It could provide a new method for increasing concern for others’ well-being.”

    The research could point to a way to help people with mental disorders such as schizophrenia that make interpersonal communication difficult, Iacoboni said.

    The study’s first author is Leo Moore, a UCLA postdoctoral scholar in psychiatry and biobehavioral sciences. Paul Conway of Florida State University and the University of Cologne, Germany, is the paper’s other co-author.

    The study was supported by the National Institute of Mental Health, the Brain Mapping Medical Research Organization, the Brain Mapping Support Foundation, the Pierson-Lovelace Foundation, the Ahmanson Foundation, the William M. and Linda R. Dietel Philanthropic Fund at the Northern Piedmont Community Foundation, the Tamkin Foundation, the Jennifer Jones-Simon Foundation, the Capital Group Companies Charitable Foundation, the Robson family, and the Northstar Fund.


  3. Study suggests eating more foods with choline during pregnancy could boost baby’s brain

    January 12, 2018 by Ashley

    From the Cornell University press release:

    When expectant mothers consume sufficient amounts of the nutrient choline during pregnancy, their offspring gain enduring cognitive benefits, a new Cornell University study suggests.

    Choline — found in egg yolks, lean red meat, fish, poultry, legumes, nuts and cruciferous vegetables — has many functions, but this study focused on its role in prenatal brain development.

    The researchers, who published their findings online in The FASEB Journal, used a rigorous study design to show cognitive benefits in the offspring of pregnant women who daily consumed close to twice the currently recommended amount of choline during their last trimester.

    “In animal models using rodents, there’s widespread agreement that supplementing the maternal diet with additional amounts of this single nutrient has lifelong benefits on offspring cognitive function,” said Marie Caudill, professor of nutritional sciences and the study’s first author. “Our study provides some evidence that a similar result is found in humans.”

    The finding is important because choline is in high demand during pregnancy yet most women consume less than the recommended 450 milligrams per day.

    “Part of that is due to current dietary trends and practices,” said Richard Canfield, a developmental psychologist in the Division of Nutritional Sciences and the senior author of the study. “There are a lot of choline-rich foods that have a bad reputation these days,” he said. Eggs, for example, are high in cholesterol, and health professionals, including those in the government, have raised caution about pregnant women consuming undercooked eggs, which may deter women from eating them altogether, even though such risks are low for pasteurized or cooked eggs, Canfield said. Red meats are often avoided for their high saturated fat content, and liver is not commonly eaten, he added.

    Two previous studies by other research teams had mixed results after examining cognitive effects of maternal choline supplementation, perhaps due to study designs that were not tightly controlled, Caudill said.

    In this study, 26 women were randomly divided into two groups and all the women consumed exactly the same diet. Intake of choline and other nutrients was tightly controlled, which was important since the metabolism and functions of choline can overlap with those of such nutrients as vitamin B12, folic acid and vitamin B6.

    “By ensuring that all the nutrients were provided in equal amounts, we could be confident that the differences in the infants resulted from their choline intake,” Caudill said. In this study, half the women received 480 mg/day of choline, slightly more than the adequate intake level, and the other half received 930 mg/day.

    Canfield and co-author Laura Muscalu, a lecturer in the Department of Psychology at Ithaca College, tested infant information processing speed and visuospatial memory at 4, 7, 10 and 13 months of age. They timed how long each infant took to look toward an image on the periphery of a computer screen, a measure of the time it takes for a cue to produce a motor response. The test has been shown to correlate with IQ in childhood. Also, research by Canfield and others shows that infants who demonstrate fast processing speeds when young typically continue to be fast as they age.

    While offspring in both groups showed cognitive benefits, information processing speeds were significantly faster for the infants whose mothers consumed 930 mg/day than for the infants whose mothers consumed 480 mg/day over the same period.

    Though the study has a small sample, it suggests that current recommendations for daily choline intake may not be enough to produce optimal cognitive abilities in offspring, Canfield said. Current choline intake recommendations are based on amounts required to prevent liver dysfunction, and were extrapolated from studies done in men in part because no studies had investigated requirements during pregnancy.

    The study was funded by the Egg Nutrition Center, the Beef Checkoff, the U.S. Department of Agriculture, the Institute for the Social Sciences, the Bronfenbrenner Life Course Center, and the National Institute of Food and Agriculture.


  4. Study suggests punishment might not always be as effective as we think

    by Ashley

    From the Hokkaido University press release:

    Punishment might not be an effective means to get members of society to cooperate for the common good, according to a social dilemma experiment.

    A game to study human behavior has shown punishment is an ineffective means for promoting cooperation among players. The result has implications for understanding how cooperation has evolved to have a formative role in human societies.

    Human societies maintain their stability by forming cooperative partnerships. But cooperation often comes at a cost. For example, a person taking time to raise the alarm in order to alert other members of a group to impending danger could lose valuable time to save themselves. It is unclear why natural selection favors cooperativeness among individuals who are inherently selfish.

    In theoretical studies, punishment is often seen as a means to coerce people into being more cooperative. To examine such theory, a team of international researchers led by Marko Jusup of Hokkaido University in Japan and Zhen Wang of Northwestern Polytechnical University in China has conducted a “social dilemma experiment.” The team investigated if providing punishment as an option helps improve the overall level of cooperation in an unchanging network of individuals.

    They used a version of the commonly employed “prisoner’s dilemma” game. Two hundred and twenty-five students in China were organized into three trial groups and played 50 rounds each of the game.

    In group one, every student played with two opponents who changed every round. The students could choose between “cooperate” or “defect,” and points were given based on the combined choices made. If a student and the two opponents chose “defect,” the student gained zero points. If they all chose “cooperate,” the student gained four points. If only a student chose to defect while the other two chose to cooperate, the gain for the student was eight points.

    The second group was similar to the first one in every aspect except that the people playing the game with each other remained the same for the duration of the 50 rounds, enabling them to learn each other’s characteristics.

    In the third group, players also remained the same. However, a new option, “punish,” was introduced. Choosing punishment led to a small reduction in points for the punisher and a larger reduction of points for the punishees.

    At the end of the game, overall points were counted and the students were given monetary compensation based on the number of points won.
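    The scoring rules described above can be sketched as a small function. The release gives exact point values only for the all-defect, all-cooperate, and sole-defector outcomes, and does not specify the punishment costs, so the punishment values below (and the handling of unspecified outcomes) are illustrative assumptions, not the study's actual parameters.

    ```python
    # Sketch of the three-player scoring rules described above.
    # Only the three outcomes named in the release are scored; the punishment
    # costs (-1 for the punisher, -4 per punisher faced) are assumed values.

    COOPERATE, DEFECT, PUNISH = "cooperate", "defect", "punish"

    def payoff(own_choice, opponent_choices):
        """Points earned by one player in a single round."""
        points = 0
        if own_choice == COOPERATE and all(c == COOPERATE for c in opponent_choices):
            points = 4  # mutual cooperation: everyone gains
        elif own_choice == DEFECT and all(c == COOPERATE for c in opponent_choices):
            points = 8  # sole defector exploits two cooperators
        elif own_choice == DEFECT and all(c == DEFECT for c in opponent_choices):
            points = 0  # mutual defection: nobody gains
        if own_choice == PUNISH:
            points -= 1  # assumed small cost to the punisher
        points -= 4 * sum(c == PUNISH for c in opponent_choices)  # assumed larger cost to the punished
        return points
    ```

    Under these assumed values, a sole defector facing two cooperators earns 8 points, while a player who punishes two cooperating opponents ends the round at -1, which illustrates why punishing is costly for both sides.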

    The expectation is that, as individuals play more with the same opponents over several rounds, they see the benefit of cooperating in order to gain more points. Introducing punishment as an option is basically saying: if you don’t cooperate with me, I’ll punish you. In theory, it is expected that applying this option would lead to more cooperation.

    The researchers found that players in the constantly changing groups cooperated much less (4%) than those in the static groups (38%), where they were able to establish which players were willing to cooperate and thus gain a larger average financial payoff for all involved.

    Surprisingly, however, adding punishment as an option did not improve the level of cooperation (37%). The final financial payoffs in this trial group were also, on average, significantly less than those gained by players in the static group. Interestingly, less defection was seen in the punishment group when compared to the static group; some players replaced defection with punishment.

    “While the implied message when punishing someone is ‘I want you to be cooperative,’ the immediate effect is more consistent with the message ‘I want to hurt you,’” write the researchers in their study published in the journal Proceedings of the National Academy of Sciences.

    Punishment seems to have an overall demoralizing effect, as individuals who get punished on multiple occasions may see a good chunk of their total payoff vanish in a short period of time, explain the researchers. This could lead players to lose interest in the game and play the remaining rounds with less of a rational strategy. The availability of punishment as an option also seems to reduce the incentive to choose cooperation over competition.

    Why, then, is punishment so pervasive in human societies? “It could be that human brains are hardwired to derive pleasure from punishing competitors,” says Jusup. “However, it is more likely that, in real life, a dominant side has the ability to punish without provoking retaliation,” adds Wang.

    Although the study provides valuable insights into how cooperation arises in human society, the team advises it would be unwise to extrapolate the implications of their study far beyond the experimental setting.


  5. Study suggests food cues undermine healthy eating choices

    January 11, 2018 by Ashley

    From the Universiteit van Amsterdam (UvA) press release:

    Obesity has become a major health issue due to the current ‘obesogenic’ environment in which unhealthy food is both easy and cheap to purchase. As a result, many (government) organisations encourage healthy eating habits among the general public by providing information on healthy diets. Nevertheless, when people encounter stimuli that they have learned to associate with certain snacks, they tend to choose those products, even when they know these are unhealthy. This is the finding of research carried out by psychologists Aukje Verhoeven, Poppy Watson and Sanne de Wit from the University of Amsterdam (UvA).

    The researchers investigated the effects of health warnings on food choices in the presence or absence of food-associated stimuli. This covers any kind of stimulus associated with food, from adverts that trigger thoughts of a tasty snack to the sight or smell of food that leads to craving.

    ‘Health warnings often make people want to choose healthier food products, yet many still end up picking unhealthy food products’, says Verhoeven. ‘We suspected this might partly be due to the fact that people learn to associate specific cues in their environment with certain food choices. For example, eating a cheeseburger regularly occurs in the visual presence of a large M logo. This causes a strong association between the stimulus (the logo) and the rewarding experience of eating a cheeseburger. Simply seeing an M eventually causes us to crave a burger and triggers a learned behaviour to head to a fast-food restaurant. Unhealthy choices are therefore automatically activated by learned associations, making health warnings, which focus on conscious choices, ineffective.’

    To test their hypothesis, the researchers used a specific computer task, a Pavlovian-instrumental transfer task, in a controlled setting to simulate the learning processes between certain (food) choices and environmental stimuli in subjects. ‘Health warnings for healthy food choices only seem to be effective in an environment where no food cues are present. Whenever stimuli are present which people have come to associate with certain snacks, they choose the accompanying (unhealthy) food product, even when they know it is unhealthy or aren’t really craving that food product. It didn’t matter whether we alerted the subjects before or after they learned the associations with food cues’, says Verhoeven.

    How do you ensure people don’t just have the intention to buy healthier food products but actually go ahead and do so? The researchers suggest decreasing the level of food-associated stimuli people, and children in particular, are exposed to. One way to do this, for example, would be to decrease the amount of advertising for unhealthy foods. Also, the results suggest that these processes could in turn stimulate the choice for healthy products. Verhoeven: ‘It is worthwhile exposing people to healthy food products together with certain environmental cues more often, for example by showing more adverts for healthy products. The environment could also be shaped such that healthy choices are the easiest to make, for instance by placing healthy products at the front in canteens or by replacing chocolate bars with apples and healthy snacks at the cash register. In this way, you give people a gentle push in the right direction.’


  6. Study suggests consumers less likely to go through with purchases when on mobile devices

    by Ashley

    From the University of East Anglia press release:

    Shoppers hoping to bag a bargain in the post-Christmas sales are much less likely to go through with their purchases if they are using phones and tablets to buy goods online.

    This is because consumers often worry they are not seeing the full picture on a mobile app or that they could be missing out on special offers or overlooking hidden costs, according to new research. Concerns about privacy and security can also motivate people to put items into their shopping baskets but then quit without paying.

    Although mobile apps are rapidly becoming among the most popular ways to shop online, shopping cart abandonment is much more common than in desktop-based online shopping. According to market research firm Criteo, mobile devices accounted for 46% of global e-commerce traffic in Q2 2016, yet only 27% of purchases initiated on mobile were completed, and conversion rates lagged significantly behind desktop-initiated purchases.

    Researchers at the University of East Anglia (UEA) investigating why this is so say it represents a huge challenge for online retailers, who are investing heavily in mobile shopping, but not reaping the rewards in successful sales.

    “Our study results revealed a paradox,” said Dr Nikolaos Korfiatis, of Norwich Business School at UEA. “Mobile shopping is supposed to make the process easier, and yet concerns about making the right choice, or about whether the site is secure enough, lead to an ‘emotional ambivalence’ about the transaction – and that means customers are much more likely to simply abandon their shopping carts without completing a purchase.”

    The researchers studied online shopping data from 2016-2017 from consumers in Taiwan and the US. They found that the reasons for hesitation at the checkout stage were broadly the same in both countries. In addition, shoppers are much more likely to use mobile apps as a way of researching and organising goods, rather than as a purchasing tool, and this also contributes to checkout hesitation.

    “People think differently when they use their mobile phones to make purchases,” said Dr Korfiatis. “The smaller screen size and uncertainty about missing important details about the purchase make you much more ambivalent about completing the transaction than when you are looking at a big screen.”

    Flora Huang, the study’s lead author, added: “This is a phenomenon that has not been well researched, yet it represents a huge opportunity for retailers. Companies spend a lot of money on tactics such as pay-per-click advertising to bring consumers into online stores – but if those consumers come in via mobile apps and then are not finalising their purchases, a lot of that money will be wasted.”

    The team’s results, published in the Journal of Business Research, showed that consumers are much less likely to abandon their shopping baskets if they are satisfied with the choice process. App designers can help by minimising clutter to include only necessary elements on the device’s limited screen space and organising sites via effective product categorisation or filter options so consumers can find products more easily.

    Other strategies that might prompt a shopper to complete a purchase include adding special offers, or coupons for a nearby store at the checkout stage.

    “Retailers need to invest in technology, but they need to do it in the right way, so the investment pays off,” added Dr Korfiatis. “Customers are becoming more and more demanding and, with mobile shopping in particular, they don’t forgive failures so offering a streamlined, integrated service is really important.”

    The article ‘Mobile shopping cart abandonment: the roles of conflicts, ambivalence and hesitation’, by GH Huang, N Korfiatis, CT Chang, appears in the Journal of Business Research, published by Elsevier.



  7. Study suggests gene influences performance of sleep-deprived people

    January 10, 2018 by Ashley

    From the Washington State University press release:

    Washington State University researchers have discovered a genetic variation that predicts how well people perform certain mental tasks when they are sleep deprived.

    Their research shows that individuals with a particular variation of the DRD2 gene are resilient to the effects of sleep deprivation when completing tasks that require cognitive flexibility, the ability to make appropriate decisions based on changing information.

    Sleep-deprived people with two other variations of the gene tend to perform much more poorly on the same kinds of tasks, the researchers found.

    The DRD2 dopamine receptor gene influences the processing of information in the striatum, a region of the brain that is known to be involved in cognitive flexibility.

    “Our work shows that there are people who are resilient to the effects of sleep deprivation when it comes to cognitive flexibility. Surprisingly, these same people are just as affected as everyone else on other tasks that require different cognitive abilities, such as maintaining focus,” said Paul Whitney, a WSU professor of psychology and lead author of the study, which appeared in the journal Scientific Reports. “This confirms something we have long suspected, namely that the effects of sleep deprivation are not general in nature, but rather depend on the specific task and the genes of the person performing the task.”

    Why sleep loss affects us differently

    When deprived of sleep, some people respond better than others. Scientists have identified genes associated with this, but they have wondered why the effects of sleep loss tend to vary widely across both individuals and cognitive tasks. For example, after a day without sleep, some people might struggle with a reaction time test but perform well on decision-making tasks, or vice versa.

    In the current study, Whitney, along with colleagues John Hinson, WSU professor of psychology, and Hans Van Dongen, director of the WSU Sleep and Performance Research Center at WSU Spokane, compared how people with different variations of the DRD2 gene performed on tasks designed to test both their ability to anticipate events and their cognitive flexibility in response to changing circumstances.

    Forty-nine adults participated in the study at the WSU Spokane sleep laboratory. After a 10-hour rest period, 34 participants were randomly selected to go 38 hours without sleep while the other participants were allowed to sleep normally.

    Before and after the period of sleep deprivation, subjects were shown a series of letter pairings on a computer screen and told to click the left mouse button for a certain letter combination (e.g., an A followed by an X) and the right mouse button for all other letter pairs. After a while, both the sleep-deprived group and the rested group were able to identify the pattern and click correctly for various letter pairs.

    Then came the tricky part: in the middle of the task, researchers told the participants to now click the left mouse button for a different letter combination. The sudden switch confounded most of the sleep-deprived participants, but those who had a particular variation of the DRD2 gene handled the switch as well as they did when well-rested.
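    The response rule in this task can be sketched in a few lines. The specific letter pairs, the new target after the switch, and the function name below are illustrative assumptions; the release describes only the general rule (left button for one target combination, right button for everything else) and a mid-task change of target.

    ```python
    # Minimal sketch of the letter-pair response rule described above.
    # The target pairs are illustrative; the study's actual stimuli are not
    # given in the release.

    def correct_response(pair, target=("A", "X")):
        """Return the correct mouse button for a letter pair under the current rule."""
        return "left" if pair == target else "right"

    # Before the switch: left button only for an A followed by an X.
    assert correct_response(("A", "X")) == "left"
    assert correct_response(("B", "X")) == "right"

    # Mid-task the target changes (here, hypothetically, to B followed by Y);
    # the old target now requires the opposite response.
    assert correct_response(("A", "X"), target=("B", "Y")) == "right"
    assert correct_response(("B", "Y"), target=("B", "Y")) == "left"
    ```

    The difficulty for sleep-deprived participants was not the rule itself but updating it: responses learned under the old target had to be inhibited and remapped after the switch.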

    “Our research shows this particular gene influences a person’s ability to mentally change direction when given new information,” Van Dongen said. “Some people are protected from the effects of sleep deprivation by this particular gene variation but, for most of us, sleep loss does something to the brain that simply prevents us from switching gears when circumstances change.”

    Training to cope with sleep loss

    Sleep deprivation’s effect on cognitive flexibility can have serious consequences, especially in high stakes, real-world situations like an emergency room or military operations where the ability to respond to changing circumstances is critical. For example, after a night without sleep, a surgeon might notice a spike in a patient’s vital signs midway through a procedure but be unable to use this information to decide on a better course of action.

    The WSU research team is currently applying what they learned from their study to develop new ways to help surgeons, police officers, soldiers and other individuals who regularly deal with the effects of sleep deprivation in critical, dynamic settings cope with the loss of cognitive flexibility.

    “Our long-term goal is to be able to train people so that no matter what their genetic composition is, they will be able to recognize and respond appropriately to changing scenarios, and be less vulnerable to sleep loss,” Whitney said. “Of course, the more obvious solution is to just get some sleep, but in a lot of real-world situations, we don’t have that luxury.”


  8. Study finds no link between watching TV crime shows and ability to conceal crime

    by Ashley

    From the Universität Mainz press release:

    Does watching the work of fictional forensic investigators on TV influence viewers? There is a belief that this is the case and that the consequences of people watching shows such as the American crime drama television series “CSI: Crime Scene Investigation” are filtering through into real life, a phenomenon that has been called the CSI effect. In the worst case, it is feared, potential criminals will learn from these shows how to better conceal a crime. In addition, concerns have been expressed that members of U.S. juries may now have excessive expectations regarding the evidence and as a result are more likely to acquit the accused. A team of psychologists at Johannes Gutenberg University Mainz working under Professor Heiko Hecht have now sounded the all-clear — at least in one respect. In an experimental study, the German researchers found no evidence of a correlation between watching forensic science TV shows and the ability to get away with committing a crime. This is the first study to look at the question of whether criminals could profit from viewing dramas of this sort.

    CSI: Crime Scene Investigation is a popular U.S. TV series which first hit the small screen in its home country in 2000. It focuses on the characters and the work of a team of forensic crime scene investigators. The effect named for the series soon came to cover any repercussions such widely viewed crime shows were held to have on the general public — including criminals, the police, and potential students of forensic medicine. “Over many years, it was presumed that certain links in this regard exist, although there were no appropriate studies to prove this,” said Dr. Andreas Baranowski. He and his colleagues at Mainz University have now undertaken four separate investigations of related claims with the aim of obtaining the most reliable possible findings.

    As a first step, the psychologists took a look at statistics from the databases of the FBI and its German equivalent, the Bundeskriminalamt (BKA), and compared the crime detection rates during the years preceding the launch of the CSI series with the subsequent rates. Then they asked 24 convicted criminals in prisons for their opinions on series such as CSI and whether they thought such shows could help when it came to escaping prosecution. Thirdly, the researchers put together a complex experimental design to find out whether viewers of TV shows like CSI would, as trial subjects, actually be better equipped to erase the traces of a mock crime. Baranowski and his colleagues completed their series of trials with a fourth test, in which a crime was re-enacted with the help of a doll’s house.

    No CSI learning effect for criminals

    On the whole, the researchers did not find any connection between watching forensic dramas and the ability to successfully avoid detection after committing a crime. However, in the fourth part of the experiment, male subjects performed better than female subjects, younger subjects better than older ones, and more highly educated subjects better than less well educated ones. Study subjects working in technical professions, primarily men, appear to have certain advantages when it comes to concealing crimes.

    Baranowski pointed out that something like the CSI effect had already been postulated in the past. Ever since Sherlock Holmes, and again as police procedurals such as Quincy and Law & Order appeared on TV, warning voices made themselves heard that the wrong kind of people could benefit from the insights provided. “Every time something new emerges, there are people who focus on one aspect and, without full and proper consideration, sense possible risks and thus call for bans.” The findings in this context can be said to pour cold water on attitudes like this. “We can now dispel certain of the myths that have been coursing through the media and other publications for the past 20 years, because we are able to state with relative certainty that people who watch CSI are no better at covering their tracks than other people.”

    Dr. Andreas Baranowski supervised the study, “The CSI-education effect: Do potential criminals benefit from forensic TV series?,” at the Division of General Experimental Psychology at Johannes Gutenberg University Mainz and works as a postdoctoral researcher in psychology at Giessen University.


  9. Study suggests adolescent brain makes learning easier

    January 9, 2018 by Ashley

    From the Leiden Universiteit press release:

    The brains of adolescents respond more strongly to rewards. This can lead to risky behaviour, but, according to Leiden University research, it also has a positive function: it makes learning easier. This work has been published in Nature Communications.

    Alcohol abuse, reckless behaviour and poor choice in friends: all these are inextricably linked to puberty and adolescence. In the late teens, young people test their limits, and in many cases push beyond them. This is due in part to increased activity in the corpus striatum, a small area hidden away deep inside the brain. According to previous research, that part of the brain is more responsive to rewards in young people.

    Sensitive

    Leiden University scientists are now able to show that this increased activity in the corpus striatum does not have only negative consequences. ‘The adolescent brain is very sensitive to feedback,’ says Sabine Peters, assistant professor of developmental and educational psychology and lead author of the article. ‘That makes adolescence the ideal time to acquire and retain new information.’

    Peters used a large data set of MRI scans for her research. Over a period of five years, no fewer than 736 brain scans were made of a total of 300 subjects between the ages of 8 and 29. According to Peters, the data set is about ten times larger than those of most comparable studies. In the MRI scanner, participants played a memory game, during which the researchers gave feedback on their performance.

    Instructional feedback

    ‘It showed that adolescents responded keenly to educational feedback’, says Peters. ‘If an adolescent received useful feedback, you saw the corpus striatum being activated. This was not the case with less pertinent feedback, for example when the participant already knew the answer. The more strongly your brain registers that difference, the better the performance in the learning task. Brain activation could even predict learning performance two years into the future.’

    It has been known for some time that adolescent brains respond more strongly to the same reward than the brains of small children or adults do. For example, it has already been shown that the use of drugs and/or alcohol in the teenage years is linked to powerful activation in the brain’s reward system. Peters: ‘It explains why adolescents and young adults go on a voyage of discovery, with all the positive and negative consequences that entails. You see the same behaviour in many animal species, including rats and mice.’


  10. Blueberry vinegar improves memory in mice with amnesia

    by Ashley

    From the American Chemical Society press release:

    Dementia affects millions of people worldwide, robbing them of their ability to think, remember and live as they once did. In the search for new ways to fight cognitive decline, scientists report in ACS’ Journal of Agricultural and Food Chemistry that blueberry vinegar might offer some help. They found that the fermented product could restore cognitive function in mice.

    Recent studies have shown that the brains of people with Alzheimer’s disease, the most common form of dementia, have lower levels of the signaling compound acetylcholine and its receptors. Research has also demonstrated that blocking acetylcholine receptors disrupts learning and memory. Drugs to stop the breakdown of acetylcholine have been developed to fight dementia, but they often don’t last long in the body and can be toxic to the liver. Natural extracts could be a safer treatment option, and some animal studies suggest that these extracts can improve cognition. Additionally, fermentation can boost the bioactivity of some natural products. So Beong-Ou Lim and colleagues wanted to test whether vinegar made from blueberries, which are packed with a wide range of active compounds, might help prevent cognitive decline.

    To carry out their experiment, the researchers administered blueberry vinegar to mice with induced amnesia. Measurements of molecules in their brains showed that the vinegar reduced the breakdown of acetylcholine and boosted levels of brain-derived neurotrophic factor, a protein associated with maintaining and creating healthy neurons. To test how the treatment affected cognition, the researchers analyzed the animals’ performance in mazes and an avoidance test, in which the mice would receive a low-intensity shock in one of two chambers. The treated rodents showed improved performance in both of these tests, suggesting that the fermented product improved short-term memory. Thus, although further testing is needed, the researchers say that blueberry vinegar could potentially be a promising food to help treat amnesia and cognitive decline related to aging.