1. Study looks at how we perceive fictional characters

    May 14, 2017 by Ashley

    From the University at Buffalo press release:

    The Sopranos’ Tony Soprano and Walter White from Breaking Bad rank among recent television drama’s most notorious protagonists, each of questionable morality. So, here’s the question: Do you like them?

    If you answered “yes” as many people do, a recently published paper by a University at Buffalo researcher suggests why people might feel good about characters who do bad things.

    The research has implications ranging from better understanding our relationship to fictional narratives, to possibly improving the effectiveness of character-based public service campaigns, according to lead author Matthew Grizzard, an assistant professor in UB’s Department of Communication and an expert on the cognitive, emotional and psychobiological effects of media entertainment.

    The results, with co-authors Jialing Huang, Kaitlin Fitzgerald, Changhyun Ahn and Haoran Chu, all UB graduate students, appear in the journal Communication Research.

    Grizzard says why an audience may like or dislike a character has been a key question for media researchers since the 1970s.

    Morality matters. Viewers tend to like the good guys and dislike the bad guys. But Grizzard’s study, which builds on previous research, also suggests that we don’t necessarily need to see behavior to make a distinction between the hero and the villain.

    Grizzard’s study manipulated what characters looked like and measured audience perceptions. The researchers hoped to find out whether simple differences in appearance — for example, black clothes compared to lighter colors — would be enough for viewers to categorize a character as a hero or villain.

    Earlier research, meanwhile, had found that heroes and villains differ in morality but not competence.

    “Villains aren’t just immoral. They’re good at being bad,” according to Grizzard.

    The previous research gave Grizzard’s research team a pattern for determining whether they were activating perceptions of heroes and villains or whether they were simply creating biases unrelated to narrative characters.

    “If our data had come out where the heroic-looking character was both more moral and more competent than the villain, then we probably just created a bias. But because the hero was more moral than the villain but equally competent, we’re more confident that visuals can activate perceptions of heroic and villainous characters,” says Grizzard.

    Beyond an understanding of how visuals can influence perceptions of character, the findings indicate that judgments of characters depend on comparisons audiences make between characters — and the order of introduction plays a role.

    Heroes were judged to be more heroic when they appeared after a villain and villains were judged to be more villainous when they appeared after a hero.

    “What’s happening here is that we’re not making isolated judgements about these characters using some objective standard of morality,” says Grizzard. “We’re constantly making comparisons about these characters and the forces they face.”

    With Walter White, for example, the audience sees the evolution of a character whose ethics spiral progressively downward from season to season, and yet audiences remain loyal to him. That loyalty surprised Breaking Bad creator Vince Gilligan.

    Gilligan said in an interview that at some point he thought the audience would turn against the character. But Grizzard points out that the show’s antagonists concurrently get worse along with White.

    “We see a trend where White is not decreasing by himself. There’s likely to be a constant comparison with other characters,” says Grizzard. “So White is probably benefiting from the comparisons the audience is making.”

    Grizzard says it’s a polarization effect. The hero seems better when compared to the villain’s darkness while the villain seems worse when viewed in the hero’s light.

    But being bad isn’t, well, all bad, at least in the world of fictional narrative. It comes with certain advantages, according to Grizzard.

    “We find that characters who are perceived as villains get a bigger boost from the good actions or apparent altruism than heroes, like the Severus Snape character from the Harry Potter books and films.”

    These findings could be informative for public service campaigns or telenovelas that try to promote certain behaviors.

    “It shows the importance of how these characters are framed,” he says.


  2. Study suggests perceived knowledge of others’ “true selves” is inherently tied to perceptions of their morality

    May 8, 2017 by Ashley

    From the Texas A&M University press release:

    How can you ever really know someone? Researchers at Texas A&M University found our perceived knowledge of others’ “true selves” is inherently tied to perceptions of their morality.

    “We found evidence consistent with a strong psychological link between morality and identity,” says Andrew Christy, a researcher in Texas A&M’s Department of Psychology who specializes in social and personality psychology.

    For “The Reciprocal Relationship Between Perceptions of Moral Goodness and Knowledge of Others’ True Selves,” published in the journal Social Psychological and Personality Science, Christy and his co-authors hypothesized that people would report knowing the most about others who were portrayed as morally good. A secondary hypothesis was that this effect would work both ways, such that others who were described as being easily knowable would also be perceived as more moral than those described as being unknowable.

    The researchers presented participants with texts describing a person’s morality and competency, then asked them to measure how well they thought they knew the person’s “true self.” In another study, participants read a passage in which the author describes their roommate as either being readily knowable or almost completely unknowable. Afterwards they answered questions about their impressions of the target, including how moral vs. immoral they perceived the target to be.

    The researchers found a strong psychological link between morality and identity. “We found that participants reported knowing the most about targets when they were described as moral, compared to targets described in different ways (e.g. competent),” explains Christy. “Our results also suggested that this is a bidirectional relationship — participants perceived the knowable target as more moral than the unknowable target.”

    Christy says these findings show that most people operate on the default assumption that others are fundamentally good. “Thus when someone is kind to us or otherwise provides evidence of goodness, this effectively confirms our pre-existing assumptions about them and leads us to feel that we really do know them.”

    So when it comes to friendships and other relationships, he concludes, “people will feel most familiar with others who make morally good first impressions.”


  3. Study looks at link between testosterone and impulsivity

    May 5, 2017 by Ashley

    From the California Institute of Technology press release:

    Hotheaded, impulsive men who shoot first and ask questions later are a staple of Westerns and 1970s cop films, but new research shows there might be truth to the trope.

    A study conducted by researchers from Caltech, the Wharton School, Western University, and ZRT Laboratory tested the hypothesis that higher levels of testosterone increase the tendency in men to rely on their intuitive judgments and reduce cognitive reflection — a decision-making process by which a person stops to consider whether their gut reaction to something makes sense. The researchers found that men given doses of testosterone performed more poorly on a test designed to measure cognitive reflection than a group given a placebo.

    The research will appear in an upcoming issue of the journal Psychological Science.

    “What we found was the testosterone group was quicker to make snap judgments on brain teasers where your initial guess is usually wrong,” says Caltech’s Colin Camerer, the Robert Kirby Professor of Behavioral Economics and T&C Chen Center for Social and Decision Neuroscience Leadership Chair. “The testosterone is either inhibiting the process of mentally checking your work or increasing the intuitive feeling that ‘I’m definitely right.’”

    The study, which is one of the largest of its type ever conducted, included 243 males who were randomly selected to receive a dose of testosterone gel or placebo gel before taking a cognitive reflection test. A math task was also given to control for participant engagement, motivation level, and basic math skills.

    The questions included on the cognitive reflection test are exemplified by the following:

    A bat and a ball cost $1.10 in total. The bat costs $1 more than the ball.

    How much does the ball cost?

    For many people, the first answer that comes to mind is that the ball costs 10 cents, but that’s incorrect because then the bat costs only 90 cents more than the ball. The correct answer is that the ball costs 5 cents and the bat costs $1.05. An individual prone to relying on their gut instincts would be more likely to accept their first answer of 10 cents. However, another person might realize their initial error through cognitive reflection and come up with the correct answer.
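    The puzzle is just a small system of two linear equations, and the reasoning in the paragraph above can be checked directly. This sketch is purely illustrative; the variable names are mine, and working in cents sidesteps floating-point rounding:

```python
# Bat-and-ball puzzle: bat + ball = 1.10 and bat = ball + 1.00.
# Substituting the second equation into the first gives
# 2 * ball + 1.00 = 1.10, so ball = 0.05.
total = 110       # total price in cents
difference = 100  # bat costs this much more than the ball, in cents

ball = (total - difference) // 2  # 5 cents
bat = ball + difference           # 105 cents

# Sanity checks mirroring the puzzle's two constraints.
assert ball + bat == total
assert bat - ball == difference

# The intuitive answer (ball = 10) fails the second constraint:
# bat would be 100, only 90 cents more than the ball.
print(f"ball = {ball} cents, bat = {bat} cents")  # ball = 5 cents, bat = 105 cents
```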

    Participants were not limited on time while taking the test and were offered $1 for each correct answer and an additional $2 if they answered all the questions correctly.

    The results show that the group that received testosterone scored significantly lower than the group that received the placebo, on average answering 20 percent fewer questions correctly. The testosterone group also “gave incorrect answers more quickly, and correct answers more slowly than the placebo group,” the authors write. The same effect was not seen in the results of the basic math tests administered to both groups. The results “demonstrate a clear and robust causal effect of [testosterone] on human cognition and decision-making,” they conclude.

    The researchers believe that the phenomenon they’ve observed can be linked to testosterone’s effect of increasing confidence in humans. Testosterone is thought to generally enhance the male drive for social status, and recent studies have shown that confidence enhances status.

    “We think it works through confidence enhancement. If you’re more confident, you’ll feel like you’re right and will not have enough self-doubt to correct mistakes,” Camerer says.

    Camerer says the results of the study raise questions about potential negative effects of the growing testosterone-replacement therapy industry, which is primarily aimed at reversing the decline in sex drive many middle-aged men experience.

    “If men want more testosterone to increase sex drive, are there other effects? Do these men become too mentally bold, thinking they know things they don’t?”


  4. Study suggests charisma isn’t necessary to be a good boss

    April 26, 2017 by Ashley

    From the Michigan State University press release:

    You don’t need the charisma of Steve Jobs to be an effective boss, indicates new research led by Michigan State University business scholars.

    In a series of five studies, the researchers found bosses who were supportive and set clear expectations — but weren’t necessarily transformational leaders with grand visions — were able to effectively motivate their employees.

    The study, published in the journal Organizational Behavior and Human Decision Processes, is one of the first to examine how a leader’s regulatory focus, or mindset, affects his or her own behavior and, in turn, employees’ motivation.

    The findings suggest bosses can modify their mindset to produce a certain outcome from workers, whether that’s innovation or a more conservative work focus aimed at meeting basic obligations and preventing errors, said Brent Scott, MSU professor of management and study co-author.

    “Effective leadership may be based in part on a leader’s ability to recognize when a particular mental state is needed in their employees and to adapt their own mental state and their behaviors to elicit that mindset,” Scott said. “Part of the story here is that you don’t have to be Steve Jobs to be an effective leader. There is no one-size-fits-all approach to managing.”

    The research team, led by Russell Johnson, MSU associate professor of management, conducted field studies and experiments with hundreds of managers and employees from a variety of industries including professional services, manufacturing and government.

    Bosses who had an innovative mindset (called promotion focus) were more likely to lead in a transformative way and thus elicit an innovative mindset among employees. Bosses with a more conservative mindset (called prevention focus) were more likely to “manage by exception,” which involves focusing almost exclusively on preventing mistakes, thus eliciting a prevention focus among workers.

    “We found that the motivations of managers are contagious and ‘trickle down’ to their subordinates,” Johnson said. “Thus, if managers are unhappy with how their people are approaching work tasks, the managers might actually be the ones responsible for eliciting their motivation in the first place. Managers can modify their leadership behavior to trigger the appropriate motivation orientation in their employees to fit the situation.”

    While the study identified some transformational leaders who elicited an innovative mindset among employees, Scott said it’s unrealistic to think that every boss can be — or necessarily wants to be — a transformational leader all of the time. Some work situations and environments may call for a more preventative approach.

    A sweet spot for managing may be what’s called “contingent reward behavior,” which brings in both a promotion and prevention focus — emphasizing gains and providing both positive and negative reinforcement based on performance.

    “The contingent approach is quid pro quo — if you do this, I’ll give you that,” Scott said. “It’s not sexy like transformational leadership, but it’s something that just about every manager can do because it doesn’t require you to ooze charisma.”


  5. Narcissism and social networking

    April 23, 2017 by Ashley

    From the University of Würzburg press release:

    Social media sites such as Facebook and Twitter have become an important part of the lives of many people worldwide. Around two billion users were active on Facebook at the end of 2016; 500 million regularly post photos on Instagram and more than 300 million communicate via Twitter.

    Various studies conducted over the past years have investigated to what extent the use of social media is associated with narcissistic tendencies — with contradictory results. Some studies supported a positive relationship between narcissism and the use of Facebook, Twitter and the like, whereas others found only weak or even negative effects.

    Most comprehensive meta-analysis so far

    Fresh findings are now presented by scientists from the Leibniz Institute for Educational Trajectories Bamberg and the University of Würzburg. They were able to show that there is a weak to moderate link between a certain form of narcissism and social media activity. When taking a differentiated look at specific forms of behaviour or at the participants’ cultural background, the effect is even more pronounced in some cases.

    The study was led by Professor Markus Appel, who holds the Chair of Media Communication at the University of Würzburg, and Dr. Timo Gnambs, head of the Educational Measurement section at the Leibniz Institute for Educational Trajectories, Bamberg. For their meta-analysis, the scientists summarized the results of 57 studies comprising more than 25,000 participants in total. They have now published their findings in the Journal of Personality.
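    The press release does not say how the 57 effect sizes were pooled, but a standard fixed-effect approach for correlations, shown here purely as an illustration, averages Fisher-z-transformed correlations weighted by sample size. All study numbers below are hypothetical:

```python
import math

def fisher_z(r):
    """Fisher r-to-z transform; makes correlations approximately normal."""
    return 0.5 * math.log((1 + r) / (1 - r))

def inverse_fisher_z(z):
    """Back-transform a pooled z to a correlation."""
    return (math.exp(2 * z) - 1) / (math.exp(2 * z) + 1)

def pooled_correlation(studies):
    """Fixed-effect pooled correlation.

    `studies` is a list of (r, n) pairs; each study's z is weighted by
    n - 3, the inverse of the sampling variance of z.
    """
    num = sum((n - 3) * fisher_z(r) for r, n in studies)
    den = sum(n - 3 for r, n in studies)
    return inverse_fisher_z(num / den)

# Hypothetical example: three studies with contradictory results,
# echoing the mixed findings the article describes.
studies = [(0.25, 400), (0.10, 1200), (-0.05, 300)]
print(round(pooled_correlation(studies), 3))  # a weak positive pooled r
```

The large-sample study dominates the pooled estimate, which is why a meta-analysis can settle a literature of small, contradictory studies.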

    Forms of narcissism

    They think of themselves as being exceptionally talented, remarkable and successful. They love to present themselves to other people and seek approval from them: This is how psychologists describe the typical behaviour of people commonly referred to as narcissists. “Accordingly, social networks such as Facebook are believed to be an ideal platform for these people,” says Markus Appel.

    The network gives them easy access to a large audience and allows them to selectively post information for the purpose of self-promotion. Moreover, they can meticulously cultivate their image. Therefore, researchers have suspected social networking sites to be an ideal breeding ground for narcissists from early on.

    Three hypotheses

    The recently published meta-analysis shows that the situation does not seem to be as bad as feared. The scientists examined the truth behind three hypotheses. Firstly, the assumption that grandiose narcissists frequent social networking sites more often than representatives of another form of narcissism, the “vulnerable narcissists.” Vulnerable narcissism is associated with insecurity, fragile self-esteem, and social withdrawal.

    Secondly, they assumed that the link between narcissism and the number of friends and certain self-promoting activities is much more pronounced compared to other activities possible on social networking sites.

    Thirdly, the researchers hypothesized that the link between narcissism and the social networking behaviour is subject to cultural influences. In collectivistic cultures where the focus is on the community rather than the individual or where rigid roles prevail, social media give narcissists the opportunity to escape from prevalent constraints and present themselves in a way that would be impossible in public.

    The results

    The meta-analysis of the 57 studies did in fact confirm the scientists’ assumptions. Grandiose narcissists are encountered more frequently in social networks than vulnerable narcissists. Moreover, a link was found between narcissistic traits and both the number of friends a person has and the number of photos they upload. The gender and age of users are not relevant in this respect. Typical narcissists spend more time in social networks than average users, and they exhibit specific behavioural patterns.

    A mixed result was found for the influence of the cultural background on the usage behaviour. “In countries where distinct social hierarchies and unequal power division are generally more accepted such as India or Malaysia, there is a stronger correlation between narcissism and the behaviour in social media than in countries like Austria or the USA,” says Markus Appel.

    However, the analysis of the data from 16 countries on four continents does not show a comparable influence of the “individualism” factor.

    Generation Me

    So is the frequently cited “Generation Me” a product of social media such as Facebook and Instagram because they promote narcissistic tendencies? Or do these sites simply provide the ideal environment for narcissists? The two scientists were unable to answer these questions definitively.

    “We suggest that the link between narcissism and the behaviour in social media follows the pattern of a self-reinforcing spiral,” Markus Appel says. An individual disposition controls the social media activities and these activities in turn reinforce the disposition. To finally resolve this question, more research has to be conducted over longer periods.


  6. Study examines skills transferable from World of Warcraft

    by Ashley

    From the Missouri University of Science and Technology press release:

    “Stop playing that stupid video game and get a job.”

    It’s a sentiment expressed by generations of parents since Pong began invading unsuspecting households in 1975. But what if that “stupid game” could help you get a job, and what if that same game could make you a valuable team member once you had the job?

    A new study by researchers at Missouri University of Science and Technology found that World of Warcraft (WoW) gamers who were successful working as a team in “raids” had qualities that psychological studies have shown to translate to success on virtual workplace teams.

    These qualities include what psychologists call the Big Five personality traits — extraversion, agreeableness, openness, conscientiousness and neuroticism — as well as computer-mediated communication skills and technology readiness.

    The research team came to its conclusion by surveying 288 WoW gamers from across the massive multiplayer online role-playing game’s (MMORPG) many servers. Those surveyed were diverse in age, race, sex, class, occupation and location.

    The average survey taker played WoW eight hours a week and worked 38 hours a week — important because the research team wanted survey takers that had full-time jobs that potentially involved teamwork. The survey consisted of 140 questions asking about motivation, communication skills, preferences for teamwork, and personality, with most questions relating to the Big Five personality traits.

    WoW is the world’s most-subscribed-to MMORPG, with over 10 million subscribers. After creating a character, players explore an almost limitless virtual landscape. They complete quests and fight monsters, all while interacting and working with characters controlled by other players — a key aspect to the S&T research study.

    The team surveyed 288 players of the game’s fifth expansion set, Warlords of Draenor. They compared players’ survey answers to their characters’ statistics. A player’s group achievement points indicate how much group gameplay they’ve participated in, and how successful it has been, says Elizabeth Short, a graduate student in industrial-organizational psychology who compiled data for the study. Short’s research team is led by Dr. Nathan Weidner, assistant professor of psychological science at Missouri S&T.

    “What we wanted to look at was virtual teamwork and what kind of characteristics a person had in-game that would translate to real life and the workplace,” she says.

    Short called the correlations the research team found between a gamer’s WoW group achievements and player traits small but “statistically significant.” One of the strongest correlations the team found was in terms of technology readiness.

    “The more technologically ready you are, the more resilient around technology you are, the more adaptable you are, the more achievement points you have (in WoW). You could flip that,” she says. “The more achievements you have in game, the more technology savvy you are in real life. And that’s a good thing, especially in virtual communication teams and workplaces.”

    Short says that growing up, she was naturally shy and introverted, but WoW built confidence and helped her “shed her armor.” Short’s social confidence grew, and by the time she started college, she was able to communicate better, all because of what she learned playing WoW.

    “I loved WoW and I played it constantly,” she says. “Then I started college, and being able to use some of the things I learned in WoW, like talking and communicating with people during raids, helped me socially in school.”

    Short hopes that through the study, more gamers will find that their WoW confidence can be converted into the real world and a career.

    “I like the idea that there are aspects of gaming that help and strengthen a person with skills, knowledge and abilities to be able to transfer those skills into the workplace,” she says. “If it helps students like me, I want to see if it helps people in the workplace.

    “This research shows us that those skills, while not exactly the same, they transfer,” she adds.

    Short will be presenting the research team’s findings at the 32nd annual Society for Industrial and Organizational Psychology Conference in Orlando at the end of April.


  7. Visualizing future doesn’t increase delayed gratification

    April 20, 2017 by Ashley

    From the University of Pennsylvania press release:

    Some people are more impulsive than others.

    University of Pennsylvania researchers Joseph Kable and Trishala Parthasarathi wanted to understand why and whether that quality could change within an individual.

    They hypothesized, based on the field’s most recent research, that strong visualization of the future — vividly imagining how $40 earned in a month could be spent on an upcoming vacation, for example — might motivate someone to wait to receive a larger reward rather than take a smaller amount right away. It’s the concept known as delayed gratification.

    Kable and Parthasarathi actually discovered that the opposite was true: better visualizers were more impulsive. They published the findings in the journal Frontiers in Psychology.

    “When people have to make tradeoffs between something that’s in front of them right now and something that they can only get in the future, they differ in the extent to which they go for each outcome,” said Kable, the Baird Term Associate Professor of Psychology in Penn’s School of Arts & Sciences. As it turns out, “people who have imaginations with more vivid details are more likely to not delay gratification.”

    Or as Parthasarathi, a fifth-year Ph.D. student, explained, “Better visualizers tend to be more impulsive when they’re making choices about a smaller reward, accepting it immediately rather than waiting for a larger reward in the future.”

    To reach this conclusion, the research team devised an experiment that brought 38 adults with a median age of approximately 25 into the lab for a four-week intervention. At the start, each participant completed several decision-making tests and self-reporting surveys, including the Vividness of Visual Imagery Questionnaire, which asked participants to imagine in great detail a friend’s face or a setting sun, then rate on a scale of 1 to 5 how clearly they could see each.

    “A lower score on the scale indicated people were better able to imagine things than a higher score, which indicated people imagined things less clearly,” Parthasarathi said.

    The participants were then randomly split into two groups, one in which they got trained on improving their visualization skills, the other in which they practiced meditation. Twice weekly for the month, they worked with a health-and-wellness counselor on their respective areas.

    “People in the visualization group would think about two future goals, one at a time, and the process used to achieve each, how they felt after they achieved each and so on,” Parthasarathi said. “Those in the relaxation group were trained to think in the present, so breath awareness and attention to your body. Nothing related to thinking about the future.”

    Once the study period ended, participants completed the same battery of tests they’d taken at the beginning. Analyzing comparison data from the experiment’s start and finish provided the researchers with their counterintuitive results.

    “It certainly wasn’t what we expected. It’s surprising in light of the most recent work,” Kable said. But, he added, it’s less so if you think about findings from one of the original delayed-gratification experiments.

    Kable is referring to what’s today commonly called The Marshmallow Test. In the 1960s, Stanford University psychologist Walter Mischel offered children the opportunity to eat a single treat immediately or get double the amount if they could wait alone in the room until the researcher returned. Two plates — one with a single reward, the other with multiple — sat in plain view.

    “The thought was, ‘Your goal is right in front of you. You’ll be able to work toward it more,'” Kable explained. In fact, Mischel “found the direction of the association that we see: When the kids could see what they would get if they waited, they were more impulsive.”

    Interestingly, Parthasarathi and Kable also learned that improving someone’s visualization abilities can actually make that person more impatient.

    Despite results counter to what they expected, the researchers feel their work has real-world implications regarding impulsive behaviors. They now know that those keen on taking an immediate reward are more likely to use drugs or do poorly in school. They’re more likely to smoke and have a harder time quitting. So the psychologists can adjust behavior-changing treatments that accompany smoking cessation toward meditation and away from visualization, for instance.

    “The reason why we studied this task is we think it’s a microcosm that can tell us what people are doing outside the lab,” Kable said. “We’re still interested in what we can do to help people become more patient.”

    Funding for the work came from Penn’s Center for Interdisciplinary Research on Nicotine Addiction.


  8. Negotiations work best when both sides have matching personality traits

    April 7, 2017 by Ashley

    From the University of Georgia press release:

    Negotiations work best when both sides have matching personality traits — even if they’re both disagreeable — according to research from the University of Georgia Terry College of Business.

    Conventional wisdom would suggest that people who are outgoing and accommodating are better suited to negotiate, but a study co-authored by assistant professor of management Fadel Matta found two sides can reach accord through their common discord.

    “Normally, you would consider agreeableness — that you’re cooperative and kind — to be a good thing. And being disagreeable — being cold — to be a bad thing,” Matta said.

    “But with negotiations we find that’s not necessarily true. The same thing goes for someone who is extroverted. It’s not always a good thing when you’re entering negotiations.”

    At their core, negotiations are about a relationship. And like a relationship, they work best when both parties approach it the same way, he said. “If you’re a jerk and I’m a jerk, then it might seem like we’ll never get anywhere in negotiations, but it’s actually more useful to put two similarly minded people together,” Matta said.

    Matta and his co-authors based their research on the “Big Five” personality traits from psychological literature — conscientiousness, agreeableness, neuroticism, openness and extroversion. The study in the Journal of Applied Psychology focused on agreeableness and extroversion because of their interpersonal nature.

    “A lot of the research on personality shows that it has less of an effect than you would expect in negotiations, but that research has looked only at an individual’s personality,” Matta said. “We decided to look at the combination of personalities between two negotiators.”

    The authors surveyed more than 200 individuals about their personalities, then randomly assigned them a role in a mock negotiation between two companies. After reading background information, participants negotiated against each other in order to arrive at a settlement on seven issues related to human resource management and compensation. After a settlement was reached, participants were surveyed about their perceptions of the process and their partner.

    They found that while one person’s personality could not predict outcomes, the combination of both personalities led to consistent results. Negotiations between individuals with similar scores on agreeableness and extroversion tended to go more smoothly, finish more quickly and leave both parties with better impressions of the other than negotiations between dissimilar individuals, Matta said. Researchers attributed the results to more positive emotional displays, which occur when both negotiators have similar personalities.

    “The takeaway when entering negotiations is to consider both parties’ personalities and how they might mesh, instead of just deciding to send in a really well-liked and agreeable person,” Matta said. “It’s the combination of the two people that will determine how well the negotiations proceed.”

    The study was published in the Journal of Applied Psychology.


  9. Is impatience contagious?

    April 2, 2017 by Ashley

    From the PLOS press release:

    People tend to unconsciously imitate others’ prudent, impatient or lazy attitudes, according to a study published in PLOS Computational Biology.

    “Prudence,” “impatience” or “laziness” are typically thought of as entrenched personality traits that guide how people weigh the cost of risk, delay and effort (respectively). However, new research shows that people’s attitudes towards effort, delay, or risk drift towards those of others.

    Jean Daunizeau and Marie Devaine, from INSERM, Paris, combined mathematical modelling and cognitive psychology to explore the laws that govern such attitude alignment. The authors asked 56 participants to make a series of decisions involving risk, delay or effort, both before and after observing the decisions of fictitious participants (in fact, artificial-intelligence algorithms) whose prudent, patient and lazy attitudes were carefully calibrated.

    The study results show that participants exhibit a “false-consensus” bias: without evidence, they believe that others’ attitudes resemble their own. Participants also exhibit a “social influence” bias: their attitudes tend to become more similar to those of the people around them. Intriguingly, the strength of the social influence bias is partially determined by the false-consensus bias: it first increases with false consensus (for small false-consensus biases), but then decreases with it (for large false-consensus biases). Participants seem to be mostly unaware of both biases.

    Critically, mathematical simulations demonstrate that both biases, and the surprising interaction between them, are hallmarks of a unique mechanism that is ideally suited to learning both about and from others’ covert attitudes. This is at odds with the conventional view that attitude alignment is an automatism that is triggered by the need to experience (partly deluded) feelings of social conformity.
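    The interplay between the two biases can be illustrated with a toy simulation. This is a minimal sketch, not the authors’ actual model; the update rules and every parameter below are invented for illustration. The agent’s estimate of its partner’s attitude starts at its own attitude (false consensus), is updated from observation, and the agent’s own attitude then drifts toward that estimate (social influence):

    ```python
    def simulate_alignment(own, partner, n_obs=50, learn=0.3, influence=0.1):
        """Toy model: return the agent's attitude after observing a partner.

        own, partner -- attitudes on an arbitrary 0-1 scale
        learn        -- how quickly the estimate of the partner is updated
        influence    -- how strongly the agent drifts toward that estimate
        """
        estimate = own   # false consensus: the prior belief equals one's own attitude
        attitude = own
        for _ in range(n_obs):
            estimate += learn * (partner - estimate)       # learn about the partner
            attitude += influence * (estimate - attitude)  # drift toward the estimate
        return attitude

    # A patient agent (0.2) repeatedly observing an impatient partner (0.8)
    aligned = simulate_alignment(0.2, 0.8)  # ends up between 0.2 and 0.8
    ```

    With these made-up parameters the agent’s attitude ends up much closer to the partner’s than to its starting point, mimicking the drift the study describes.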

    “Our work is in line with an ongoing effort toward a computational (i.e., quantitative and refutable) understanding of human and animal cognition. In particular, we showed that formal information and decision theories provide invaluable insights regarding the nature and relationship of puzzling biases of social cognition,” say the researchers.

    The authors are currently applying this work to assess whether this form of attitude alignment may differ in people suffering from neuropsychiatric conditions, such as autism spectrum disorder and schizophrenia.


  10. People can match names to faces of strangers with surprising accuracy

    March 17, 2017 by Ashley

    From the American Psychological Association press release:

    If your name is Fred, do you look like a Fred? You might — and others might think so, too. New research published by the American Psychological Association has found that people appear to be better than chance at correctly matching people’s names to their faces, and it may have something to do with cultural stereotypes we attach to names.

    In the study, published in the Journal of Personality and Social Psychology, lead author Yonat Zwebner, a PhD candidate at The Hebrew University of Jerusalem at the time of the research, and colleagues conducted a series of experiments involving hundreds of participants in Israel and France. In each experiment, participants were shown a photograph and asked to select the given name that corresponded to the face from a list of four or five names. In every experiment, the participants were significantly better (25 to 40 percent accurate) at matching the name to the face than random chance (20 or 25 percent accurate, depending on the experiment), even when ethnicity, age and other socioeconomic variables were controlled for.

    The researchers theorize the effect may be due, in part, to cultural stereotypes associated with names, as they found the effect to be culture-specific. In one experiment conducted with students in both France and Israel, participants were given a mix of French and Israeli faces and names. The French students were better than random chance at matching only French names to French faces, and Israeli students were better at matching only Hebrew names to Israeli faces.

    In another experiment, the researchers trained a computer, using a learning algorithm, to match names to faces. In this experiment, which included over 94,000 facial images, the computer was also significantly more likely (54 to 64 percent accuracy) to be successful than random chance (50 percent accuracy).
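    Whether an observed accuracy genuinely beats chance can be checked with an exact one-sided binomial test. The sketch below uses invented numbers (35 correct out of 100 four-option trials, so 25 percent chance), not figures from the study:

    ```python
    from math import comb

    def binom_p(successes, trials, chance):
        """One-sided exact p-value: P(X >= successes) for X ~ Binomial(trials, chance)."""
        return sum(comb(trials, k) * chance**k * (1 - chance)**(trials - k)
                   for k in range(successes, trials + 1))

    # Hypothetical example: 35 correct out of 100 trials with 4 answer options
    p = binom_p(35, 100, 0.25)  # below the conventional 0.05 threshold
    ```

    An accuracy only slightly above the chance rate, by contrast, yields a large p-value, which is why the number of trials matters as much as the raw percentage.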

    This manifestation of the name in a face might be due to people subconsciously altering their appearance to conform to cultural norms and cues associated with their names, according to Zwebner.

    “We are familiar with such a process from other stereotypes, like ethnicity and gender, where the stereotypical expectations of others sometimes affect who we become,” said Zwebner. “Prior research has shown there are cultural stereotypes attached to names, including how someone should look. For instance, people are more likely to imagine a person named Bob to have a rounder face than a person named Tim. We believe these stereotypes can, over time, affect people’s facial appearance.”

    This was supported by findings of one experiment showing that areas of the face that can be controlled by the individual, such as hairstyle, were sufficient to produce the effect.

    “Together, these findings suggest that facial appearance represents social expectations of how a person with a particular name should look. In this way, a social tag may influence one’s facial appearance,” said co-author Ruth Mayo, PhD, also from The Hebrew University of Jerusalem. “We are subject to social structuring from the minute we are born, not only by gender, ethnicity and socioeconomic status, but by the simple choice others make in giving us our name.”