1. Study suggests moviegoers’ taste in movies is idiosyncratic and at odds with critics’ preferences

    May 19, 2017 by Ashley

    From the New York University press release:

    Our taste in movies is notably idiosyncratic and not linked to the demographic traits that studios target, finds a new study on film preferences. The work also shows that moviegoers’ ratings are not necessarily in line with those of critics.

    “What we find enjoyable in movies is strikingly subjective — so much so that the industry’s targeting of film goers by broad demographic categories seems off the mark,” says Pascal Wallisch, a clinical assistant professor in New York University’s Department of Psychology and the senior author of the study, which appears in the journal Projections.

    “Critics may be adept at evaluating films, but that doesn’t mean their assessments will accurately predict how much the public will like what they see,” adds co-author Jake Whritner, a graduate of the Cinema Studies Program at NYU’s Tisch School of the Arts and currently part of the Cognitive and Data Science Lab at Rutgers University-Newark.

    Over the past century, filmmakers have sought to control viewers’ attention through different editing and shooting techniques, with the assumption that the audience will respond in the same way.

    However, while neural and attentional processing has been extensively examined, the level of agreement in the appraisal of movies among viewers has not been studied. Similarly, while past research has analyzed the relationship between reviews and box-office success as well as agreement among critics for films, none have explored agreement between critics and the general public.

    To address these questions, the researchers considered more than 200 major motion pictures, taking into account popularity, financial success, and critics’ reviews. They then surveyed over 3,000 participants, asking them to give a rating of how much they liked each of the films in the sample that they had previously seen. The researchers also asked participants to provide demographic information (e.g., age, gender) and whether they consider movie critics’ recommendations in choosing which movies to see.

    Finally, Wallisch and Whritner gathered reviews from 42 publicly accessible critics or rating sites (e.g., IMDb) for each of the films in the sample.

    The results generally showed low levels of correlation in movie preferences among study participants. However, there were some patterns. As the number of jointly seen films increased, so did the correlation of the ratings for those films — at least up to a point. When the number of jointly seen films reached between 100 and 120, the correlation peaked; as this number continued to increase, the correlation began to dip before spiking again at around 180 commonly seen films.
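    To make the correlation measure concrete, here is a minimal sketch (not the authors’ code; the viewers, films, and ratings are invented for illustration) of how agreement between two participants can be computed over the films both have seen, using the study’s 0-to-4-star scale.

```python
# Minimal sketch of pairwise rating agreement; not the study's actual analysis.
# Ratings use a 0-4 star scale; the films and values below are made up.
import numpy as np

viewer_a = {"Film 1": 4, "Film 2": 1, "Film 3": 3, "Film 4": 0, "Film 5": 2}
viewer_b = {"Film 1": 3, "Film 3": 1, "Film 4": 2, "Film 5": 2, "Film 6": 4}

# Agreement is computed only over jointly seen films.
shared = sorted(set(viewer_a) & set(viewer_b))
a = np.array([viewer_a[f] for f in shared], dtype=float)
b = np.array([viewer_b[f] for f in shared], dtype=float)

r = np.corrcoef(a, b)[0, 1]          # Pearson correlation of the two rating vectors
mean_gap = np.abs(a - b).mean()      # average star difference between the two viewers
print(f"{len(shared)} jointly seen films, r = {r:.2f}, mean gap = {mean_gap:.2f} stars")
```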

    Looking at demographics, the survey showed greater agreement in film ratings among male participants than among female participants — and this difference between genders was statistically significant. However, agreement about films among both men and women was relatively low. There was also little correlation between movie ratings and age — however, because the overall sample skewed younger, the significance of this result is limited. In general, neither gender nor age had much of an effect on inter-subjective agreement. Overall, this low inter-subjective agreement could account for the vehement disagreements between people over whether a given movie was good: on average, one could expect a disagreement of about 1.25 stars on a scale from 0 to 4 stars.

    Turning to correlations with movie critics, the connection between the ratings of critics and any given participant was no better than the average correlation between participants. Even a critic as well regarded as the late Roger Ebert did no better at predicting how much someone would like a movie than a randomly picked participant in the sample. In contrast, critics agreed with each other relatively strongly. In fact, the best predictor of a critic’s response to a film was the response of other critics, while the best predictor of a non-critic’s response was the aggregated evaluations of other non-critics, such as those on the Internet Movie Database (IMDb) — but not the aggregated ratings of critics, such as those on Rotten Tomatoes. So it is not the aggregation of ratings per se that improves predictability, but the aggregation of non-critic ratings.

    “Something about being a critic seems to make the recommendations of critics unsuitable for predicting the movie taste of regular people,” the authors conclude. “This study is the first to quantify this in an adequately powered fashion, and it helps to explain why people often perceive critics to be out of touch.

    “There are some people in our sample who are ‘superpredictors’ — they perform as well as the best aggregated non-critic ratings when it comes to predicting what average non-critics will like. Short of these exceptional predictors, if someone seeks recommendations about what to see, their best bet is to either consult sites that aggregate individual judgements, or to find other individuals or critics with similar tastes.”


  2. Study looks at how personalities change under the influence of alcohol

    by Ashley

    From the Association for Psychological Science press release:

    People typically report substantive changes to their personality when they become intoxicated, but observations from outsiders suggest less drastic differences between “sober” and “drunk” personalities, according to research published in Clinical Psychological Science, a journal of the Association for Psychological Science.

    “We were surprised to find such a discrepancy between drinkers’ perceptions of their own alcohol-induced personalities and how observers perceived them,” says psychological scientist Rachel Winograd of the University of Missouri, St. Louis — Missouri Institute of Mental Health. “Participants reported experiencing differences in all factors of the Five Factor Model of personality, but extraversion was the only factor robustly perceived to be different across participants in alcohol and sober conditions.”

    Winograd and colleagues speculate that this discrepancy may come down to inherent differences in point of view:

    “We believe both the participants and raters were both accurate and inaccurate — the raters reliably reported what was visible to them and the participants experienced internal changes that were real to them but imperceptible to observers,” she explains.

    The idea that we transform into different people when we’re under the influence is a popular one. And systematic differences in an individual’s sober behavior and their drunken behaviors can even inform clinical determinations about whether someone has a drinking problem. But the science on “drunk personality” as a concept is less clear. In Winograd’s previous studies, participants reliably reported that their personality changes when they imbibe, but experimental evidence for this kind of global change was lacking.

    Winograd and colleagues decided to bring the question into the lab, where they could carefully calibrate alcohol consumption and closely monitor individual behavior. They recruited 156 participants, who completed an initial survey gauging their typical alcohol consumption and their perceptions of their own “typical sober” personality and “typical drunk” personality.

    Later, the participants came to the lab in friend groups of 3 or 4, where the researchers administered a baseline breathalyzer test and measured the participants’ height and weight. Over the course of about 15 minutes, each participant consumed beverages — some drank Sprite, while others consumed individually-tailored vodka and Sprite cocktails designed to produce a blood alcohol content of about .09.

    After a 15-minute absorption period, the friends worked through a series of fun group activities — including discussion questions and logic puzzles — intended to elicit a variety of personality traits and behaviors.

    The participants completed personality measures at two points during the lab session. And outside observers used video recordings to complete standardized assessments of each individual’s personality traits.

    As expected, participants’ ratings indicated change in all five of the major personality factors. After drinking, participants reported lower levels of conscientiousness, openness to experience, and agreeableness, and they reported higher levels of extraversion and emotional stability (the inverse of neuroticism).

    The observers, on the other hand, noted fewer differences across the sober and intoxicated participants’ personality traits. In fact, observer ratings indicated reliable differences in only one personality factor: extraversion. Specifically, participants who had consumed alcohol were rated higher on three facets of extraversion: gregariousness, assertiveness, and levels of activity.

    Given that extraversion is the most outwardly visible personality factor, it makes sense that both parties noted differences in this trait, the researchers argue.

    They acknowledge, however, that they cannot rule out other influences — such as participants’ own expectations of their drunk personality — that may have contributed to the discrepancy in ratings.

    “Of course, we also would love to see these findings replicated outside of the lab — in bars, at parties, and in homes where people actually do their drinking,” says Winograd.

    “Most importantly, we need to see how this work is most relevant in the clinical realm and can be effectively included in interventions to help reduce any negative impact of alcohol on peoples’ lives,” she concludes.


  3. Personality factors are best defense against losing your job to a robot

    May 15, 2017 by Ashley

    From the University of Houston press release:

    Worried robots will take your job? Researchers say people who are more intelligent and who showed an interest in the arts and sciences during high school are less likely to fall victim to automation.

    Later educational attainment mattered, but researchers said the findings highlight the importance of personality traits, intelligence and vocational interests in determining how well people fare in a changing labor market. The work was published this week in the European Journal of Personality.

    “Robots can’t perform as well as humans when it comes to complex social interactions,” said Rodica Damian, assistant professor of social and personality psychology at the University of Houston and lead author of the study. “Humans also outperform machines when it comes to tasks that require creativity and a high degree of complexity that is not routine. As soon as you require flexibility, the human does better.”

    Researchers used a dataset of 346,660 people from the American Institutes for Research, which tracked a representative sample of Americans over 50 years, looking at personality traits and vocational interests in adolescence, along with intelligence and socioeconomic status. It is the first study to look at how a variety of personality and background factors predict whether a person will select jobs that are more (or less) likely to be automated in the future.

    “We found that regardless of social background, people with higher levels of intelligence, higher levels of maturity and extraversion, higher interests in arts and sciences … tended to select (or be selected) into less computerizable jobs 11 and 50 years later,” they wrote.

    In addition to Damian, the researchers included Marion Spengler of the University of Tuebingen and Brent W. Roberts of the University of Illinois at Urbana-Champaign.

    Damian said the findings suggest traditional education may not be fully equipped to address upcoming changes in the labor market, although she acknowledged the educational system has changed since the research subjects were in school in the 1960s.

    “Perhaps we should consider training personality characteristics that will help prepare people for future jobs,” she said.

    The researchers found that every 15-point increase in IQ predicted a 7 percent drop in the probability of one’s job being computerized, the equivalent of saving 10.19 million people from losing their future careers to computerization if it were extrapolated across the entire U.S. population. Similarly, an increase of one standard deviation in maturity or in scientific interests — equal to an increase of 1 point on a 5-point scale, such as moving from being indifferent to scientific activities to liking them fairly well — across the U.S. population would each be equivalent to 2.9 million people avoiding a job loss to computerization.
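    As a rough back-of-the-envelope check of the extrapolation above (not taken from the paper), the sketch below reads the “7 percent drop” as 7 percentage points and assumes a base of roughly 146 million U.S. workers, a figure the press release does not state; under those assumptions, the quoted headcounts follow directly.

```python
# Back-of-the-envelope check of the extrapolation quoted above.
# ASSUMPTIONS (not stated in the press release): the base population is the
# U.S. labor force, taken here as ~146 million people, and the "7 percent
# drop" is read as a 7-percentage-point drop in the probability of one's
# job being computerized.

labor_force = 146_000_000        # assumed base population
drop_per_15_iq_points = 0.07     # assumed 7-percentage-point drop per 15 IQ points

people_spared_by_iq = labor_force * drop_per_15_iq_points
print(f"IQ effect: {people_spared_by_iq / 1e6:.2f} million people")   # ~10.2 million (press release: 10.19)

# The 2.9-million figure quoted for a one-standard-deviation rise in maturity
# or scientific interests implies a drop of roughly 2 percentage points on the
# same assumed base.
implied_drop = 2_900_000 / labor_force
print(f"Implied drop for maturity/interests: {implied_drop:.1%}")     # ~2.0%
```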

    While IQ is not easily changed, a solution could be to find effective interventions to increase some personality traits — doing well in social interactions, for example, or being industrious — or interest in activities related to the arts and sciences, Damian said.

    Machine learning and big data will allow the number of tasks that machines can perform better than humans to increase so rapidly that merely increasing educational levels won’t be enough to keep up with job automation, she said. “The edge is in unique human skills.”

    Still, such skills can correlate with more education, and the researchers say an across-the-board increase in U.S. education levels could mean millions fewer jobs at risk. Targeting at-risk groups would yield significant benefits, she said.

    And while skeptics question whether the labor market will be able to absorb millions of higher skilled workers, Damian looks at it differently.

    “By preparing more people, at least more people will have a fighting chance,” she said.


  4. Study looks at how we perceive fictional characters

    May 14, 2017 by Ashley

    From the University at Buffalo press release:

    The Sopranos’ Tony Soprano and Walter White from Breaking Bad rank among recent television drama’s most notorious protagonists, each of questionable morality. So, here’s the question: Do you like them?

    If you answered “yes” as many people do, a recently published paper by a University at Buffalo researcher suggests why people might feel good about characters who do bad things.

    The research has implications ranging from better understanding our relationship to fictional narratives, to possibly improving the effectiveness of character-based public service campaigns, according to lead author Matthew Grizzard, an assistant professor in UB’s Department of Communication and an expert on the cognitive, emotional and psychobiological effects of media entertainment.

    The results, with co-authors Jialing Huang, Kaitlin Fitzgerald, Changhyun Ahn and Haoran Chu, all UB graduate students, appear in the journal Communication Research.

    Grizzard says the reasons an audience may like or dislike a character have been a key question for media researchers since the 1970s.

    Morality matters. Viewers tend to like the good guys and dislike the bad guys. But Grizzard’s study, which builds on previous research, also suggests that we don’t necessarily need to see behavior to make a distinction between the hero and the villain.

    Grizzard’s study manipulated what characters looked like and measured audience perceptions. The researchers hoped to find out whether simple differences in appearance — for example, black clothes compared to lighter colors — would be enough for viewers to categorize a character as a hero or villain.

    Earlier research, meantime, had found that heroes and villains differ in morality but not competence.

    “Villains aren’t just immoral. They’re good at being bad,” according to Grizzard.

    The previous research gave Grizzard’s research team a pattern for determining whether they were activating perceptions of heroes and villains or whether they were simply creating biases unrelated to narrative characters.

    “If our data had come out where the heroic-looking character was both more moral and more competent than the villain, then we probably just created a bias. But because the hero was more moral than the villain but equally competent, we’re more confident that visuals can activate perceptions of heroic and villainous characters,” says Grizzard.

    Beyond an understanding of how visuals can influence perceptions of character, the findings indicate that judgments of characters depend on comparisons audiences make between characters — and the order of introduction plays a role.

    Heroes were judged to be more heroic when they appeared after a villain and villains were judged to be more villainous when they appeared after a hero.

    “What’s happening here is that we’re not making isolated judgements about these characters using some objective standard of morality,” says Grizzard. “We’re constantly making comparisons about these characters and the forces they face.”

    With Walter White, for example, the audience sees the evolution of a character whose ethics progressively spiral downward from season to season, and yet audiences remained loyal to him. That loyalty surprised Breaking Bad creator Vince Gilligan.

    Gilligan said in an interview that at some point he thought the audience would turn against the character. But Grizzard points out that the show’s antagonists concurrently get worse along with White.

    “We see a trend where White is not decreasing by himself. There’s likely to be a constant comparison with other characters,” says Grizzard. “So White is probably benefiting from the comparisons the audience is making.”

    Grizzard says it’s a polarization effect. The hero seems better when compared to the villain’s darkness while the villain seems worse when viewed in the hero’s light.

    But being bad isn’t, well, all bad, at least in the world of fictional narrative. It comes with certain advantages, according to Grizzard.

    “We find that characters who are perceived as villains get a bigger boost from the good actions or apparent altruism than heroes, like the Severus Snape character from the Harry Potter books and films.”

    These findings could be informative for public service campaigns or telenovelas that try to promote certain behaviors.

    “It shows the importance of how these characters are framed,” he says.


  5. Study suggests perceived knowledge of others’ “true selves” is inherently tied to perceptions of their morality

    May 8, 2017 by Ashley

    From the Texas A&M University press release:

    How can you ever really know someone? Researchers at Texas A&M University found our perceived knowledge of others’ “true selves” is inherently tied to perceptions of their morality.

    “We found evidence consistent with a strong psychological link between morality and identity,” says Andrew Christy, a researcher in Texas A&M’s Department of Psychology who specializes in social and personality psychology.

    For “The Reciprocal Relationship Between Perceptions of Moral Goodness and Knowledge of Others’ True Selves,” published in the journal Social Psychological and Personality Science, Christy and his co-authors hypothesized that people would report knowing the most about others who were portrayed as morally good. A secondary hypothesis was that this effect would work both ways, such that others who were described as being easily knowable would also be perceived as more moral than those described as being unknowable.

    The researchers presented participants with texts describing a person’s morality and competency, then asked them to rate how well they thought they knew the person’s “true self.” In another study, participants read a passage in which the author describes their roommate as either being readily knowable or almost completely unknowable. Afterwards, they answered questions about their impressions of the target, including how moral or immoral they perceived the target to be.

    The researchers found a strong psychological link between morality and identity. “We found that participants reported knowing the most about targets when they were described as moral, compared to targets described in different ways (e.g. competent),” explains Christy. “Our results also suggested that this is a bidirectional relationship — participants perceived the knowable target as more moral than the unknowable target.”

    Christy says these findings show that most people operate on the default assumption that others are fundamentally good. “Thus when someone is kind to us or otherwise provides evidence of goodness, this effectively confirms our pre-existing assumptions about them and leads us to feel that we really do know them.”

    So when it comes to friendships and other relationships, he concludes, “people will feel most familiar with others who make morally good first impressions.”


  6. New study suggests positive side to worrying

    May 2, 2017 by Ashley

    From the University of California – Riverside press release:

    Worry — it does a body good. And, the mind as well. A new paper by Kate Sweeny, psychology professor at the University of California, Riverside, argues there’s an upside to worrying.

    “Despite its negative reputation, not all worry is destructive or even futile,” Sweeny said. “It has motivational benefits, and it acts as an emotional buffer.”

    In her latest article, “The Surprising Upsides of Worry,” published in Social and Personality Psychology Compass, Sweeny breaks down the role of worry in motivating preventive and protective behavior, and how it leads people to avoid unpleasant events. Sweeny finds worry is associated with recovery from traumatic events, adaptive preparation and planning, recovery from depression, and partaking in activities that promote health and prevent illness. Furthermore, people who report greater worry may perform better — in school or at the workplace — seek more information in response to stressful events, and engage in more successful problem solving.

    Worry as a Motivator

    The motivational power of worry has been studied and linked to preventive health behavior, like seatbelt use. In a nationally representative sample of Americans, feelings of worry about skin cancer predicted sunscreen use. And participants who reported higher levels of cancer-related worries also conducted breast self-examinations, underwent regular mammograms, and sought clinical breast examinations.

    “Interestingly enough, there are examples of a more nuanced relationship between worry and preventive behavior as well,” Sweeny said. “Women who reported moderate amounts of worry, compared to women reporting relatively low or high levels of worry, are more likely to get screened for cancer. It seems that both too much and too little worry can interfere with motivation, but the right amount of worry can motivate without paralyzing.”

    In the paper, Sweeny noted three explanations for worry’s motivating effects.

    1. Worry serves as a cue that the situation is serious and requires action. People use their emotions as a source of information when making judgements and decisions.

    2. Worrying about a stressor keeps the stressor at the front of one’s mind and prompts people toward action.

    3. The unpleasant feeling of worry motivates people to find ways to reduce their worry.

    “Even in circumstances when efforts to prevent undesirable outcomes are futile, worry can motivate proactive efforts to assemble a ready-made set of responses in the case of bad news,” Sweeny said. “In this instance, worrying pays off because one is actively thinking of a ‘plan B.'”

    Worry as a Buffer

    Worry can also benefit one’s emotional state by serving as an emotional benchmark. Compared to the state of worry, any other feeling is pleasurable. In other words, the pleasure that comes from a good experience is heightened if it is preceded by a bad experience.

    “If people’s feelings of worry over a future outcome are sufficiently intense and unpleasant, their emotional response to the outcome they ultimately experience will seem more pleasurable in comparison to their previous, worried state,” Sweeny said.

    Research on bracing for the worst provides indirect evidence for the role of worry as an emotional buffer, according to Sweeny. As people brace for the worst, they embrace a pessimistic outlook to mitigate potential disappointment, boosting excitement if the news is good. Therefore, both bracing and worrying have an emotional payoff following the moment of truth.

    “Extreme levels of worry are harmful to one’s health. I do not intend to advocate for excessive worrying. Instead, I hope to provide reassurance to the helpless worrier — planning and preventive action is not a bad thing,” Sweeny said. “Worrying the right amount is far better than not worrying at all.”


  7. How dogs interact with others plays a role in decision-making

    May 1, 2017 by Ashley

    From the Canisius College press release:

    Researchers at Canisius College found that the relationship between two dogs living in the same household may impact how much influence they have on each other’s behavior. Dogs who showed little to no aggression towards their housemates were more likely to automatically follow each other than dogs who viewed their canine housemates as rivals. The study results are reported in the latest issue of Animal Cognition.

    To conduct this study, Christy Hoffman, Ph.D., and Malini Suchak, Ph.D., assistant professors of Animal Behavior, Ecology, and Conservation, traveled to 37 multi-dog households to test the dogs in their own homes. This was an unusual approach, since most studies test dogs in the laboratory and often pair them with other dogs they don’t know. “We really wanted to look at the impact of the relationship between the dogs on their behavior, and doing that in a setting natural to the dogs, with dogs they already know, is really important,” Suchak says.

    To classify how competitive household dogs were with each other, the owners were asked to fill out a survey known as the Canine Behavioral Assessment and Research Questionnaire (C-BARQ). Dogs that were low in rivalry never or rarely displayed aggressive behavior toward the other dog. Dogs that were high in rivalry displayed some degree of aggression around valuable resources, suggesting they have a more competitive nature. After their owners completed the C-BARQ, the dogs participated in a simple task. A research assistant placed two plates containing food in front of both dogs. One dog was allowed to approach the plates and eat the food from one plate before being walked out of the room. At that point, the second dog was allowed to make a choice. If the second dog followed the first dog, he arrived at an empty plate. If he didn’t follow the first dog, he went straight to the plate that still contained food.

    Dogs that were low in rivalry were more likely to follow the first dog and frequently ended up at the empty plate. What surprised the researchers was that low rivalry dogs only blindly followed the demonstrator when allowed to make their choice immediately. “Low and high rivalry dogs only differed in the choices they made when there was no delay,” Hoffman says. “When they had to wait 5 seconds before making their choice, all dogs tended to go directly to the full plate.”

    Suchak adds, “This suggests that the low rivalry dogs may have been automatically following their housemates. When we forced the dogs to wait, it was as if the low rivalry dogs actually took the time to think about the situation, and they went straight for the food.”

    The researchers also tested the dogs in a condition where a human removed the food from one plate before the dog made a choice. Interestingly, low rivalry dogs were more likely to follow the human demonstrator when there was no delay, a finding that paralleled what happened with dog demonstrators. Hoffman suggests this may have to do with the personality of low rivalry dogs, “Since the tendency of the low rivalry dogs to follow was seen when the demonstrator was both another dog and a human, competitiveness may be a characteristic of the individual that extends beyond their relationship with other dogs.”

    This means that if owners have a high rivalry dog “that dog may be more likely to think for himself, and less likely to blindly follow, than a dog that is less competitive,” says Hoffman. “On the whole, our findings show there is variation in the ways dogs make decisions and that how dogs interact with others plays a big role in how they respond under conditions that require quick thinking.”


  8. Narcissism and social networking

    April 23, 2017 by Ashley

    From the University of Würzburg press release:

    Social media sites such as Facebook and Twitter have become an important part of the lives of many people worldwide. Around two billion users were active on Facebook at the end of 2016; 500 million regularly post photos on Instagram and more than 300 million communicate via Twitter.

    Various studies conducted over the past years have investigated to what extent the use of social media is associated with narcissistic tendencies — with contradictory results. Some studies supported a positive relationship between narcissism and the use of Facebook, Twitter and the like, whereas others found only weak or even negative effects.

    Most comprehensive meta-analysis so far

    Fresh findings are now presented by scientists from the Leibniz Institute for Educational Trajectories in Bamberg and the University of Würzburg. They were able to show that there is a weak to moderate link between a certain form of narcissism and social media activity. When taking a differentiated look at specific forms of behaviour or at the participants’ cultural background, the effect is even more pronounced in some cases.

    The study is managed by Professor Markus Appel, who holds the Chair of Media Communication at the University of Würzburg, and Dr. Timo Gnambs, head of the Educational Measurement section at the Leibniz Institute for Educational Trajectories, Bamberg. For their meta-analysis, the scientists summarized the results of 57 studies comprising more than 25,000 participants in total. They have now published their findings in the Journal of Personality.

    Forms of narcissism

    They think of themselves as being exceptionally talented, remarkable and successful. They love to present themselves to other people and seek approval from them: This is how psychologists describe the typical behaviour of people commonly referred to as narcissists. “Accordingly, social networks such as Facebook are believed to be an ideal platform for these people,” says Markus Appel.

    The network gives them easy access to a large audience and allows them to selectively post information for the purpose of self-promotion. Moreover, they can meticulously cultivate their image. Therefore, researchers have suspected social networking sites to be an ideal breeding ground for narcissists from early on.

    Three hypotheses

    The recently published meta-analysis shows that the situation does not seem to be as bad as feared. The scientists examined the truth behind three hypotheses. Firstly, the assumption that grandiose narcissists frequent social networking sites more often than representatives of another form of narcissism, the “vulnerable narcissists.” Vulnerable narcissism is associated with insecurity, fragile self-esteem, and social withdrawal.

    Secondly, they assumed that the link between narcissism and the number of friends and certain self-promoting activities is much more pronounced compared to other activities possible on social networking sites.

    Thirdly, the researchers hypothesized that the link between narcissism and the social networking behaviour is subject to cultural influences. In collectivistic cultures where the focus is on the community rather than the individual or where rigid roles prevail, social media give narcissists the opportunity to escape from prevalent constraints and present themselves in a way that would be impossible in public.

    The results

    The meta-analysis of the 57 studies did in fact confirm the scientists’ assumptions. Grandiose narcissists are encountered more frequently in social networks than vulnerable narcissists. Moreover, a link was found between traits associated with narcissism and both the number of friends a person has and the number of photos they upload. The gender and age of users are not relevant in this respect. Typical narcissists spend more time in social networks than average users, and they exhibit specific behavioural patterns.

    A mixed result was found for the influence of the cultural background on the usage behaviour. “In countries where distinct social hierarchies and unequal power division are generally more accepted such as India or Malaysia, there is a stronger correlation between narcissism and the behaviour in social media than in countries like Austria or the USA,” says Markus Appel.

    However, the analysis of the data from 16 countries on four continents does not show a comparable influence of the “individualism” factor.

    Generation Me

    So is the frequently cited “Generation Me” a product of social media such as Facebook and Instagram because they promote narcissistic tendencies? Or do these sites simply provide the ideal environment for narcissists? The two scientists were not able to answer these questions conclusively.

    “We suggest that the link between narcissism and the behaviour in social media follows the pattern of a self-reinforcing spiral,” Markus Appel says. An individual disposition controls the social media activities and these activities in turn reinforce the disposition. To finally resolve this question, more research has to be conducted over longer periods.


  9. Study examines skills transferable from World of Warcraft

    by Ashley

    From the Missouri University of Science and Technology press release:

    “Stop playing that stupid video game and get a job.”

    It’s a sentiment expressed by generations of parents since Pong began invading unsuspecting households in 1975. But what if that “stupid game” could help you get a job, and what if that same game could make you a valuable team member once you had the job?

    A new study by researchers at Missouri University of Science and Technology found that World of Warcraft (WoW) gamers who were successful working as a team in “raids” had qualities that psychological studies have shown to translate to success on virtual workplace teams.

    These qualities include what psychologists call the Big Five personality traits (extraversion, agreeableness, openness, conscientiousness and neuroticism), as well as computer-mediated communication skills and technology readiness.

    The research team came to its conclusion by surveying 288 WoW gamers from across the massively multiplayer online role-playing game’s (MMORPG) many servers. Those surveyed were diverse in age, race, sex, class, occupation and location.

    The average survey taker played WoW eight hours a week and worked 38 hours a week — important because the research team wanted survey takers that had full-time jobs that potentially involved teamwork. The survey consisted of 140 questions asking about motivation, communication skills, preferences for teamwork, and personality, with most questions relating to the Big Five personality traits.

    WoW is the world’s most-subscribed-to MMORPG, with over 10 million subscribers. After creating a character, players explore an almost limitless virtual landscape. They complete quests and fight monsters, all while interacting and working with characters controlled by other players — a key aspect to the S&T research study.

    The team surveyed 288 players of the game’s fifth expansion set, Warlords of Draenor. They compared players’ survey answers to their characters’ statistics. A player’s group achievement points indicate how much group gameplay they’ve participated in, and how successful it has been, says Elizabeth Short, a graduate student in industrial-organizational psychology who compiled data for the study. Short’s research team is led by Dr. Nathan Weidner, assistant professor of psychological science at Missouri S&T.

    “What we wanted to look at was virtual teamwork and what kind of characteristics a person had in-game that would translate to real life and the workplace,” she says.

    Short called the correlations the research team found between a gamer’s WoW group achievements and player traits small but “statistically significant.” One of the strongest correlations the team found was in terms of technology readiness.

    “The more technologically ready you are, the more resilient around technology you are, the more adaptable you are, the more achievement points you have (in WoW). You could flip that,” she says. “The more achievements you have in game, the more technology savvy you are in real life. And that’s a good thing, especially in virtual communication teams and workplaces.”

    Short says that growing up, she was naturally shy and introverted, but WoW built confidence and helped her “shed her armor.” Short’s social confidence grew, and by the time she started college, she was able to communicate better, all because of what she learned playing WoW.

    “I loved WoW and I played it constantly,” she says. “Then I started college, and being able to use some of the things I learned in WoW, like talking and communicating with people during raids, helped me socially in school.”

    Short hopes that through the study, more gamers will find that their WoW confidence can be converted into the real world and a career.

    “I like the idea that there are aspects of gaming that help and strengthen a person with skills, knowledge and abilities to be able to transfer those skills into the workplace,” she says. “If it helps students like me, I want to see if it helps people in the workplace.

    “This research shows us that those skills, while not exactly the same, they transfer,” she adds.

    Short will be presenting the research team’s findings at the 32nd annual Society for Industrial and Organizational Psychology Conference in Orlando at the end of April.


  10. Negotiations work best when both sides have matching personality traits

    April 7, 2017 by Ashley

    From the University of Georgia press release:

    Negotiations work best when both sides have matching personality traits — even if they’re both disagreeable — according to research from the University of Georgia Terry College of Business.

    Conventional wisdom would suggest that people who are outgoing and accommodating are better suited to negotiate, but a study co-authored by assistant professor of management Fadel Matta found two sides can reach accord through their common discord.

    “Normally, you would consider agreeableness — that you’re cooperative and kind — to be a good thing. And being disagreeable — being cold — to be a bad thing,” Matta said.

    “But with negotiations we find that’s not necessarily true. The same thing goes for someone who is extroverted. It’s not always a good thing when you’re entering negotiations.”

    At their core, negotiations are about a relationship. And like a relationship, they work best when both parties approach it the same way, he said. “If you’re a jerk and I’m a jerk, then it might seem like we’ll never get anywhere in negotiations, but it’s actually more useful to put two similarly minded people together,” Matta said.

    Matta and his co-authors based their research on the “Big Five” personality traits from psychological literature — conscientiousness, agreeableness, neuroticism, openness and extroversion. The study in the Journal of Applied Psychology focused on agreeableness and extroversion because of their interpersonal nature.

    “A lot of the research on personality shows that it has less of an effect than you would expect in negotiations, but that research has looked only at an individual’s personality,” Matta said. “We decided to look at the combination of personalities between two negotiators.”

    The authors surveyed more than 200 individuals about their personalities, then randomly assigned them roles in a mock negotiation between two companies. After reading background information, participants negotiated against each other to arrive at a settlement on seven issues related to human resource management and compensation. After a settlement was reached, participants were surveyed about their perceptions of the process and their partner.

    They found that while one person’s personality could not predict outcomes, the combination of both personalities led to consistent results. Negotiations between individuals with similar scores on agreeableness and extroversion tended to go more smoothly, finish more quickly and leave both parties with better impressions of the other than negotiations between dissimilar individuals, Matta said. Researchers attributed the results to more positive emotional displays, which occur when both negotiators have similar personalities.

    “The takeaway when entering negotiations is to consider both parties’ personalities and how they might mesh, instead of just deciding to send in a really well-liked and agreeable person,” Matta said. “It’s the combination of the two people that will determine how well the negotiations proceed.”

    The study was published in the Journal of Applied Psychology.