1. Study suggests moviegoers’ taste in movies is idiosyncratic and at odds with critics’ preferences

    May 19, 2017 by Ashley

    From the New York University press release:

    Our taste in movies is notably idiosyncratic, and not linked to the demographic traits that studios target, finds a new study on film preferences. The work also shows that moviegoers’ ratings are not necessarily in line with those of critics.

    “What we find enjoyable in movies is strikingly subjective — so much so that the industry’s targeting of film goers by broad demographic categories seems off the mark,” says Pascal Wallisch, a clinical assistant professor in New York University’s Department of Psychology and the senior author of the study, which appears in the journal Projections.

    “Critics may be adept at evaluating films, but that doesn’t mean their assessments will accurately predict how much the public will like what they see,” adds co-author Jake Whritner, a graduate of the Cinema Studies Program at NYU’s Tisch School of the Arts and currently part of the Cognitive and Data Science Lab at Rutgers University-Newark.

    Over the past century, filmmakers have sought to control viewers’ attention through different editing and shooting techniques, with the assumption that the audience will respond in the same way.

    However, while neural and attentional processing has been extensively examined, the level of agreement in the appraisal of movies among viewers has not been studied. Similarly, while past research has analyzed the relationship between reviews and box-office success as well as agreement among critics for films, none have explored agreement between critics and the general public.

    To address these questions, the researchers considered more than 200 major motion pictures, taking into account popularity, financial success, and critics’ reviews. They then surveyed over 3,000 participants, asking them to give a rating of how much they liked each of the films in the sample that they had previously seen. The researchers also asked participants to provide demographic information (e.g., age, gender) and whether they consider movie critics’ recommendations in choosing which movies to see.

    Finally, Wallisch and Whritner gathered reviews from 42 publicly accessible critics or rating sites (e.g., IMDb) for each of the films in the sample.

    The results generally showed low levels of correlation in movie preferences among study participants. However, there were some patterns. As the number of jointly seen films increased, so did the correlation of the ratings for such films — at least up to a point. When the number of ratings for a given film reached between 100 and 120, correlation grew to its highest point — but as this number continued to increase, correlation for that film’s ratings began to dip, before spiking up again at around 180 commonly seen films.
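
    The agreement analysis described above rests on correlating ratings between every pair of viewers over the films both have seen. As a minimal sketch of that idea (the viewer names and star ratings below are hypothetical, and the study's actual analysis may differ in its details):

```python
from itertools import combinations
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation of two equal-length rating lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / sqrt(vx * vy)

def pairwise_agreement(ratings, min_joint=3):
    """Mean correlation across all viewer pairs, computed only over
    films both viewers have rated (None = film not seen)."""
    corrs = []
    for a, b in combinations(ratings, 2):
        joint = [(x, y) for x, y in zip(ratings[a], ratings[b])
                 if x is not None and y is not None]
        if len(joint) >= min_joint:
            xs, ys = zip(*joint)
            corrs.append(pearson(list(xs), list(ys)))
    return sum(corrs) / len(corrs)

# Hypothetical 0-4 star ratings for five films by three viewers.
ratings = {
    "viewer_a": [4, 3, 0, 2, 1],
    "viewer_b": [3, 4, 1, 2, 0],
    "viewer_c": [0, 1, 4, None, 3],
}
```

    On data like this, two viewers with similar taste correlate strongly while the third anti-correlates with both, pulling the mean pairwise agreement down — the same kind of low average agreement the study reports.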

    Looking at demographics, the survey showed greater agreement in film ratings among male participants than among females — and this difference between genders was statistically significant. However, agreement among both men and women was relatively low. There was also little correlation between movie ratings and age; however, because the overall sample skewed younger, the significance of this result is limited. In general, neither gender nor age had much of an effect on inter-subjective agreement. Overall, this low inter-subjective agreement could account for the vehement disagreements between people over whether a given movie was good: on average, one could expect a 1.25-star disagreement on a scale from 0 to 4 stars.

    Turning to correlations with movie critics, the connection between the ratings of critics and any given participant was no better than the average correlation between participants. Even a critic as well regarded as the late Roger Ebert did no better in predicting how well someone would like a movie than a randomly picked participant in the sample. In contrast, critics agreed with each other relatively strongly. In fact, the best predictor of a critic’s response to a film was that of other critics, while the best predictor of a non-critic’s response was the aggregated evaluations of other non-critics, such as those on the Internet Movie Database (IMDb), but not the aggregated ratings of critics, such as those on Rotten Tomatoes. So it is not the aggregation of ratings per se that improves predictability, but the aggregation of non-critic ratings.

    “Something about being a critic seems to make the recommendations of critics unsuitable for predicting the movie taste of regular people,” the authors conclude. “This study is the first to quantify this in an adequately powered fashion, and it helps to explain why people often perceive critics to be out of touch.

    “There are some people in our sample who are ‘superpredictors’ — they perform as well as the best aggregated non-critic ratings when it comes to predicting how much the average non-critic will like a movie. Short of these exceptional predictors, if someone seeks recommendations about what to see, their best bet is to either consult sites that aggregate individual judgements, or to find other individuals or critics with similar tastes.”


  2. Study looks at how personalities change under the influence of alcohol

    by Ashley

    From the Association for Psychological Science press release:

    People typically report substantive changes to their personality when they become intoxicated, but observations from outsiders suggest less drastic differences between “sober” and “drunk” personalities, according to research published in Clinical Psychological Science, a journal of the Association for Psychological Science.

    “We were surprised to find such a discrepancy between drinkers’ perceptions of their own alcohol-induced personalities and how observers perceived them,” says psychological scientist Rachel Winograd of the University of Missouri, St. Louis — Missouri Institute of Mental Health. “Participants reported experiencing differences in all factors of the Five Factor Model of personality, but extraversion was the only factor robustly perceived to be different across participants in alcohol and sober conditions.”

    Winograd and colleagues speculate that this discrepancy may come down to inherent differences in point of view:

    “We believe the participants and raters were both accurate and inaccurate — the raters reliably reported what was visible to them, and the participants experienced internal changes that were real to them but imperceptible to observers,” she explains.

    The idea that we transform into different people when we’re under the influence is a popular one. And systematic differences in an individual’s sober behavior and their drunken behaviors can even inform clinical determinations about whether someone has a drinking problem. But the science on “drunk personality” as a concept is less clear. In Winograd’s previous studies, participants reliably reported that their personality changes when they imbibe, but experimental evidence for this kind of global change was lacking.

    Winograd and colleagues decided to bring the question into the lab, where they could carefully calibrate alcohol consumption and closely monitor individual behavior. They recruited 156 participants, who completed an initial survey gauging their typical alcohol consumption and their perceptions of their own “typical sober” personality and “typical drunk” personality.

    Later, the participants came to the lab in friend groups of 3 or 4, where the researchers administered a baseline breathalyzer test and measured the participants’ height and weight. Over the course of about 15 minutes, each participant consumed beverages — some drank Sprite, while others consumed individually-tailored vodka and Sprite cocktails designed to produce a blood alcohol content of about .09.

    After a 15-minute absorption period, the friends worked through a series of fun group activities — including discussion questions and logic puzzles — intended to elicit a variety of personality traits and behaviors.

    The participants completed personality measures at two points during the lab session. And outside observers used video recordings to complete standardized assessments of each individual’s personality traits.

    As expected, participants’ ratings indicated change in all five of the major personality factors. After drinking, participants reported lower levels of conscientiousness, openness to experience, and agreeableness, and they reported higher levels of extraversion and emotional stability (the inverse of neuroticism).

    The observers, on the other hand, noted fewer differences across the sober and intoxicated participants’ personality traits. In fact, observer ratings indicated reliable differences in only one personality factor: extraversion. Specifically, participants who had consumed alcohol were rated higher on three facets of extraversion: gregariousness, assertiveness, and levels of activity.

    Given that extraversion is the most outwardly visible personality factor, it makes sense that both parties noted differences in this trait, the researchers argue.

    They acknowledge, however, that they cannot rule out other influences — such as participants’ own expectations of their drunk personality — that may have contributed to the discrepancy in ratings.

    “Of course, we also would love to see these findings replicated outside of the lab — in bars, at parties, and in homes where people actually do their drinking,” says Winograd.

    “Most importantly, we need to see how this work is most relevant in the clinical realm and can be effectively included in interventions to help reduce any negative impact of alcohol on people’s lives,” she concludes.


  3. Study looks at motivations behind participation in extreme sports

    May 16, 2017 by Ashley

    From the Queensland University of Technology press release:

    A new study debunks the myth that extreme sportsmen and women are adrenalin junkies with a death wish.

    The research has been published in the latest edition of Psychology of Consciousness: Theory, Research and Practice by QUT Adjunct Professor Eric Brymer, who is currently based at Leeds Beckett University in the UK, and QUT Professor Robert Schweitzer.

    Professors Brymer and Schweitzer said extreme sports were leisure activities in which a mismanaged mistake or accident could result in death, such as BASE jumping, big wave surfing and solo rope free climbing.

    “Extreme sports have developed into a worldwide phenomenon and we are witnessing an unprecedented interest in and engagement with these activities,” Professor Brymer said.

    “While participant numbers in many traditional team and individual sports such as golf, basketball and racket sports seem to have declined over the past decade, participant numbers in extreme sports have surged, making it a multi-million dollar industry.”

    Professor Brymer said until now there had been a gross misunderstanding of what motivates people to take part in extreme sports, with many writing it off as an activity for adrenalin junkies.

    “Our research has shown people who engage in extreme sports are anything but irresponsible risk-takers with a death wish. They are highly trained individuals with a deep knowledge of themselves, the activity and the environment who do it to have an experience that is life enhancing and life changing,” he said.

    “The experience is very hard to describe in the same way that love is hard to describe. It makes the participant feel very alive where all senses seem to be working better than in everyday life, as if the participant is transcending everyday ways of being and glimpsing their own potential.

    “For example, BASE jumpers talk about being able to see all the colours and nooks and crannies of the rock as they zoom past at 300km/h, or extreme climbers feel like they are floating and dancing with the rock. People talk about time slowing down and merging with nature.”

    Professor Schweitzer said understanding motivations for extreme sports was important to understanding humans.

    “Far from the traditional risk-focused assumptions, extreme sports participation facilitates more positive psychological experiences and expresses human values such as humility, harmony, creativity, spirituality and a vital sense of self that enriches everyday life,” Professor Schweitzer said.

    He said because extreme sports participants found it hard to put their experiences into words, the research project had taken a new approach to understanding the data.

    “So rather than a theory based approach which may make judgements that don’t reflect the lived experience of extreme sports participants, we took a phenomenological approach to ensure we went in with an open mind,” he said.

    “This allowed us to focus on the lived-experience of extreme sport with the goal of explaining themes that are consistent with participants’ experience.

    “By doing this we were able to, for the first time, conceptualise such experiences as potentially representing endeavours at the extreme end of human agency; that is, making choices to engage in an activity which may in certain circumstances lead to death.

    “However, such experiences have been shown to be affirmative of life and the potential for transformation.

    “Extreme sport has the potential to induce non-ordinary states of consciousness that are at once powerful and meaningful.

    “These experiences enrich the lives of participants and provide a further glimpse into what it means to be human.”


  4. Personality factors are best defense against losing your job to a robot

    May 15, 2017 by Ashley

    From the University of Houston press release:

    Worried robots will take your job? Researchers say people who are more intelligent and who showed an interest in the arts and sciences during high school are less likely to fall victim to automation.

    Later educational attainment mattered, but researchers said the findings highlight the importance of personality traits, intelligence and vocational interests in determining how well people fare in a changing labor market. The work was published this week in the European Journal of Personality.

    “Robots can’t perform as well as humans when it comes to complex social interactions,” said Rodica Damian, assistant professor of social and personality psychology at the University of Houston and lead author of the study. “Humans also outperform machines when it comes to tasks that require creativity and a high degree of complexity that is not routine. As soon as you require flexibility, the human does better.”

    Researchers used a dataset of 346,660 people from the American Institutes for Research, which tracked a representative sample of Americans over 50 years, looking at personality traits and vocational interests in adolescence, along with intelligence and socioeconomic status. It is the first study to look at how a variety of personality and background factors predict whether a person will select jobs that are more (or less) likely to be automated in the future.

    “We found that regardless of social background, people with higher levels of intelligence, higher levels of maturity and extraversion, higher interests in arts and sciences … tended to select (or be selected) into less computerizable jobs 11 and 50 years later,” they wrote.

    In addition to Damian, the researchers included Marion Spengler of the University of Tuebingen and Brent W. Roberts of the University of Illinois at Urbana-Champaign.

    Damian said the findings suggest traditional education may not be fully equipped to address upcoming changes in the labor market, although she acknowledged the educational system has changed since the research subjects were in school in the 1960s.

    “Perhaps we should consider training personality characteristics that will help prepare people for future jobs,” she said.

    The researchers found that every 15-point increase in IQ predicted a 7 percent drop in the probability of one’s job being computerized, the equivalent of saving 10.19 million people from losing their future careers to computerization if it were extrapolated across the entire U.S. population. Similarly, an increase of one standard deviation in maturity or in scientific interests — equal to an increase of 1 point on a 5-point scale, such as moving from being indifferent to scientific activities to liking them fairly well — across the U.S. population would each be equivalent to 2.9 million people avoiding a job loss to computerization.
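
    The scale of these figures can be checked with simple arithmetic. In the sketch below, the workforce size of about 145.6 million is an assumption back-computed from the press release’s own numbers (10.19 million divided by 7 percent), not a figure reported by the study, and the roughly 2-percentage-point drop per standard deviation is likewise inferred from the 2.9 million figure:

```python
def workers_spared(workforce, prob_drop):
    """People whose jobs would no longer be computerized if each worker's
    probability of computerization fell by `prob_drop` (as a fraction)."""
    return workforce * prob_drop

# Assumed workforce size, back-computed from 10.19 million / 0.07.
US_WORKFORCE = 145_600_000

per_15_iq_points = workers_spared(US_WORKFORCE, 0.07)  # ~10.19 million
per_sd_interest = workers_spared(US_WORKFORCE, 0.02)   # ~2.9 million (inferred rate)
```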

    While IQ is not easily changed, a solution could be to find effective interventions to increase some personality traits — doing well in social interactions, for example, or being industrious — or interest in activities related to the arts and sciences, Damian said.

    Machine learning and big data will allow the number of tasks that machines can perform better than humans to increase so rapidly that merely increasing educational levels won’t be enough to keep up with job automation, she said. “The edge is in unique human skills.”

    Still, those unique skills can correlate with more education, and the researchers say an across-the-board increase in U.S. education levels could mean millions fewer jobs at risk. Targeting at-risk groups would yield significant benefits, she said.

    And while skeptics question whether the labor market will be able to absorb millions of higher skilled workers, Damian looks at it differently.

    “By preparing more people, at least more people will have a fighting chance,” she said.


  5. Study looks at how we perceive fictional characters

    May 14, 2017 by Ashley

    From the University at Buffalo press release:

    The Sopranos’ Tony Soprano and Walter White from Breaking Bad rank among recent television drama’s most notorious protagonists, each of questionable morality. So, here’s the question: Do you like them?

    If you answered “yes” as many people do, a recently published paper by a University at Buffalo researcher suggests why people might feel good about characters who do bad things.

    The research has implications ranging from better understanding our relationship to fictional narratives, to possibly improving the effectiveness of character-based public service campaigns, according to lead author Matthew Grizzard, an assistant professor in UB’s Department of Communication and an expert on the cognitive, emotional and psychobiological effects of media entertainment.

    The results, with co-authors Jialing Huang, Kaitlin Fitzgerald, Changhyun Ahn and Haoran Chu, all UB graduate students, appear in the journal Communication Research.

    Grizzard says the reasons an audience may like or dislike a character have been a key question for media researchers since the 1970s.

    Morality matters. Viewers tend to like the good guys and dislike the bad guys. But Grizzard’s study, which builds on previous research, also suggests that we don’t necessarily need to see behavior to make a distinction between the hero and the villain.

    Grizzard’s study manipulated what characters looked like and measured audience perceptions. The researchers hoped to find out whether simple differences in appearance — for example, black clothing compared to lighter colors — would be enough for viewers to categorize a character as a hero or villain.

    Earlier research, meanwhile, had found that heroes and villains differ in morality but not competence.

    “Villains aren’t just immoral. They’re good at being bad,” according to Grizzard.

    The previous research gave Grizzard’s research team a pattern for determining whether they were activating perceptions of heroes and villains or whether they were simply creating biases unrelated to narrative characters.

    “If our data had come out where the heroic-looking character was both more moral and more competent than the villain, then we probably just created a bias. But because the hero was more moral than the villain but equally competent, we’re more confident that visuals can activate perceptions of heroic and villainous characters,” says Grizzard.

    Beyond an understanding of how visuals can influence perceptions of character, the findings indicate that judgments of characters depend on comparisons audiences make between characters — and the order of introduction plays a role.

    Heroes were judged to be more heroic when they appeared after a villain and villains were judged to be more villainous when they appeared after a hero.

    “What’s happening here is that we’re not making isolated judgements about these characters using some objective standard of morality,” says Grizzard. “We’re constantly making comparisons about these characters and the forces they face.”

    With Walter White, for example, the audience sees the evolution of a character whose ethics progressively spiral downward from season to season, and yet audiences remained loyal to him. That loyalty surprised Breaking Bad creator Vince Gilligan.

    Gilligan said in an interview that at some point he thought the audience would turn against the character. But Grizzard points out that the show’s antagonists concurrently get worse along with White.

    “We see a trend where White is not decreasing by himself. There’s likely to be a constant comparison with other characters,” says Grizzard. “So White is probably benefiting from the comparisons the audience is making.”

    Grizzard says it’s a polarization effect. The hero seems better when compared to the villain’s darkness while the villain seems worse when viewed in the hero’s light.

    But being bad isn’t, well, all bad, at least in the world of fictional narrative. It comes with certain advantages, according to Grizzard.

    “We find that characters who are perceived as villains get a bigger boost from good actions or apparent altruism than heroes do, like the Severus Snape character from the Harry Potter books and films.”

    These findings could be informative for public service campaigns or telenovelas that try to promote certain behaviors.

    “It shows the importance of how these characters are framed,” he says.


  6. Study suggests perceived knowledge of others’ “true selves” is inherently tied to perceptions of their morality

    May 8, 2017 by Ashley

    From the Texas A&M University press release:

    How can you ever really know someone? Researchers at Texas A&M University found our perceived knowledge of others’ “true selves” is inherently tied to perceptions of their morality.

    “We found evidence consistent with a strong psychological link between morality and identity,” says Andrew Christy, a researcher in Texas A&M’s Department of Psychology who specializes in social and personality psychology.

    For “The Reciprocal Relationship Between Perceptions of Moral Goodness and Knowledge of Others’ True Selves,” published in the journal Social Psychological and Personality Science, Christy and his co-authors hypothesized that people would report knowing the most about others who were portrayed as morally good. A secondary hypothesis was that this effect would work both ways, such that others who were described as being easily knowable would also be perceived as more moral than those described as being unknowable.

    The researchers presented participants with texts describing a person’s morality and competency, then asked them to rate how well they thought they knew the person’s “true self.” In another study, participants read a passage in which the author describes their roommate as either being readily knowable or almost completely unknowable. Afterwards, they answered questions about their impressions of the target, including how moral vs. immoral they perceived the target to be.

    The researchers found a strong psychological link between morality and identity. “We found that participants reported knowing the most about targets when they were described as moral, compared to targets described in different ways (e.g. competent),” explains Christy. “Our results also suggested that this is a bidirectional relationship — participants perceived the knowable target as more moral than the unknowable target.”

    Christy says these findings show that most people operate on the default assumption that others are fundamentally good. “Thus when someone is kind to us or otherwise provides evidence of goodness, this effectively confirms our pre-existing assumptions about them and leads us to feel that we really do know them.”

    So when it comes to friendships and other relationships, he concludes, “people will feel most familiar with others who make morally good first impressions.”


  7. Study looks at link between testosterone and impulsivity

    May 5, 2017 by Ashley

    From the California Institute of Technology press release:

    Hotheaded, impulsive men who shoot first and ask questions later are a staple of Westerns and 1970s cop films, but new research shows there might be truth to the trope.

    A study conducted by researchers from Caltech, the Wharton School, Western University, and ZRT Laboratory tested the hypothesis that higher levels of testosterone increase the tendency in men to rely on their intuitive judgments and reduce cognitive reflection — a decision-making process by which a person stops to consider whether their gut reaction to something makes sense. The researchers found that men given doses of testosterone performed more poorly on a test designed to measure cognitive reflection than a group given a placebo.

    The research will appear in an upcoming issue of the journal Psychological Science.

    “What we found was the testosterone group was quicker to make snap judgments on brain teasers where your initial guess is usually wrong,” says Caltech’s Colin Camerer, the Robert Kirby Professor of Behavioral Economics and T&C Chen Center for Social and Decision Neuroscience Leadership Chair. “The testosterone is either inhibiting the process of mentally checking your work or increasing the intuitive feeling that ‘I’m definitely right.'”

    The study, which is one of the largest of its type ever conducted, included 243 males who were randomly selected to receive a dose of testosterone gel or placebo gel before taking a cognitive reflection test. A math task was also given to control for participant engagement, motivation level, and basic math skills.

    The questions included on the cognitive reflection test are exemplified by the following:

    A bat and a ball cost $1.10 in total. The bat costs $1 more than the ball.

    How much does the ball cost?

    For many people, the first answer that comes to mind is that the ball costs 10 cents, but that’s incorrect because then the bat costs only 90 cents more than the ball. The correct answer is that the ball costs 5 cents and the bat costs $1.05. An individual prone to relying on their gut instincts would be more likely to accept their first answer of 10 cents. However, another person might realize their initial error through cognitive reflection and come up with the correct answer.
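
    The algebra behind the puzzle can be made explicit. A tiny sketch, generalizing the total price and the price difference:

```python
def solve_bat_and_ball(total=1.10, difference=1.00):
    """Solve the pair of equations:
        ball + bat = total
        bat = ball + difference
    Substituting the second into the first gives 2 * ball + difference = total."""
    ball = (total - difference) / 2
    return ball, ball + difference

ball, bat = solve_bat_and_ball()
# The intuitive answer of 10 cents fails the check: a $0.10 ball implies a
# $1.00 bat, which costs only $0.90 more than the ball, not $1.00 more.
```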

    Participants were not limited on time while taking the test and were offered $1 for each correct answer and an additional $2 if they answered all the questions correctly.

    The results show that the group that received testosterone scored significantly lower than the group that received the placebo, on average answering 20 percent fewer questions correctly. The testosterone group also “gave incorrect answers more quickly, and correct answers more slowly than the placebo group,” the authors write. The same effect was not seen in the results of the basic math tests administered to both groups. The results “demonstrate a clear and robust causal effect of [testosterone] on human cognition and decision-making,” they conclude.

    The researchers believe that the phenomenon they’ve observed can be linked to testosterone’s effect of increasing confidence in humans. Testosterone is thought to generally enhance the male drive for social status, and recent studies have shown that confidence enhances status.

    “We think it works through confidence enhancement. If you’re more confident, you’ll feel like you’re right and will not have enough self-doubt to correct mistakes,” Camerer says.

    Camerer says the results of the study raise questions about potential negative effects of the growing testosterone-replacement therapy industry, which is primarily aimed at reversing the decline in sex drive many middle-aged men experience.

    “If men want more testosterone to increase sex drive, are there other effects? Do these men become too mentally bold, thinking they know things they don’t?”


  8. Study suggests charisma isn’t necessary to be a good boss

    April 26, 2017 by Ashley

    From the Michigan State University press release:

    You don’t need the charisma of Steve Jobs to be an effective boss, indicates new research led by Michigan State University business scholars.

    In a series of five studies, the researchers found bosses who were supportive and set clear expectations — but weren’t necessarily transformational leaders with grand visions — were able to effectively motivate their employees.

    The study, published in the journal Organizational Behavior and Human Decision Processes, is one of the first to examine how a leader’s regulatory focus, or mindset, affects his or her own behavior and, in turn, employees’ motivation.

    The findings suggest bosses can modify their mindset to produce a certain outcome from workers, whether that’s innovation or a more conservative work focus aimed at meeting basic obligations and preventing errors, said Brent Scott, MSU professor of management and study co-author.

    “Effective leadership may be based in part on a leader’s ability to recognize when a particular mental state is needed in their employees and to adapt their own mental state and their behaviors to elicit that mindset,” Scott said. “Part of the story here is that you don’t have to be Steve Jobs to be an effective leader. There is no one-size-fits-all approach to managing.”

    The research team, led by Russell Johnson, MSU associate professor of management, conducted field studies and experiments with hundreds of managers and employees from a variety of industries including professional services, manufacturing and government.

    Bosses who had an innovative mindset (called promotion focus) were more likely to lead in a transformative way and thus elicit an innovative mindset among employees. Bosses with a more conservative mindset (called prevention focus) were more likely to “manage by exception,” which involves focusing almost exclusively on preventing mistakes, thus eliciting a prevention focus among workers.

    “We found that the motivations of managers are contagious and ‘trickle down’ to their subordinates,” Johnson said. “Thus, if managers are unhappy with how their people are approaching work tasks, the managers might actually be the ones responsible for eliciting their motivation in the first place. Managers can modify their leadership behavior to trigger the appropriate motivation orientation in their employees to fit the situation.”

    While the study identified some transformational leaders who elicited an innovative mindset among employees, Scott said it’s unrealistic to think that every boss can be — or necessarily wants to be — a transformational leader all of the time. Some work situations and environments may call for a more preventative approach.

    A sweet spot for managing may be what’s called “contingent reward behavior,” which brings in both a promotion and prevention focus — emphasizing gains and providing both positive and negative reinforcement based on performance.

    “The contingent approach is quid pro quo — if you do this, I’ll give you that,” Scott said. “It’s not sexy like transformational leadership, but it’s something that just about every manager can do because it doesn’t require you to ooze charisma.”


  9. Narcissism and social networking

    April 23, 2017 by Ashley

    From the University of Würzburg press release:

    Social media sites such as Facebook and Twitter have become an important part of the lives of many people worldwide. Around two billion users were active on Facebook at the end of 2016; 500 million regularly post photos on Instagram and more than 300 million communicate via Twitter.

    Various studies conducted over the past years have investigated to what extent the use of social media is associated with narcissistic tendencies — with contradictory results. Some studies supported a positive relationship between narcissism and the use of Facebook, Twitter and the like, whereas others found only weak or even negative effects.

    Most comprehensive meta-analysis so far

    Fresh findings are now presented by scientists from the Leibniz Institute for Educational Trajectories Bamberg and the University of Würzburg. They were able to show that there is a weak to moderate link between a certain form of narcissism and social media activity. When taking a differentiated look at specific forms of behaviour or at the participants’ cultural background, the effect is even more pronounced in some cases.

    The study was led by Professor Markus Appel, who holds the Chair of Media Communication at the University of Würzburg, and Dr. Timo Gnambs, head of the Educational Measurement section at the Leibniz Institute for Educational Trajectories, Bamberg. For their meta-analysis, the scientists summarized the results of 57 studies comprising more than 25,000 participants in total. They have now published their findings in the Journal of Personality.

    Forms of narcissism

    They think of themselves as being exceptionally talented, remarkable and successful. They love to present themselves to other people and seek approval from them: This is how psychologists describe the typical behaviour of people commonly referred to as narcissists. “Accordingly, social networks such as Facebook are believed to be an ideal platform for these people,” says Markus Appel.

    The network gives them easy access to a large audience and allows them to selectively post information for the purpose of self-promotion. Moreover, they can meticulously cultivate their image. Therefore, researchers have suspected social networking sites to be an ideal breeding ground for narcissists from early on.

    Three hypotheses

    The recently published meta-analysis shows that the situation does not seem to be as bad as feared. The scientists examined the truth behind three hypotheses. Firstly, the assumption that grandiose narcissists frequent social networking sites more often than representatives of another form of narcissism, the “vulnerable narcissists.” Vulnerable narcissism is associated with insecurity, fragile self-esteem, and social withdrawal.

    Secondly, they assumed that narcissism is much more strongly linked to the number of friends and to certain self-promoting activities than to other activities possible on social networking sites.

    Thirdly, the researchers hypothesized that the link between narcissism and the social networking behaviour is subject to cultural influences. In collectivistic cultures where the focus is on the community rather than the individual or where rigid roles prevail, social media give narcissists the opportunity to escape from prevalent constraints and present themselves in a way that would be impossible in public.

    The results

    The meta-analysis of the 57 studies did in fact confirm the scientists’ assumptions. Grandiose narcissists are encountered more frequently in social networks than vulnerable narcissists. Moreover, both the number of friends a person has and the number of photos they upload are linked to the prevalence of traits associated with narcissism. Users’ gender and age are not relevant in this respect. Typical narcissists spend more time in social networks than average users and they exhibit specific behavioural patterns.

    A mixed result was found for the influence of the cultural background on the usage behaviour. “In countries where distinct social hierarchies and unequal power division are generally more accepted such as India or Malaysia, there is a stronger correlation between narcissism and the behaviour in social media than in countries like Austria or the USA,” says Markus Appel.

    However, the analysis of the data from 16 countries on four continents does not show a comparable influence of the “individualism” factor.

    Generation Me

    So is the frequently cited “Generation Me” a product of social media such as Facebook and Instagram because they promote narcissistic tendencies? Or do these sites simply provide the ideal environment for narcissists? The two scientists were not able to answer these questions conclusively.

    “We suggest that the link between narcissism and the behaviour in social media follows the pattern of a self-reinforcing spiral,” Markus Appel says. An individual disposition drives the social media activities, and these activities in turn reinforce the disposition. To resolve this question conclusively, more research over longer periods is needed.


  10. Study examines skills transferable from World of Warcraft

    by Ashley

    From the Missouri University of Science and Technology press release:

    “Stop playing that stupid video game and get a job.”

    It’s a sentiment expressed by generations of parents since Pong began invading unsuspecting households in 1975. But what if that “stupid game” could help you get a job, and what if that same game could make you a valuable team member once you had the job?

    A new study by researchers at Missouri University of Science and Technology found that World of Warcraft (WoW) gamers who were successful working as a team in “raids” had qualities that psychological studies have shown to translate to success on virtual workplace teams.

    These qualities include what psychologists call the Big Five personality traits (extraversion, agreeableness, openness, conscientiousness and neuroticism), as well as computer-mediated communication skills and technology readiness.

    The research team came to its conclusion by surveying 288 WoW gamers from across the massively multiplayer online role-playing game’s (MMORPG) many servers. Those surveyed were diverse in age, race, sex, class, occupation and location.

    The average survey taker played WoW eight hours a week and worked 38 hours a week — important because the research team wanted survey takers that had full-time jobs that potentially involved teamwork. The survey consisted of 140 questions asking about motivation, communication skills, preferences for teamwork, and personality, with most questions relating to the Big Five personality traits.

    WoW is the world’s most-subscribed-to MMORPG, with over 10 million subscribers. After creating a character, players explore an almost limitless virtual landscape. They complete quests and fight monsters, all while interacting and working with characters controlled by other players — a key aspect of the S&T research study.

    The team surveyed 288 players of the game’s fifth expansion set, Warlords of Draenor. They compared players’ survey answers to their characters’ statistics. A player’s group achievement points indicate how much group gameplay they’ve participated in, and how successful it has been, says Elizabeth Short, a graduate student in industrial-organizational psychology who compiled data for the study. Short’s research team is led by Dr. Nathan Weidner, assistant professor of psychological science at Missouri S&T.

    “What we wanted to look at was virtual teamwork and what kind of characteristics a person had in-game that would translate to real life and the workplace,” she says.

    Short called the correlations the research team found between a gamer’s WoW group achievements and player traits small but “statistically significant.” One of the strongest correlations the team found was in terms of technology readiness.
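What a “small but statistically significant” correlation means is easy to show with numbers. The sketch below uses synthetic data — not the study’s — with the study’s sample size of 288, and the variable names (a trait score and achievement points) are purely illustrative:

```python
import math
import random

# Synthetic illustration only: a modest linear relationship between a
# trait score (e.g. technology readiness) and in-game achievement,
# with the same n = 288 as the survey.
random.seed(0)
n = 288
trait = [random.gauss(0, 1) for _ in range(n)]
# achievement overlaps only modestly with the trait (mostly noise)
achievement = [0.3 * x + random.gauss(0, 1) for x in trait]

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson_r(trait, achievement)
# With n = 288, any |r| above roughly 1.96 / sqrt(n - 2) ≈ 0.116
# is significant at the conventional p < .05 level.
critical = 1.96 / math.sqrt(n - 2)
print(f"r = {r:.2f}, significant: {abs(r) > critical}")
```

The point of the sketch: with a sample this large, even a correlation that explains only a few percent of the variance clears the significance threshold — which is exactly the pattern Short describes.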

    “The more technologically ready you are, the more resilient around technology you are, the more adaptable you are, the more achievement points you have (in WoW). You could flip that,” she says. “The more achievements you have in game, the more technology savvy you are in real life. And that’s a good thing, especially in virtual communication teams and workplaces.”

    Short says that growing up, she was naturally shy and introverted, but WoW built confidence and helped her “shed her armor.” Short’s social confidence grew, and by the time she started college, she was able to communicate better, all because of what she learned playing WoW.

    “I loved WoW and I played it constantly,” she says. “Then I started college, and being able to use some of the things I learned in WoW, like talking and communicating with people during raids, helped me socially in school.”

    Short hopes that through the study, more gamers will find that their WoW confidence can carry over into the real world and a career.

    “I like the idea that there are aspects of gaming that help and strengthen a person with skills, knowledge and abilities to be able to transfer those skills into the workplace,” she says. “If it helps students like me, I want to see if it helps people in the workplace.

    “This research shows us that those skills, while not exactly the same, they transfer,” she adds.

    Short will be presenting the research team’s findings at the 32nd annual Society for Industrial and Organizational Psychology Conference in Orlando at the end of April.