Research Finds Peer Pressure Drives the Spread of “Fake News”

Professor Hemant Kakkar studied the psychological and group pressures behind the sharing of misinformation on social media

March 9, 2023
Behavioral Science

It’s no secret that people crave a sense of belonging, and that peer pressure can drive them to act in ways that help them fit in with a group. Hemant Kakkar, an assistant professor of management and organizations at Duke University’s Fuqua School of Business, wanted to understand whether that same psychology applies to the spread of misinformation on social media, commonly known as “fake news.”

Kakkar found that in groups that share “fake news” on social platforms, the pressure to conform is even stronger, because members who don’t share the misinformation receive social punishment. His findings are published in a new paper, “Tribalism and Tribulations,” in the Journal of Experimental Psychology: General.

This is the latest in a series of findings from Kakkar about the root causes of the spread of “fake news”—previous research explored the personality types most involved in sharing misinformation.

In the latest study, Kakkar worked with Asher Lawson, who earned his PhD from Fuqua and is now a professor at INSEAD, and former research assistant Shikhar Anand, now at McKinsey. The researchers wondered whether the need to belong makes people willing to promote news shared by their group, regardless of their opinion of the sources’ validity, in order to avoid being ostracized and marginalized.

Kakkar and colleagues studied data from about 13,000 U.S. Twitter users who had shared “fake news” in 2020. In that year, social platforms had come under intense scrutiny, Kakkar said, because of the confluence of “fake news” about COVID, misinformation about the U.S. elections, and social protests following the killing of George Floyd.

The researchers analyzed the Twitter interactions within groups that had shared news stories from websites that fact-checking sources deemed “fake news,” recording users’ activity at two different points in time. Kakkar said the data showed that members who hadn’t shared the “fake news” posted by the group received less interaction from their social connections than those who had shared it. In short, not sharing “fake news” was socially costly for members who behaved differently from the rest of the group.
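As a rough illustration of the kind of comparison described above, here is a toy sketch. It is not the authors’ actual analysis pipeline; the data, column names, and two-wave setup are hypothetical stand-ins for the engagement records the study examined.

```python
# Toy illustration (hypothetical data): for each group member, compare
# the engagement they receive at two time points depending on whether
# they shared the group's "fake news" story.
import pandas as pd

users = pd.DataFrame({
    "user_id":       [1, 2, 3, 4, 5, 6],
    "shared_story":  [True, True, False, False, True, False],
    "engagement_t1": [40, 35, 38, 42, 30, 36],   # interactions received at wave 1
    "engagement_t2": [44, 39, 25, 28, 33, 24],   # interactions received at wave 2
})

# Change in engagement between the two waves for each user.
users["delta"] = users["engagement_t2"] - users["engagement_t1"]

# Average change, split by whether the user shared the story.
# A more negative delta for non-sharers would be consistent with the
# "social cost" of not sharing that the study reports.
print(users.groupby("shared_story")["delta"].mean())
```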

In a follow-up experiment, the researchers surveyed a sample of almost 1,000 people reflecting the demographics of the U.S. They asked participants to pick which of six “fake news” headlines, sourced from the fact-checking website snopes.com, they would be likely to share. Then the researchers asked which friend participants would be more likely to interact with, based on the content those friends had shared. Kakkar says this experiment also showed evidence of the social costs of not sharing “fake news.”

“The big surprise from both studies was that we found these effects to be stronger with ‘fake news’ than for real news,” Kakkar said. “Which basically shows that these ‘fake news’ groups are much tighter. People in these groups want others to behave the same way as they do.”

Kakkar believes only a minority of the population spreads misinformation. “The general consensus is that a small minority of people on Twitter are responsible for most of the ‘fake news’ shared on social media,” Kakkar said. “But the consequences are felt by everyone.”

Kakkar says the corrective measures adopted by some platforms, like Twitter’s fact-checking warnings or bans on certain users, have been successful to an extent, but they haven’t eradicated the problem.

“Tech companies could do more,” Kakkar said. “For example, they could come up with some kind of genuine score, a publicly visible metric that indicates how many fake news stories a person has shared over time. If people see someone with a very low score, implying a history of sharing fake stories, maybe they don’t want to share this person’s content, even though this person is part of their group, because now that genuine score is visible. When something is public and visible, you are more likely to think it is wrong to share it.”
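To make the idea concrete, here is a minimal sketch of what such a metric might look like. The 0–100 scale, field names, and scoring rule are all assumptions of this illustration; the quote only calls for a publicly visible score tied to a user’s history of sharing flagged stories.

```python
# Hypothetical "genuine score": fewer flagged shares -> higher score.
from dataclasses import dataclass

@dataclass
class SharingHistory:
    total_shares: int    # all stories the user has shared
    flagged_shares: int  # shares later flagged by fact-checkers

def genuine_score(history: SharingHistory) -> float:
    """Return a 0-100 score based on the fraction of flagged shares."""
    if history.total_shares == 0:
        return 100.0  # no sharing history, so no evidence of misinformation
    flagged_fraction = history.flagged_shares / history.total_shares
    return round(100.0 * (1.0 - flagged_fraction), 1)

# Example: a user whose shares are mostly flagged gets a low, publicly
# visible score that others can weigh before resharing their content.
print(genuine_score(SharingHistory(total_shares=50, flagged_shares=35)))  # 30.0
```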

Kakkar also thinks that governments have a role to play in regulating tech companies. “But at the end of the day, they are private companies,” Kakkar said. “Governments can provide guidelines, but it is a delicate balance to avoid interfering with their freedom.”

Moving beyond politics, Kakkar says misinformation can especially harm populations that are already vulnerable.

“I'm talking about families, for example, in rural areas of India, where farmers and artisans have to sell their produce or handicrafts, and they are misinformed about the price they can charge,” Kakkar said. “What interventions can we devise to educate them and help them make better decisions?”

 

This story may not be republished without permission from Duke University's Fuqua School of Business. Please contact media-relations@fuqua.duke.edu for additional information.

Contact Info

For more information contact our media relations team at media-relations@fuqua.duke.edu.