New Cornell research offers hope for those seeking to quash the fake news and false rumors that reverberate around the internet.
When Twitter users tweet a false rumor, they are more than twice as likely to accept correction if it comes from a mutual follower – someone they follow who also follows them – compared with when they are corrected by someone with whom they have no Twitter relationship, according to a study published Sept. 5 in Political Communication.
“Basically, people don’t want to look foolish in front of their friends, but are less concerned with what strangers think,” said lead author Drew Margolin, assistant professor of communication and the Geri Gay Faculty Fellow.
“Our argument in this paper is that the commitment to the truth begins at the social level. We are social beings first,” Margolin said. “We care about our friends, family – our social group – and its interests at least as much, and often more, than we care about whether something is true or false.”
And the effect of friendship appears to be stronger than the effect of politics. Twitter users in the study were less likely to accept corrections about political rumors than corrections about nonpolitical topics. But if the correction – even on a political topic – came from a mutual follower, it had a stronger effect, making acceptance more likely.
“In a highly competitive political environment, warnings from friends can signal that spreading a rumor may hurt a shared cause,” Margolin said. “By contrast, admonishments from strangers may be taken as evidence that the misinformation is providing a strategic advantage.”
The team studied Twitter users who corrected or “snoped” each other – using fact-checking sites like Snopes.com – during the U.S. presidential campaigns in 2012 and 2016. The researchers began by selecting all tweets posted during those periods that formed “triplets” – three-part conversations. In these triplets, a Twitter user made a claim; another user replied, citing one of three fact-checking website domains (Snopes.com, FactCheck.org, PolitiFact.com); and the original user replied in turn. The researchers analyzed these tweets to determine whether the conversations involved mutual followers or strangers, and whether the original user accepted or rejected the correction.
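The triplet-selection step described above can be sketched in a few lines of Python. This is a minimal illustration, not the study’s actual pipeline: the tweet fields (`id`, `user`, `in_reply_to`, `text`) and the `follows` mapping are hypothetical stand-ins for whatever data the researchers worked from.

```python
# Hypothetical sketch of the triplet-selection logic: find conversations where
# user A makes a claim, user B replies citing a fact-checker, and A replies back.
# Field names and data shapes are assumptions, not the study's real schema.

FACT_CHECK_DOMAINS = ("snopes.com", "factcheck.org", "politifact.com")

def cites_fact_checker(text):
    """True if a tweet's text mentions one of the fact-checking domains."""
    t = text.lower()
    return any(domain in t for domain in FACT_CHECK_DOMAINS)

def is_mutual(a, b, follows):
    """True if users a and b follow each other.

    `follows` maps each username to the set of accounts that user follows.
    """
    return b in follows.get(a, set()) and a in follows.get(b, set())

def find_triplets(tweets):
    """Yield (claim, correction, response) three-part conversations."""
    by_id = {t["id"]: t for t in tweets}
    for response in tweets:
        correction = by_id.get(response.get("in_reply_to"))
        if correction is None or not cites_fact_checker(correction["text"]):
            continue
        claim = by_id.get(correction.get("in_reply_to"))
        # The responder must be the same user who made the original claim.
        if claim is None or claim["user"] != response["user"]:
            continue
        yield claim, correction, response
```

Once triplets are extracted, each one can be tagged with `is_mutual(claim_user, corrector, follows)` and with a coding of whether the final reply accepts or rejects the correction – the two variables the study compares.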
“The social relationships that underlie our political discussions matter, and that applies even to something that seems objective,” Margolin said.
The research draws on the theories of Philip Tetlock, a well-known psychologist who suggests people have different modes of reasoning. These include the “intuitive scientist,” when one tries to be as logical and as accurate as possible. In contrast, an “intuitive politician” is much more concerned with reputation and relationships.
The default communication mode in these contexts seems to be intuitive politician, Margolin said.
“When you have a social relationship, there’s a sense of mutual concern, so you wouldn’t be doing something just to shut me up or harm me or take away my power,” he said. “Whereas if you’re a complete stranger, especially if you’re from another ideological camp, that might be exactly what you’re trying to do.”
This research suggests a new insight into the discussion of social media and filter bubbles. Most discussion to date focuses on exposure to different kinds of information. In contrast, this analysis shows that exposure to different information is often not enough to change someone’s mind. What really matters is mutual commitment between people sharing information, Margolin said.
“If we want to get others to take certain facts more seriously, we have to form personal bonds with them, rather than simply trying to find a way for an algorithm to change what flows across their screen,” Margolin said.
Margolin wrote the study, “Political Fact-Checking on Twitter: When Do Corrections Have an Effect?,” with Aniko Hannak of Northeastern University and Ingmar Weber of Hamad Bin Khalifa University. The study was funded in part by a grant from the Cornell Institute for the Social Sciences.