In first look at Facebook data, researchers to track patterns of fake news

A Cornell researcher is collaborating on an unprecedented study examining Facebook data to look for patterns in “problematic sharing” – posting links to stories that have already been flagged or proven false – to determine whether this activity spikes around episodes such as elections or terrorist attacks.

The analysis will shed light on the spread of fake news around the web – a phenomenon that isn’t yet fully understood, in part because researchers outside Facebook have had very limited access to its information.

“In terms of the total amount of false news being shared in America on any given day, we actually don’t really know,” said Drew Margolin, assistant professor of communication and a member of the research team. “Facebook has an enormous user base, so just understanding those patterns is our goal.”


The project is among 12 recipients of the inaugural Social Media and Democracy Research Grants, a partnership between the Social Science Research Council and Social Science One, a new organization enabling academics to analyze data held by private companies.

The team will focus on when problematic links are shared – times of day, days of the week and seasons, for instance – and compare those patterns with sensitive periods, like the weeks leading up to an election. Better understanding these rhythms could give policymakers useful information as they try to limit the spread of misinformation.

For example, if the study shows that more people link to fake news following a mass shooting or terrorist attack, stricter regulations could be put in place after such events, with lighter regulations otherwise, Margolin said. Similar time-sensitive rules exist in the financial industry, where regulations tighten during certain periods to prevent problems like bank runs.

“There has been a lot of discussion about policy interventions, especially for the large social media companies like Facebook, but regulation without knowledge of the underlying dynamics is really problematic,” Margolin said. “It’s really unclear what kinds of interventions will create a benefit, so we need studies like ours to understand the baseline dynamics of this kind of sharing.”

The information will be aggregated to protect user privacy, but researchers will be able to see which groups of people shared certain articles, broken down by characteristics like age and gender. Because Facebook guesses its users’ political affiliations, the researchers will be able to explore whether certain articles were mostly shared by people on the extreme right or left. They’ll also know whether links were shared without being opened, known as “careless sharing.”

In addition to informing future policy, the yearlong study will yield new information about political partisanship among the two-thirds of Americans on Facebook.

“It gives us the opportunity to see [partisanship] in a realistic form from a social science point of view with a diversity of participants – young and old, rich and poor – and all in their natural environments,” he said. “Behavioral dynamics on Facebook are de facto important, because of the size and scope of its reach.”

The project will be led by researchers at Ohio State University, with collaborators at Stony Brook University and the University of Michigan, as well as Cornell. It’s part of the first round of projects funded by Social Science One, an independent group that solicits and peer-reviews academics’ proposals to analyze privately held data.

“These companies – not just Facebook – are concerned about their proprietary data but they can’t really regulate themselves, so they need to share information,” Margolin said. “This could be a good model through which we can have informed policy, without them having to give away what they see as proprietary.”
