Tip Sheets

In limiting political content, Facebook risks advancing censorship narrative

Media Contact

Jeff Tyson

Facebook announced on Wednesday that it will begin implementing changes to its algorithm to reduce political content on its users’ news feeds. The social media giant will be testing its new algorithm this week on users in Canada, Brazil and Indonesia and will expand it to the United States in the coming weeks.

Sarah Kreps

Professor of Government

Sarah Kreps, a professor of government, studies technology, international politics and national security, and is the author of the book “Social Media and International Relations.” Kreps says that by reducing political content on news feeds, Facebook risks sowing more discord across the political spectrum by playing into a harmful narrative that the tech company engages in censorship.

Kreps says:

“Facebook’s decision to tweak its algorithm to depoliticize news feeds has the possible virtue of turning down the thermostat in the political landscape if people are less exposed to vitriolic political content. But it also risks being counterproductive. If Facebook looks like it’s manipulating the conversation by reducing the visibility of conservative voices, which have been among Facebook’s most engaged pages in recent months, it will simply play into the narrative of censorship that has helped aggrieved groups recruit new adherents, albeit possibly on alternative platforms.

“The consequence won’t be more harmony but the opposite: more discord across the political spectrum. Whether the new algorithm actually alters a group’s visibility will be impossible to corroborate, but groups will be able to make these claims, and Facebook will again find itself playing an unwitting, or perhaps witting, part in the political debate.”

J. Nathan Matias

Assistant Professor of Communication

Nathan Matias, an assistant professor of communication, studies algorithms and the role of digital technologies in advancing or hindering the public interest. Matias says Facebook has an obligation to be transparent with the public about algorithm changes and the way it shares content, and that the social network should invite independent evaluation of these changes.

Matias says:

“By adjusting its algorithms, Facebook has tremendous power to shape the attention, beliefs, and behavior of people around the world. Democracies should expect this kind of power to be guided by evidence and accountable to the public. The last time Facebook reduced political content in its algorithm, it deliberately and secretly reduced the influence of some publishers more than others, based on political leaning.

“By telling the public about upcoming tests to reduce political content, Facebook is taking a partial, insufficient step toward transparency. The company should invite independent evaluation and provide full transparency into the impact of such wide-reaching changes. If it refuses, lawmakers in affected countries should require independent evaluation by law.”

Cornell University has television, ISDN and dedicated Skype/Google+ Hangout studios available for media interviews.