‘Politics, not policy’: Meta ending fact-check program

Media Contact

Becka Bowyer

Meta will stop using third-party fact-checkers on Facebook, Threads and Instagram and will instead rely on users to add notes to posts. The following Cornell University experts are available for interviews.
Alexios Mantzarlis

Director of the Security, Trust, and Safety Initiative (SETS)

Alexios Mantzarlis is director of the Security, Trust, and Safety Initiative at Cornell Tech. In a previous role as director of the International Fact-Checking Network, he played an important role in setting up the partnership between Facebook and its third-party fact-checkers.

Mantzarlis says:

“Mark Zuckerberg had eight years’ worth of data to prove his belief that Meta’s Third-Party Fact-Checking Program was biased. Instead of sharing any hard evidence, however, he chose to cosplay like Elon Musk and promise free expression for all. 

“He chose to ignore research that shows that politically asymmetric interventions against misinformation can result from politically asymmetric sharing of misinformation. He chose to ignore that a large chunk of the content fact-checkers are flagging is likely not political in nature but low-quality spammy clickbait that his platforms have commodified. He chose to ignore research that shows Community Notes users are very much motivated by partisan motives and tend to over-target their political opponents.

“The program was by no means perfect, and fact-checkers have no doubt erred in some percentage of their labels. But we should be clear that Zuckerberg’s promise of ‘get[ting] rid of fact-checkers’ was a choice of politics, not policy.

“Alongside terminating the fact-checking program, Zuckerberg also announced a more lax approach to content moderation tout court whereby Meta will not proactively seek potentially harmful content across a wide range of domains. Depending on how this is applied, the consequences of this decision will be an increase in harassment, hate speech and other harmful behavior across billion-user platforms.”

Gordon Pennycook

Associate professor of psychology in the College of Arts and Sciences

Gordon Pennycook, associate professor of psychology, studies misinformation and has investigated various interventions on social media, including accuracy prompts, fact-checking or debunking, crowdsourcing, and labeling or warnings.

Pennycook says:

“Although research supports the idea that crowdsourcing fact-checking can be effective when done correctly, it is important to understand that this is intended to supplement fact-checking from professionals – not to replace it. The extent to which layperson evaluations can be used to inform fact-checking depends entirely on the underlying quality of the information that the laypeople are being exposed to. 

“In an information ecosystem where misinformation is having a large influence, crowdsourced fact-checking will simply reflect the mistaken beliefs of the majority. I support using crowdsourced fact-checking, but removing third-party (professional) fact-checking strikes me as a major mistake.”

Cornell University has television, ISDN and dedicated Skype/Google+ Hangout studios available for media interviews.