Tip Sheets

Experts split on ‘prebunking’ – shifting blame or empowering users?

Media Contact

Becka Bowyer

Google is expanding its campaign to fight misinformation through an approach it calls ‘prebunking’, which uses a series of short videos to teach people how to spot false claims before they encounter them. The following Cornell University experts are available to discuss this strategy.


Jeff Niederdeppe

Professor, Department of Communication

Jeff Niederdeppe, professor of communication and Senior Associate Dean of Faculty Development at the Jeb E. Brooks School of Public Policy, studies the mechanisms and effects of mass media campaigns, strategic messages and news coverage in shaping social policy. He says prebunking shifts blame and responsibility for stopping the spread of misinformation.

Niederdeppe says:

“It is easy to see why a company like Google would want to implement such a campaign – there is evidence that it is possible to inoculate at least some people against the corrosive effects of misinformation, and it is far more efficient, and likely effective, to prevent the spread of misinformation than it is to try to identify, track, and respond to it in real time, given what we know about how fast and unpredictably it can spread.

“At the same time, the use of such a strategy shifts blame and responsibility for stopping the spread of misinformation to people themselves, absolving Google of the responsibility of stopping ill-meaning forces using their platform(s) to post, share, and amplify misinformation in the first place.

“If they are serious about this strategy, they will need to spend considerable resources to evaluate its success and refine their prebunking messaging and campaign as it unfolds. Time will tell if this is a good-faith effort to reduce the spread of misinformation or something designed to convince citizens and policymakers that they are part of the solution and not the problem.”

Drew Margolin

Associate Professor, Department of Communication

Drew Margolin, associate professor of communication, studies the way people communicate online and the role of accountability, credibility, and legitimacy within social networks. He says the strategy empowers users through transparency.

Margolin says:

“This campaign makes good sense as part of a broader set of measures. It empowers users through transparency, because it gives users access to the patterns of misinformation that the companies see and already use in their own judgments. This not only educates the public about misinformation they might see, but it also helps them evaluate other things companies might do, like flag particular posts.

“The truth is that there are two kinds of experts in the struggle over spreading and stopping misinformation – the misinformers, who try to figure out what will spread, and the platforms, who try to stop it. Individuals don't see enough cases, or know the truth of them well enough, to make a judgment. With this program, companies are giving them a cheat sheet based on what they see in the much larger datasets they have access to.”

Cornell University has television, ISDN and dedicated Skype/Google+ Hangout studios available for media interviews.