Speaker: Content moderation is free speech, not censorship
By Patricia Waldron
Kate Starbird wouldn’t repeat the insulting comments she and her colleagues have received online.
Her research tracking the spread of rumors and misinformation across the internet has made her a target of exactly the kinds of false claims she documents. “It’s really meta to now be the focus of the phenomenon I study,” Starbird said.
In her decade researching this topic, Starbird, an associate professor in the Department of Human Centered Design & Engineering at the University of Washington, has witnessed the spread of unintentional misinformation and the growth of deceptive, organized disinformation campaigns that have metastasized throughout social media platforms. She notes that disinformation is so pervasive because it takes advantage of our commitment to free speech, and is perpetuated both by individuals who benefit from the false claims and their unwitting followers.
Starbird shared her views April 26 in Gates Hall in the final installment of the Cornell Ann S. Bowers College of Computing and Information Science’s Distinguished Speaker Series on free expression. Her talk, “Reflections on Disinformation, Democracy, and Free Expression,” was held in concert with the university’s Freedom of Expression theme year.
Starbird and her colleagues’ work is especially salient during a presidential election year.
With the rise of generative AI, there has been a proliferation of deepfakes, such as fabricated voice ads impersonating politicians, which have the potential to play a big role in the election, said Kavita Bala, dean of Cornell Bowers CIS, in her introductory remarks. There is a huge concern that the future of democracy could be derailed by these kinds of technologies, she said, so understanding the spread of information – true or false – is vital.
Starbird began her work in this area by monitoring rumors that circulated after natural disasters and crises like the Boston Marathon bombing in 2013. “We began to realize that we weren’t just looking at accidental rumors, but pervasive disinformation that was sinking into the structure of the internet,” she said.
Her early work showed that foreign and domestic agents were involved in disinformation campaigns. Russia’s Internet Research Agency, for example, had infiltrated both sides of the political discourse before the 2016 U.S. presidential election. Its actions served to erode public trust in U.S. institutions and the “shared ground” necessary for a functioning democracy, she said.
As part of the Election Integrity Partnership in 2020, Starbird studied social media posts spreading disinformation about the 2020 U.S. presidential election in an effort to combat false claims before they went viral. She found that “the Big Lie” – the idea that the presidential election was stolen from Donald Trump – was largely spread by a small group of political operators, including Trump himself. Meanwhile, everyday people were reinforcing the idea by sharing their own misconceptions of being disenfranchised.
Then in 2022, the harassment began. Starbird and her team experienced online insults and threats, lawsuits, a congressional investigation and dozens of public records requests looking for evidence of government collaboration and social media censorship. Purveyors of disinformation who benefit from deceiving people were attempting to discredit her and her work, and to rebrand content moderation, information literacy efforts and her entire area of study – all valid acts of free speech – as censorship, she said.
Content moderation is one tool, but not a long-term solution, Starbird said. Investing in local journalism, teaching media literacy and giving social media platforms better tools that make it easier to fact-check and recognize false claims are also necessary, she said.
Despite the bleakness of the current social media landscape, Starbird said she’s heartened that young researchers are getting involved in this area, even in the face of online attacks.
“They understand the stakes,” Starbird said. “They’re not going to abandon their research questions or the hope that we can innovate to create social platforms that support, rather than destroy, democratic discourse.”
Patricia Waldron is a writer for the Cornell Ann S. Bowers College of Computing and Information Science.