Coors lecture highlights content moderation issues

The kinds of speech that should, and should not, be allowed on social media platforms – and who should make such distinctions – were discussed by a journalist and a law professor during the final installment of Civil Discourse: The Peter ’69 and Marilyn ’69 Coors Conversation Series, hosted April 14 by Cornell Law School.

At the event, “Deplatforming: Does Big Tech Protect or Prevent Public Discourse?” Columbia Law School professor Jamal Greene pointed out that the sheer scale of social networks such as Facebook, Instagram and Twitter makes content moderation difficult.

“Why treat a social media platform any differently than you treat a newspaper?” Greene asked. “Because of the scale of social media. Instead of having, in any given issue of the [Washington] Post, a few hundred bylines or pieces of content, we’re talking about billions of pieces of content every single day.”

While many people get news from these platforms, some argue that the unreliability of that information makes social media more of a detriment to public discourse than an aid. Critics say the platforms can be used to spread misinformation and disinformation – with huge consequences for things such as a democratic election or a pandemic.

Washington Post columnist Megan McArdle, the event’s other guest, said it is widely known in her profession that the most effective way to drive engagement on a post is to make people angry. Because angry posts draw the most engagement, algorithms boost them and place them in front of users most often.

“A research psychologist of my acquaintance says it reminds him of addicts,” McArdle said. “I’m just going out, and I’m looking for something that’s going to make me mad. Because when I’m mad, I’m not sad, I’m not worried, I’m not anxious. I’m just angry. So temporarily, it feels like it solves your problem, but much the same way that heroin temporarily feels like it solves your problems, it actually makes everything worse in the long run.”

The lack of agreement around how to moderate speech on social media apps has created mistrust on both sides of the political spectrum, the speakers said. The left gets upset when companies don’t crack down on harmful posts, such as harassment, bullying and hate speech, and the right accuses them of censorship when they do.

Some companies have resorted to deplatforming – blocking, deleting or deactivating users for their content. Twitter deplatformed former President Donald Trump in the wake of the 2021 storming of the U.S. Capitol, but deplatforming can also happen to private citizens.

McArdle said a cultural solution would be more effective than letting social media networks or the legal system impose rules on content. She suggested companies create institutional, private social media policies for their employees.

“You can’t do this and work here,” McArdle said of a potential workplace policy. “Because it doesn’t matter that your Twitter bio says that your opinions don’t represent ours. If you endorse genocide, even as a joke, I’m going to be asked why I employ someone who thinks genocide is funny.”

She noted that historically, innovations in communication technology have caused turmoil for society. From the printing press to the telegraph, radio and television, new technology can breed distrust and force people to engage with those who hold different opinions.

“What we have to do now is evolve the norms and the institutions that are going to let us navigate this and take ourselves, ultimately, to a place where once again, society has processed the change, and is able to live together with a certain amount of trust,” McArdle said. “We’re not there yet, but I believe, because I believe in people.”

Greene is part of a different kind of potential solution; he serves as a co-chair of Facebook’s oversight board, an independent entity that adjudicates the company’s content moderation decisions.

“I do think the broader question of ‘How do we structure institutions that are deserving of trust, that are accountable, but are nonetheless independent of either profit or partisan motive, in making important decisions about public discourse?’ is a vital question,” Greene said.

Rick Geddes, professor of policy analysis and management, and founding director of the Cornell Program in Infrastructure Policy, moderated the conversation. The Peter ’69 and Marilyn ’69 Coors Conversation Series brings in high-profile guests with a range of political viewpoints to foster greater understanding of important topics.
