Expert: Design information networks that support democracy

Attempting to set the time on his wife’s watch years ago, David Lazer gave up after a couple of frustrating hours and ran a Google search. 

Problem solved: Deep in an obscure online forum, years earlier, a woman from a faraway part of the world had posted the complicated sequence of button-pressing necessary to complete the task. 

“This is so routine for us that we don’t realize what a miracle it is,” said Lazer, professor of political science and of computer and information sciences at Northeastern University, while delivering the Cornell Center for Social Sciences’ Distinguished Lecture in the Social Sciences Oct. 24 in Statler Auditorium. “She had figured something out, and it spread to me. It was a kind of innovation, and that’s something that’s very special about today’s information ecosystem.” 

The innovation in this case helped Lazer, who is also co-director of the NULab for Texts, Maps, and Networks. 

But there are growing concerns that the search engines and social media networks most people now rely upon for information are also tools for spreading misinformation and “fake news” that undermine democracies. 

“When we think of democracy, we should think about the structures of information,” Lazer said. “We need to think about building these systems for democracy.” 

Knowledge is a network phenomenon, Lazer said in his talk, “Democracy, Today: Fake News, Social Networks and Algorithms,” which was co-sponsored by the departments of Communication, Computer Science and Information Science. Much of what we know comes not from direct experience, he said, but from people and sources we trust. 

But our structures for acquiring and sharing knowledge, including knowledge about the critical choices that influence elections and democracy, have changed dramatically over the past generation, Lazer said. 

Twentieth-century media offered “artisanal curation,” he said. People such as reporters and editors acted as mediators of information they deemed accurate and important. 

Changing business models have shifted power to systems operated by the “meta-mediators” through which many now access news – search engines and social network news feeds. 

The modern information ecosystem offers citizens vastly more choice, but lacks human curation, concentrates attention on a small number of sources that top search results or feeds, and is vulnerable to manipulation, Lazer said. 

One example: Google search results that failed to clearly debunk a myth that eating apricot seeds can combat cancer. (They actually can be lethal.) 

Ads prominently offered apricot seeds for sale. The top link led to a news story exposing the fallacy, but a large excerpt highlighted in the results was ambiguous. Crowd-sourced product reviews on Amazon.com and WebMD touted the seeds’ supposed benefits. Even a link to the National Institutes of Health’s website featured an article from a suspect publisher. 

“There clearly is vulnerability to our crowd-sourced systems and automated systems for information,” Lazer said. 

Despite that vulnerability, Lazer’s research into the spread of fake news on Twitter during the 2016 presidential election suggests it was less influential than some may think. 

Fake news accounted for about 5% of exposure to political news content, a “nontrivial amount,” Lazer said. But just 16 of the nearly 16,500 users analyzed – roughly 0.1% – accounted for 80% of that content’s spread. 

“Fake news isn’t really a broad, systemic problem on Twitter,” Lazer said. “It’s really a question of a very, very seedy but small neighborhood.” 

Lazer is more concerned, he said, about the impact of fake news and misinformation in developing countries with less robust information systems. 

As local media outlets struggle, Lazer suggested that modern universities “have increased burdens and duties to inform our democracy, to inform public discourse.” As a model, he pointed to the West Virginia University graduate students whose research exposed Volkswagen’s cheating on emissions tests. 

Meanwhile, he said, social media platforms should invest in more field research about their systems’ influence, and in understanding their impact in diverse cultures around the world. He said calls for regulatory intervention to break up companies like Facebook “may actually reduce their capacities” to stem misinformation. 

“A lot [have] realized belatedly they’re in the democracy business,” he said. “Better late than never.”
