Cornell Chronicle writer Laura Reiley speaks with Claire Wardle, associate professor in the Department of Communication in the College of Agriculture and Life Sciences, about the “armpit of the internet.”
Research Matters: Learning from ‘the armpit of the internet’
By Laura Reiley, Cornell Chronicle
This week’s episode of Research Matters features misinformation expert Claire Wardle, discussing how today’s information ecosystem has become increasingly polluted by misleading and emotionally charged content that spreads faster than facts.
Wardle, associate professor in the Department of Communication in the College of Agriculture and Life Sciences, spoke with Cornell Chronicle staff writer Laura Reiley about what spending time in what she calls “the armpit of the internet” has taught her about how people use, value and share information.
“Those spaces are communities,” Wardle said. “And people feel something when they go there. They feel heard. They’re told to do their own research. Then they turn up with their own research and people go, ‘Oh, that’s interesting.’ But there’s a sense that they’re part of something.”
The conversation also explored possible solutions—from media literacy and community-based trust networks to new models of journalism that help people navigate a chaotic digital information landscape.
Read the transcript
Laura Reiley: 0:28
Hi, I'm Laura Reiley, and this is Research Matters, a show about Cornell researchers who are tackling some of the world's toughest problems and finding solutions that make a real difference in our everyday lives. Today, we speak with Dr. Claire Wardle, a leading researcher on misinformation and the way it spreads. She's been studying how false and misleading content moves through social media and traditional media for more than a decade. And her work is helping governments, tech platforms and journalists understand what we're up against. Welcome, Claire.
Claire Wardle: 0:55
Thanks for having me.
Laura Reiley: 0:56
Well, great. Let's start with this: how big a problem is misinformation? I think all of us will be nodding along with your answer.
Claire Wardle: 1:05
So I got into this work in 2007, 2008. I was living in the UK and the BBC asked me to do some research with them, and at the time, the BBC maybe had about three hoaxes a year. I mean, they were aware of misinformation, even though we didn't really call it that. And at the time we were like, oh, it's a problem. And of course now they get hoaxed three times a second. The environment has changed to such a degree. And as I'm sure we'll talk about, it has been growing. But then you throw AI into the mix, and now it's on steroids.
Laura Reiley: 1:37
Absolutely. So obviously we have bad actors, you know, disinformation bad actors, but we also have this kind of cultural splintering. How much does that play a part in the dissemination of misinformation?
Claire Wardle: 1:53
Yeah. So you're right that the number of people who deliberately create false information to cause harm is actually relatively small. And if we as individual people on social media didn't constantly click the share button, we wouldn't have the problem. But so much of the issue that we have with our information ecosystems is that they are driven by algorithms, and algorithms promote the content that we really like because it makes us feel good about ourselves. It reaffirms our worldviews. And in those spaces we're much more likely to press share on that kind of content. So if I'm a bad actor, I'm going to spread as much of the kind of content that's going to make different people, living in their different splintered sections of the information ecosystem, angry, mad, scared, or feeling smug that they know the right thing, because then that triggers us to share.
Laura Reiley: 2:43
So obviously, what you're saying is that those feelings drive the, I don't know, continuation of this disinformation or misinformation. So what can we do to push back against the clicking, the compulsive nature of it?
Claire Wardle: 3:01
Yeah. I mean, I'll say that for too long, among people like me, researchers, journalists, fact checkers, there was a sense of, oh, people have a rational relationship to information; if only we gave more facts, we'd all be OK. Those of us who've been trained as researchers or journalists struggle to recognize that everybody, no matter your education or who you vote for, has an emotional relationship to information as a human, and that's part of the problem. So when we talk about building resilience within communities, or somebody says, "How do I talk to Uncle Bob at the Thanksgiving table?" it's about saying to everybody, "Hey, we're all susceptible to this." We all get angry or mad or smug. And when we have those feelings, that's when we need to put the phone down. It's about asking ourselves, "Who do I help by sharing this unless I'm 100% positive it's true?" And most of the time, none of us can be 100% positive.
Laura Reiley: 3:54
So, talk a little bit about media literacy education. What would that look like and how would we do it?
Claire Wardle: 4:03
So about ten years ago, when many people suddenly got concerned about this problem of misinformation, I'd be at lots of these conferences and somebody would ask, "What about media literacy?" And people would say, "Oh, that's going to take too long. Why can't we just get Facebook to change its algorithm? Why can't we just have a spam filter for misinformation?" I wish we'd realized then what it would have meant to invest heavily in educating people about the ways our brains can be manipulated, the fact that we're all susceptible to this, and to think of media literacy not as "this is how you read a headline" but as a conversation about what it means to be a human navigating our contemporary information spaces. Why are we sharing? We often share content because we're all performing an identity on social media. Do I want to share an image of a burnt lasagna I made last night? No. I want to share the pictures that make me look great. And I also want to share links to information that makes people think about me in a certain way. Once you start talking to people about those dynamics, and also about who is trying to manipulate this, how they're making money, why they're trying to divide us, people go, "Oh." That's media literacy. But it needs to be a conversation, not a PowerPoint in a high school where we think we've ticked a box: "Oh, we've done media literacy now." This has to be cradle to grave, and it needs to be a conversation amongst all of us about how we navigate our contemporary information spaces. It does not look how it did 30 years ago.
Laura Reiley: 5:26
So is there a context for it? You mentioned the high school classroom; that seems like a logical place to introduce these ideas, because I think teenagers are especially vulnerable to misinformation and maybe even to more addictive, compulsive social media use. Where else can we do this? Where else can we have these conversations?
Claire Wardle: 5:50
Well, one thing I would tell you is that I think many of us are addicted to social media. There's this idea that it's all the kids, but we actually know that older adults, particularly retired adults, spend a lot of time on their screens. It's filling something in that void. So when we ask where this should happen, I think it should be ongoing, and it shouldn't always be in a classroom. I work a lot with communities. What does it mean for a library to have a conversation about it, or the Rotary Club? What does it mean to think about different spaces? Sometimes I'm in a taxi with somebody who wants to talk about this stuff, and I'm like, yeah, let's talk about it. I don't study some obscure chemistry equation; I study misinformation, and lots of people see its relevance in their lives. Unfortunately, lots of people see their families torn apart by information that is dividing them. So I think there are non-traditional spaces for learning, and learning doesn't always have to look like a PowerPoint presentation and a didactic relationship with the people who are learning. The more we talk about it, the better off we are. Sometimes I use the analogy of pollution: if I just share something on Facebook without checking, it's a bit like throwing a Diet Coke can out of the window when I'm driving down the highway. What really is the harm? I'm being lazy; I can't be bothered. But if we all do that, we have a problem, right? So what does it mean to help one another? Think about drink driving. It used to be that somebody would leave a party drunk, and you'd think, "Oh, I hope they get home safe." And now we say, "Hey, maybe I could take your keys."
What does it mean to say to one another, "Hey, you shared something yesterday and I don't know if it's true, but I'm seeing some of this stuff and I'd love to have a conversation about it"?
Laura Reiley: 7:29
So it's kind of a zeitgeist shift a little bit.
Claire Wardle: 7:31
Yeah, like a recognition of the harms.
Laura Reiley: 7:33
So I'd love to talk about the community piece of this, which I find really interesting. And let's talk about elections. We have this idea that there's a top-down approach to misinformation. What can communities do? And by that I don't mean just your town, but smaller online communities, or, you know, your literal community.
Claire Wardle: 8:02
Yeah, and some of this is just raising awareness, right? Sometimes, coming into an election, I'll say: hey, in lots of elections around the world, we always see genuine imagery used out of context. Often we'll see a ballot box from another country, where they don't have paper and pencils and this is simply how they vote, and we get misled, not deliberately. So I say to people, if you see images of ballots and think, oh my goodness, there has been voter fraud, just stop a little bit and know that that's a known tactic that has been used elsewhere. Or I'll say, be aware that it's very easy to manipulate people just by pressing on their existing biases, their confirmation bias. So if you read something and think, yeah, I knew it was true, just stop a little bit and be aware of how you're responding to it. And that's the same whether it's at the local level or the international level. It affects everybody in the same way.
Laura Reiley: 8:55
So the economics of this is also interesting, or maybe troubling, because if clicks generate revenue, and being appalled or irate, those kinds of emotions, drive more clicks, there's a powerful financial incentive to disseminate misinformation. I was a newspaper journalist for 32 years, and I've seen so many of my colleagues lose jobs, this winnowing. And we've seen the rise of the traditional media outlet now owned by the benevolent billionaire, who may lose interest. So how can we think about this in terms of the economics?
Claire Wardle: 9:39
So I think many people don't understand how easy it is to make money off the internet. Right. So when I sit down and say, I —
Laura Reiley: 9:46
Oh, do tell. Give us our side hustle here.
Claire Wardle: 9:48
I'll say to somebody who maybe is really getting into wellness, has suddenly lost a bit of weight and is working out, and who might send me a link to a site they're looking at: I'm really glad you're feeling great about everything. I just want to make one thing clear. Look at all these supplements being sold. Can we check whether these supplements actually do what they claim?
Laura Reiley: 10:09
You're saying that the advertisers who are paying them should be vetted more by the person who's —
Claire Wardle: 10:15
Yeah, exactly. It's about driving people to a website: either you have a website where you're selling content but also selling advertising, or, for example, on TikTok, with TikTok Shop, people are selling directly, or creators themselves are making videos over one minute long and have signed up for the program, which means they get money for that. I think many people don't understand the different ways you can make money directly off the internet. Some people are doing it absolutely aboveboard, selling good, created content, but some people are exploiting the way the internet is designed. I mean, a couple of weeks ago there was a report that said Facebook made $16 billion off fraud and scams last year. It's built into the systems: content that is absolutely false, but driven by people who believe, like we all want to believe, that there's a secret supplement that's going to make us lose weight. We all want to —
Laura Reiley: 11:06
GLP-1s.
Claire Wardle: 11:09
We want to believe that there's this magic fix. And I think once you say to somebody, "I know that you really respect this person, but can we really look at all the ways they're making money off you?" you get them to think about the financial gains as well as the content. You can't disentangle the two. Are they trying to sell you ideas because they're trying to sell you something that's going to make them rich? And with some of these influencers, once you start showing their bank balance, people go, oh. I mean, all of us understand this: if you talk to your Nana about when she opens up her mail and there's a scam inside of it, right? We have to teach older adults about that. And once you start teaching people about the financial underpinnings of the internet, they start to go, oh, maybe that's why I'm being sent to that website. There's more of an awareness then.
Laura Reiley: 11:56
So are there ways to have traditional media outlets work towards greater resilience? I mean, we've seen just such a cataclysm across local media, local news, national news. What can be done in that sphere?
Claire Wardle: 12:13
Yeah. So thinking about the local level: I'm actually on the board of the Ithaca Voice, and I believe very much in local journalism. But what I often say is, if you look at local community boards, maybe Nextdoor or a local Facebook group, people feel heard. They're participating in conversations with other people. And news is the fact that a traffic light is out, the fact that something happened at the PTA, and also what's happening to the local pub that seems to have closed down. News is all sorts of things, but it's relevant locally. And I think newspapers still have this sense that it's 1996: there are seven stories a day that we decide on and put out, and you might comment at the bottom of the article, but we're probably not going to read it, and we don't necessarily bring you into the process of making news. I say this at the local level, but also the national level. Institutions, whether they're universities or newsrooms or government agencies, are really bad at listening to communities and creating participatory spaces. I spend time in what I call the armpit of the internet: conspiracy spaces, whether that's anti-vax groups...
Laura Reiley: 13:18
My condolences. That's a bad place to be.
Claire Wardle: 13:20
Those spaces are communities. People turn up and go, "hi" —
Laura Reiley: 13:24
So Reddit and the subreddits.
Claire Wardle: 13:26
And people feel something when they go there. They feel heard. They're told to do their own research. They turn up with their own research and people go, oh, that's interesting, right? There's a sense that they're part of something. And so as we think about what the next 30 years of our information environment look like, there has to be a recognition that people want to be part of something. Think about these top-down, broadcast-type moments; I mean, for this podcast, Cornell will probably tweet out a link and say, listen to this. We say, read this article, watch this video. But it's still essentially broadcasting. What does it mean to create meaningful spaces where people feel part of a conversation? Until that happens, there will be this intrinsic draw towards spaces that are participatory.
Laura Reiley: 14:09
Well, I think there's always been this tension between citizen journalism and professional journalism, because there's this belief in expertise, right? So you're saying there should be more interactivity, more citizen participation in the process, and less "Here's what you need to know."
Claire Wardle: 14:30
So, this question of expertise: we could do a whole podcast just on expertise. Traditionally we had a process where expertise came from how many letters you had after your name, how many years you spent at school. Expertise was that. In many of the spaces where I spend time, expertise is actually about lived experience. So if you are talking about, for example, RSV, and you're a mother on TikTok holding your crying baby, talking about what you did last night to make your baby feel better, as humans we're very drawn to that first-person story. And so we have to understand that we can no longer say, hey, trust me, look at my LinkedIn, look at my credentials, you should trust me because of that. We now live in a world where people have had poor experiences with the health care sector, poor experiences with government agencies, and so they are less likely to trust an institution just because they're told to. They want to have an experience with an institution that makes them say, oh yeah, they listened to me, they understood what I'm going through. And that's why people are turning away from traditional institutions and turning towards each other. The problem is, in some of those community spaces, we do not have anybody who is an MD. So we have people sharing first-person anecdotes about what they did with their crying child who had RSV.
Laura Reiley: 15:50
But there may not be data to support their strategy. It may be something that worked for them but is not —
Claire Wardle: 15:56
Like ivermectin during COVID. We have a number of examples where people have these kinds of folk theories about what works that are not scientifically proven.
Laura Reiley: 16:02
Vaccination or yeah, there are so many things.
Claire Wardle: 16:06
Exactly. But the problem is, as scientists, and I say that as one of those people, too often we communicate in a way that says, follow the science. And if you say, follow the science, take the COVID vaccine and you won't get COVID, because that was the messaging we had early on, then people got vaccinated, did get COVID, and said, wait a second, right?
Laura Reiley: 16:25
Hey, Fauci —
Claire Wardle: 16:25
Yes, yes, lock him up. And so there was damage done by really well-meaning people trying to do the right thing and protect the most people possible, when the truth is, we didn't know. This was a brand new virus and we didn't know. I wish we had had more space for ambiguity in those early days.
Laura Reiley: 16:43
We don't like ambiguity. We never have. All right, well, what's next for you? What are the big questions that you're thinking about or working on right now?
Claire Wardle: 16:52
So, after way too many years spending time in conspiracy communities, which is not great for my mental health, I've realized we can keep playing whack-a-mole with the bad stuff, but it's never going to go away. I've now got to the point of thinking that 20% of the internet is probably always going to be a little bit rubbish. But what does it mean for the other 80%? What does it mean to really think about the information ecosystem as a whole and ask, why are people drawn to that 20%? How can we improve the 80%? So I do a lot of work now with organizations on how we build these participatory spaces, how we do deep listening to hear what people's questions and concerns are. Because actually, when you do listen to people, they often aren't repeating back rumors. They're saying, "Hey, should I be getting my COVID and flu vaccines at the same time? I don't know, I'm getting conflicting advice." People have really valid questions, and when they go searching for the answers, they often can't find them. And then they're seduced by the people who say, "I've got a certain answer for you," who are much better at conveying certainty, whereas we're like, well, the data is a little bit mixed. It's not always great to hear scientists talk.
Laura Reiley: 18:01
Sure.
Claire Wardle: 18:02
We say we can't prove anything. So what I'm trying to do now is really think about those of us who control parts of the information ecosystem: where we do have control, how can we use more authentic, compelling communication techniques that really listen to people, bring them in and make them feel part of something, rather than dismissing the bad stuff and saying, oh, we just need to get rid of it? The bad stuff is speaking to people in a really interesting way.
Laura Reiley: 18:28
So is the Wikipedia model, the show-your-work model, or the group of experts whose work you can actually track back, is that a good model? Is there a way to watermark demonstrable truth, or things that have been vetted appropriately? Or is it just naive of me to think that's a possibility?
Claire Wardle: 18:48
I mean, Wikipedia is an international treasure. It's extraordinary. And what it does is it often shows that there are debates, right? It doesn't have one snippet at the top that says, "Here's the truth." It has these long articles, and it has lots and lots of footnotes. And we see this now with content moderation. On X and TikTok, and now Facebook, we have community notes, which are flawed, and there's interesting research happening around what makes them work or not. But I often hear from audiences who say, oh, I like community notes because they show me that different people have different perspectives; I can look at that and make a decision. And when I spend time talking to my students, they'll often admit, oh, I don't read a news article or watch a TikTok video until I read the comments. They want to know what other people think first. So this sense of, there's one truth and just trust me because I'm a doctor or a scientist, that doesn't work, right? We have to show our work. But again, in a way that recognizes that unless the question is "is the sky blue," a lot of the things that concern people are unsettled science. There are different viewpoints, and if we don't acknowledge different viewpoints, that's when people say, you're lying to me. So: more ambiguity, more diversity of viewpoints, and more bringing people in to add to the puzzle. Part of the joy of Wikipedia is all those footnotes, because they're all adding to what the truth is, or how people see it. I'd love us to build more of that kind of infrastructure.
Laura Reiley: 20:14
Great. So for listeners who, as consumers of information, want to do better with this, do you have any book recommendations, anything they should read on this topic to be better informed?
Claire Wardle: 20:28
Yeah. A book that I really enjoyed was Naomi Klein's Doppelganger. She is coming at this from a left-wing perspective, but she does a great job of explaining why, when people said, "Oh my goodness, I can't get vaccinated because the mRNA vaccine has got a microchip in it," it was easy to laugh. I laughed, too; it makes no sense. But actually, we know that technology companies are surveilling us. We know that tech companies are embedding all sorts of cookies, all sorts of ways to know where we're going. So it's not completely out of the realms of possibility that people should be concerned about surveillance. She does a really good job, I think, of explaining why we are where we are today. And she talks about these two very different realities. The information ecosystem is splintered in many, many ways, but she talks about these parallel universes where people are existing, watching very different content, consuming very different types of content and holding very, very different worldviews. And so what does it mean when we don't have those shared spaces of reality?
Laura Reiley: 21:27
Wonderful. It sounds like a great book. Well, I regret to say I think we are at the end of our time together. This has gone really quickly. Thank you, Claire, for coming. This is Claire Wardle, associate professor in the Department of Communication at Cornell and one of the world's leading experts on misinformation. And I am Laura Reiley. Thanks for listening to Research Matters, where we talk with Cornell researchers working to solve real-world problems. If you liked this episode, subscribe wherever you get your podcasts and share it with a friend who loves facts as much as you do. Thank you.