Combating misinformation online is an ongoing challenge for big tech, and it’s especially difficult on a discussion board with millions of people during a pandemic.
One such place is the r/Coronavirus community on the website Reddit. In January 2020, it had around 1,000 members. That number spiked to 1.5 million by March of 2020, partly because Reddit highlighted it on its homepage over other related subreddits. Today, the page has 2.4 million members, with around 10,000 new comments a day.
The forum has become a one-stop shop for up-to-date coronavirus information, offering up pandemic news, locations of vaccination sites and how to sign up for clinical trials. The community has also hosted Q&A discussions with the likes of Bill Gates and Tom Frieden, former director of the U.S. Centers for Disease Control and Prevention, as well as top researchers. Even Reddit CEO Steve Huffman reached out to the volunteers who moderate the forum to tell them that he starts his day by reading it and to thank them for their work.
But the work these moderators do isn’t easy, as the forum is also a breeding ground for misinformation. They work tirelessly to make sure the information on the subreddit is reliable, taking time away from their jobs as doctors, researchers and students.
Science News spoke to three of these moderators about what it’s like to combat misinformation online during a pandemic. Head moderator Patrick Doherty is a biotech research scientist. Jennifer Cole is a biological anthropologist at Royal Holloway University of London who studies online communities related to health; she became an infodemic manager after receiving training from a World Health Organization initiative to fight misinformation. And Rohan — who asked that his full name not be used because of the daily harassment he receives on Reddit — is an M.D./Ph.D. student in molecular biology.
Answers have been edited for clarity and length.
SN: How did you become a moderator?
Doherty: I actually was recruited by one of the other moderators. At the start of the pandemic, there were a lot of really bad preprint papers coming out. One that came out was about how … the coronavirus could have potentially been manufactured in a lab using an HIV strain. A [Reddit] user had posted it. So I had written a detailed comment in response, explaining why the paper was bad and why the results didn’t mean anything. The paper eventually ended up getting retracted (SN: 3/26/20). The moderators saw my comment and liked how I expressed the science, so they invited me to be a moderator.
Rohan: I started in September 2020, the day before [then-President Donald] Trump tested positive (SN: 10/5/20). Over the course of the previous six months of the pandemic, I had seen a lot of misinformation on the subreddit. I wanted to contribute to removing some of that stuff, and I also thought there was a lot of opportunity for the subreddit to run special projects, like motivate people to get vaccines or help them find vaccination locations. And I thought given my background, I would be able to give some help with that.
SN: Has there been anything that’s surprised you about moderating r/Coronavirus?
Cole: Honestly, largely no. Because I’ve done this before with Ebola. There’s been nothing different in this pandemic from what there was with Ebola, there’s just been more of it. The scale has been different, but the kind of conspiracy theories you see and the kind of things people say are no different.
SN: What is it like moderating every day? How often do you take breaks?
Doherty: It can be kind of soul crushing sometimes, especially when there wasn’t a lot of good news. Now there’s good news about vaccines (SN: 3/30/21; SN: 3/8/21). But before, every morning I would open up the sub and read the front page, and it was all just bad news. It can be a lot.
Rohan: There’s an ebb and flow to how much time it takes to moderate. For example, if there’s big news about a vaccine being approved, then we’ll all just be spending a significant portion of the day answering user questions and combating misinformation. But just general day-to-day management, it’s a pretty large team and we try to coordinate with each other. It does take a lot of collective time, and we try to make sure that if someone’s having a busy day or week, then we try to help them out.
SN: How do you distinguish between misinformation that should be taken down versus a genuine question?
Cole: At first, instead of just removing somebody, we engage with them. If their information is wrong, we explain why it’s wrong. And certainly the first time that users post something that is wrong, we will try and correct them and push them in the direction of the better information. If they keep coming back obviously trying to push a narrative, that’s when we will ban them. You do need to make a distinction between people who might have heard it somewhere and don’t understand it very well and need you to explain it to them a bit better, versus people who are trying to push a narrative. Sometimes we’ll check on users’ posting history and what else they’re posting elsewhere.
SN: What’s the biggest lesson you’ve learned?
Doherty: Misinformation is really hard to combat, because someone can post two sentences of made-up stuff, which takes them only five seconds. But if I want to refute that, I have to find one source, then two sources, then three sources, and break down scientifically why that’s not true. I can’t just say “no, it doesn’t,” because then you’re just leaving it to the reader to decide who they trust more. You have to go and find sources and show why you’re right, and that takes time. It’s really easy to share a meme and get 25,000 likes and people are convinced that it’s true, and it only took that person 10 seconds to make it.
SN: I’m sure banning people leads to harassment. Have you been harassed?
Doherty: I’ve never been doxed [that’s when someone publishes private personal information online]. I keep my name separate from my username. I never say who I am on the subreddit. But after deleting someone’s comment, I’ve had someone say “slit your throat” or just really awful death threat sort of stuff. You can report that to Reddit, and they’ll ban the user from the site for things like that, but we get a lot of stuff like that. You get used to it, but you don’t really get used to it.
Rohan: Most of the nasty direct messages are just vitriol or people being nasty. That’s essentially a daily occurrence, often multiple times a day. Beyond that, there are more minor threats such as “Oh, I’ll report you” or “Oh, soon you will be revealed and exposed as a shill.” Those probably come a few times a week, more frequently if it’s a busy period or a particularly sensitive topic. The serious threats, like the threats of actual harm to me, are fortunately significantly rarer. Usually, it’s someone saying they’ll dox me or that they’ll “find me” and that I should kill myself. Those are unpleasant, but significantly rarer, probably on the order of a month or more in between.
Cole: I’ve had attacks that I’d describe as pathetic. They’re not scary or frightening. But part of the ethics agreement with my university is that if I do research on these online communities, I do it under my own name so that it’s transparent. My university is aware that I do this. My campus security also knows. One thing that people online do is say things like “we know where you work.” But do they ever go as far as contacting the university? No, they don’t.
SN: How has r/Coronavirus changed over the past year?
Rohan: It’s shifted from being just a place to get news about the pandemic and its response to more of a place to get information that’s actually actionable for the users. So for example, one of our moderators put together a wonderful list of vaccine location resources from around the U.S., Canada and even around the world. And I run a piece that answers user questions on the vaccines, so I have a little write-up about what we know about vaccines. And in the comments, users can come ask questions, and I try my best within 24 hours to answer any of those questions or tell them to go talk to their doctor.
SN: Dealing with sad news and angry people every day sounds hard on your mental health. Why do you keep at it?
Rohan: Being able to just sit down and methodically answer vaccine questions and address concerns is probably one of my favorite parts of doing this. There was one person who was talking about how their family has some history of medical conditions, and that they are scared and didn’t think they’d get the vaccine. They wanted someone to explain a couple of questions to them. I remember I went back and forth with this user probably five times over the course of several hours that day. At the end of it, they told me they were going to go get the vaccine as soon as they were eligible.
Doherty: I’ve really grown to like the community that I’ve helped build. We’ve learned a lot about what’s misinformation and what’s not misinformation. It’s sort of a learned skill. Not that we’re 100 percent perfect, but I just feel like we have a unique skillset at this point, and it’d feel wrong to stop. I’d feel guilty. Also, the team. The moderators have become good friends. We do Zoom hangouts and happy hours, and we joke about hanging out when this is all over. We’ve become a real group of friends.
This story was originally published by Science News and republished here with permission.