At the end of August, Reddit users told the company's leadership they had blood on their hands. As part of an organized protest, the moderators of dozens of large subreddits, or forums on the site, shared a letter condemning Reddit for failing to act on the "rampant" spread of COVID-19 misinformation and allowing conspiracy-minded anti-vaccine subreddits to proliferate. The letter emphasized that vaccines are safe, masks are effective, and social-distancing measures are useful. "Subreddits which exist solely to spread medical disinformation and undermine efforts to combat the global pandemic should be banned," it said. Reddit's CEO, Steve Huffman, responded with his own open letter noting that "dissent is a part of Reddit and the foundation of democracy," and that those who disagree with the CDC are not violating the site's policies.

Shortly after his post, many of the moderators who had shared the letter shut down their subreddits in outrage. (The blackout protest included the 3.3-million-member forum for Pokémon Go, which became its own news item.) In the end, Huffman did take action: On September 1, Reddit removed its most notorious subreddit for anti-vaccine conspiracy theories, called r/NoNewNormal, and "quarantined" 54 others, including the subreddit dedicated to the antiparasitic drug ivermectin. (A quarantined subreddit is covered with a warning screen and removed from search results.) "r/NoNewNormal has been banned!" read a post in the subreddit dedicated to on-site drama. "Discuss this dramatic happening here!" (Others celebrated with memes, obviously.)

But Reddit's leadership hadn't quite acceded to the protesters' demands. The moderators' letter had specifically asked for special attention, enforcement, or rules around COVID-19 misinformation. The ban of r/NoNewNormal was not on account of its users' habit of sharing health misinformation and disinformation, but rather for "brigading," that is, their history of attacking other Reddit communities with spam or trolling. The 54 quarantined subreddits were also not cited for any specific mangling of true facts but rather for violating the first rule of Reddit's content policy, which protects users from bullying, hate speech, and threats of violence. That rule, a spokesperson explained to me in an email, can be interpreted to prohibit the spread of "falsifiable health information that encourages or poses a significant risk of physical harm to the reader." In other words, when confronted with a problem that its policies did not cover, health misinformation, Reddit jerry-rigged a fix: It used tools developed for a prior crisis in content moderation, over hate speech and harassment, and adapted them to meet the present one.

Reddit is notorious for having spent a very, very long time deciding that it should deal meaningfully with harassment, hate speech, and other forms of abuse. The platform was a free-for-all for many years, rampant with misogyny and racism, and a good chunk of the user base had to be led gradually toward policies that could protect the speech of some by limiting the speech of others. (Also, a bunch of really hateful users had to be removed.) To many of its current users, the new moderation debate, the one over health misinformation, should be less complicated than what came before.

"If people are discussing whether Bigfoot is real, okay. If people are arguing over political candidates, okay," said a moderator of the popular advice forum r/AmITheAsshole named Frank, who, like the other moderators quoted in this story, asked to go by his first name out of concern about harassment. "I think that there's a substantial difference when it comes to questioning basic science in the midst of a global pandemic. While [they] aren't legally liable, certainly they're ethically liable for permitting ideas that are dangerous to the health and lives of their user base."

Some scientific facts are black-and-white, of course, but moderators may be underestimating the challenge of determining which facts are which in the context of a broad policy on health misinformation. Platforms, their users, and social-media experts are still in the early stages of debating what effective content moderation around health misinformation should look like, and many of them know that overbroad policies would create serious problems and dramatically limit reasonable debate. Until that tension is resolved, the platforms are relying, to a significant extent, on anti-vaccine activists' and COVID-19 conspiracy theorists' inclinations to break unrelated, existing rules. For the moment, it's the best fix available. We should expect this drama to play out again and again in the months to come.