Reddit banned two contentious but popular groups that regularly featured human injury and death following the widespread sharing of videos of the New Zealand terrorist incident on Friday.
The social network moved to eliminate the r/watchpeopledie and r/gore subreddits less than 24 hours after a shooter killed at least 49 people at two mosques in Christchurch and streamed the event live on Facebook. Internet users were able to capture the video before Facebook took it down, uploading clips of the incident on platforms including Twitter and YouTube. People then shared links to those videos in the now-banned Reddit groups, which critics and some Reddit users had questioned for years.
“We are very clear in our site terms of service that posting content that incites or glorifies violence will get users and communities banned from Reddit,” a company spokesperson said in a statement. “Subreddits that fail to adhere to those site-wide rules will be banned.”
Reddit’s decision to remove r/gore and the seven-year-old r/watchpeopledie subreddits illustrates the tensions social networks face as they police content that violates their rules in real time. Since the incident, Facebook, YouTube, Reddit, and Twitter have all struggled to keep video of the massacre from spreading, playing a game of digital whack-a-mole as users uploaded content that continued to evade the detection of algorithms and content moderators.
On Thursday night, a Reddit spokesperson told BuzzFeed News that r/watchpeopledie, where links led to videos of people being executed or hit by cars, was allowed on the site because it provided a service to members — some of whom the company said were medical professionals or first responders — to learn about or cope with death. By Friday morning, however, Reddit moved to end r/watchpeopledie, which had more than 300,000 subscribers, and r/gore, as a result of members continually linking to videos of the New Zealand incident while moderators failed to act or even encouraged their posting.
“The video stays up until someone censors it,” one moderator on r/watchpeopledie wrote Thursday night. “This video is being scrubbed from major social media platforms but hopefully Reddit believes in letting you decide for yourself whether or not you want to see unfiltered reality.”
The web has long offered places to share graphic imagery, from Rotten.com, a now-defunct website where users could post shocking photos of dead bodies, to mainstream platforms like Twitter, which allowed the proliferation of terrorist beheading videos on the grounds that they were newsworthy. Among popular internet sites, Reddit has gone further than most, allowing communities that share and comment on subjects ranging from pornography to incels to death. At times, Reddit has removed controversial groups, including one encouraging fat shaming and another that promoted the Pizzagate child trafficking conspiracy, for violating various aspects of its terms of service.
These decisions have always led to debate among users and sparked arguments about free speech on the platform. Reddit, as a private company, has no legal obligation to protect users’ rights to free speech.
A Reddit spokesperson declined to say whether or not more channels would be banned in light of the sharing of details or information glorifying the New Zealand massacre, but noted the company would be proactive in attempting to track users and groups that broke its terms of service. Critics, however, said the company was only acting in light of the negative publicity surrounding the deadly shooting.
“The only thing that changed between yesterday and today was Reddit getting negative publicity about those subreddits existing,” a Twitter user named Colin Sullender wrote. “The platform has turned a blind eye towards the content there for years, much of it far worse than the NZ shooting video.”
A Reddit user who asked to be identified as “Alex” said she had followed r/watchpeopledie for a few weeks, watching videos that included people being shot and set on fire in executions. The videos, she said, helped her understand the “actual cost of life.”
“But when I watched clips of the New Zealand video — it became even more devastating,” she said to BuzzFeed News. “It made right wing extremism not just something [on] paper.”
“I think people would take the shooting a lot more seriously if they saw the video,” she added.
Elizabeth Weeks, a Seattle-based attorney and moderator for the subreddit r/changemyview, a group that attempts to present different views to subscribers who have a set opinion about certain topics, said that while she didn’t have to deal with filtering out illicit content associated with the Christchurch shooting, she saw both sides of the issue.
“I really worry a lot about the velocity of how these things are posted and how they can have a normalizing effect,” Weeks said. “But I also do understand the more pro-libertarian view about there being no harm if you’re not acting [upon what you’re watching].”
Weeks, who sometimes deals with users who say they are suicidal, said that her interactions with Reddit had improved over the last few years, and that she had sympathy for the company because of the number of communities and people it is dealing with. But tech companies have often waited to improve communication and moderation until they’re facing scrutiny and media coverage for their failings, and it’s unclear if there’s a scalable solution to moderating their billions of users.
“I’m actually of the mind that the solution to this problem is to slow down conversation and perhaps break off networks into smaller pieces,” Weeks said.