YouTube is now blocking users from commenting on almost all videos that feature children after reports of an organized pedophile ring using the platform’s comments section to find and exploit minors.
The video platform wrote a blog post on Thursday explaining its decision to disable comments on videos that feature children younger than 13 years old. YouTube said it will also ban comments on videos with teens ages 13-18 that the company believes may attract predatory behavior.
“We recognize that comments are a core part of the YouTube experience and who you connect with and grow your audience,” the company wrote. “At the same time, the important steps we’re sharing today are critical for keeping young people safe.”
Not all videos with minors will have disabled comments. YouTube, which is part of Google, said it will grant exceptions to the ban for a “small number of channels that actively moderate their comments and take additional steps to protect children.”
The decision came after video blogger Matt Watson released a video earlier this month exposing how a “soft-core pedophilia ring” was using YouTube to share and comment on videos of very young children, posting timestamps in the comments section that pointed to moments when children were in states of undress or in compromising positions. YouTube’s algorithm would then recommend more footage featuring children to users who viewed those videos. Watson said he was able to find links to actual child pornography with just a few clicks on the site.
Since Watson’s video, reports have surfaced about a clip encouraging suicide and self-harm being spliced into children’s videos on YouTube and on YouTube Kids, an app created specifically for children. The spliced videos were made in the style of “Splatoon,” a popular Nintendo video game whose clips get millions of views on YouTube. The video platform told BuzzFeed that it terminated the channels of the users responsible for the spliced videos.
Last week, YouTube removed more than 400 channels and tens of millions of comments on videos featuring children. Thursday’s announcement is a significant step forward from the company’s past handling of inappropriate behavior on the site.
The company said it expects the change to take several months to fully go into effect. YouTube is also planning to launch a new automated tool to detect and remove twice as many predatory comments.
Pedophilic content is not the only harmful material YouTube’s recommendation algorithm has promoted. It has also recommended videos of conspiracy theories, dangerous pranks and the self-harm footage mentioned above.
The algorithm “is extremely biased toward conspiracy theories. It promotes a huge amount of false information ― literally crazier the better for the algorithm to recommend it,” former Google employee Guillaume Chaslot, who helped create the algorithm, previously told HuffPost.
YouTube says it is working to improve the algorithm, but that hasn’t stopped several major advertisers from pulling their spending from the site. After February’s series of reports about exploitative videos on YouTube, brands including Hasbro, Disney, AT&T and Nestlé suspended their advertisements.