YouTube has a problem: its recommendation algorithm assembles video playlists for pedophiles. Now a Republican senator has proposed an aggressive new law to solve the problem, but the measure could inadvertently imperil family vloggers’ livelihoods.
Sen. Josh Hawley (R-Mo.) announced his bill this month, days after a New York Times report revealed how YouTube’s automatic algorithms grab innocuous home movies of children, sometimes partially clothed, and put them in front of users already searching for sexualized content.
“YouTube says that it has the technology to take child videos out of its auto-recommendation process. That’s what they should do,” Hawley, who did not respond to requests for comment, said in a recent TV interview. “The only reason they’re not doing it is they want to make money on it.”
His bill, which would impose criminal penalties and fines on YouTube and other video platforms for violations, aims to force platforms to “prioritize the safety of children over money.” But members of YouTube’s massive community of family vloggers fear that such a measure would drastically reduce engagement on their channels — and for some, dramatically cut their advertising revenue.
YouTube’s predator problem has already hurt the finances of the family vlogging industry and threatens to do more damage, said Josh Cohen, co-founder of Tubefilter, a news site that analyzes and reports on YouTube and other social media platforms.
“Before these issues arose with potential predators seeking out underage content on YouTube, family channels were considered one of the safest bets for advertisers, and potentially the most lucrative,” Cohen said. “Family-friendly programming on YouTube gets content creators dramatically higher advertising rates.”
Hours after the Times’ report was published, YouTube announced that it was reducing its recommendations for borderline videos “featuring minors in risky situations.” But that vague step doesn’t go far enough, Sens. Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.) argued in a recent letter to YouTube CEO Susan Wojcicki.
“With YouTube asleep at the wheel, predators have taken the reins,” the senators said. They demanded action “starting with the design of [YouTube’s] algorithms,” and asked if the company would disable recommendations for videos “involving minors” until it finds a lasting solution.
YouTube’s recommendation algorithm drives an enormous amount of traffic on the site and is responsible for more than 70% of viewers’ watch time. Restricting videos of children from being algorithmically recommended would have an extreme impact on family vlogging channels’ engagement levels and earnings, Cohen said.
Google, YouTube’s parent company, did not respond to a request for comment.
In recent years, family vlogging has exploded into a burgeoning industry in which thousands of parents and kids regularly film home videos — daily activities, silly pranks and so forth — then upload them to YouTube for millions to watch. For many, it’s both a fun hobby and a serious business: through YouTube’s Partner Program, qualifying video creators can take a cut of the earnings from ads that appear on their channels.
Some parents, like Britt Null, a single mother of three young children in Kansas City, Missouri, run vlogs to modestly supplement their incomes from other sources. Others have left established careers to pursue vlogging full-time.
The result can be big money: YouTube’s highest earner is Ryan Kaji, 7, who makes videos reviewing toys with his siblings. Kaji, whose parents were unavailable for an interview, has gained nearly 20 million subscribers since launching his channel in 2015. He made an estimated $22 million last year.
Null and other family vloggers are still reeling from YouTube’s decision to disable comments on most videos featuring kids. In February, a video exposing how pedophiles had been flooding certain YouTube videos of kids with fetishizing comments went viral, and major brands swiftly announced a boycott of the platform. Shortly afterward, and without notice to users, the site started demonetizing some videos featuring children and turned off the comments sections on tens of millions of videos “that may include minors and therefore are at risk of predatory comments.”
“It felt like we were being punished for something that wasn’t our fault,” said Null, who posts weekly vlogs with her children. Several of her videos, such as one showing her daughter doing a hair tutorial and another in which the kids do a Q&A, were among those affected, even though Null has filters to prevent certain inappropriate words from appearing in the comments sections.
At the time, Wojcicki described the changes as a “tradeoff.”
“There are going to be many young people out there who are going to be upset because they’re going to feel like they’re posting videos and they no longer have the ability to use comments in the way that other creators can,” she acknowledged. “In the end that was a tradeoff that we made because we felt like we wanted to make sure that protecting children was our number-one priority.”
The online exploitation of children is a complicated issue without an easy solution, and YouTube has taken heat from all sides. The platform recently faced backlash for taking action against some ASMR videos starring minors. Filmed to trigger a tingling sensation in viewers called “autonomous sensory meridian response,” the videos feature kids making various stimulating sounds that include licking their lips or sucking on different items.
Critics say such videos should not be algorithmically promoted or monetized because they sexualize and exploit children, while young, so-called ASMRtists and their parents maintain that’s not the case and have lambasted YouTube for removing their content.
Meanwhile, many family vloggers who share clearly innocuous, family-oriented videos have vented their frustrations over the demonetizing steps already taken by YouTube and the disabling of comments. And they fear that policy changes like the one proposed in Hawley’s bill would deal a major blow to their revenue stream.
For Null, losing the comments section on many of her videos meant losing her main form of communication with nearly 400,000 subscribers.
“It’s not a black-and-white issue, but it had a big effect on content creators, and this is content that we work very hard on,” she said. “When that happened, content creators started to lose trust [in YouTube] and I saw a lot of talk about diversifying: making sure your content is in other places, making sure you have a website.”
Although Null said she is glad YouTube appears to be taking the issue of online child safety seriously, she and other vloggers suspect the platform’s actions are aimed less at solving the predator problem and more at placating advertisers — at the unfortunate expense of creators.
“I think it’s good they’re trying to do something,” she said. “I also think they’re just trying to do something to stop [advertisers] from walking out.”