YouTube Is A Pedophile’s Paradise

Algorithms are driving users to extreme content as predators compile playlists of children in compromising situations, a HuffPost investigation found.
Illustration: Rebecca Zisser/HuffPost; Photo: Getty

Eleven-year-old Allie sways back and forth in purple pajamas, mumbling softly.

“I feel so dizzy,” says the girl, who runs a tiny YouTube channel from her bedroom in Florida. She rolls her eyes into her head and collapses onto the bed behind her, next to a pile of teddy bears. After lying there motionless for a moment, she pops back up.

“Um, I wasn’t really sure what else to add, ’cause all that was requested was to faint while putting my eyes backwards,” she says to the camera, thanking a user who goes by “Martin” for the suggestion.

Allie’s channel is full of skits that she has eagerly filmed at the request of strangers on YouTube. She’s learned that her audience particularly enjoys watching her pretend to pass out and hypnotize herself; those kinds of requests come in all the time. In January, a user known online as “Damien” even scripted a scene for her, in which she dons a leotard with little pink flowers and gets abducted, tied up and knocked unconscious.

By contrast, Martin’s channel is empty — save for a few posts in which he talks about masturbating and his desire to ejaculate on a woman’s face. Damien’s channel features a series of movie clips showing young women being beaten and attacked with chloroform.

For Allie, whose real name is being withheld, the attention is exciting. (HuffPost was able to reach her parents and inform them of the situation; they did not agree to an interview.) To the girl’s great delight, her dizzy-themed videos randomly blow up sometimes, pulling in thousands of views despite her small following. She refers to her viewers as “fans” and promises to film whatever they’d like to see. That often means unwittingly acting out sexual fetishes for predators, who flock to her content like flies.

This didn’t happen by accident. YouTube’s automated recommendation engine propels sexually implicit videos of children like Allie from obscurity into virality and onto the screens of pedophiles. Executives at the Google-owned company are well aware of this: For years, media outlets and vloggers have been sounding the alarm over YouTube’s aggressive promotion of videos featuring vulnerable and partially clothed kids. The New York Times reported last spring on a mother who was horrified to find that a video of her young daughter playing in a pool had been watched hundreds of thousands of times.

“YouTube is a company made up of parents and families,” YouTube assured users and stakeholders in the wake of that story, which laid out in detail how the platform would serve pedophiles “a catalog of videos” sexualizing children. “We’ll always do everything we can to prevent any use of our platform that attempts to exploit or endanger minors.”

But despite its statements to the contrary, YouTube continues to actively put children in danger, a HuffPost investigation has found.

The tech behemoth’s safety measures — such as its pledge to disable comments on videos prominently featuring minors — are inconsistently enforced and insufficient. Undeterred, pedophiles are openly grooming kids and compiling their videos into sexualized playlists, making girls like Allie easy prey.

What’s worse, YouTube’s algorithm is still plucking videos of children in potentially sexually suggestive scenarios — bathing, doing gymnastics, spreading their legs — and lining them up en masse for users with predatory viewing habits. It’s even pushing viewers of erotica on the site toward a repository of videos starring scantily clad minors.

YouTube could easily rein in its amplification of such problematic content — as it has promised to do time and again in the face of bad press, sweeping advertiser boycotts and threats of legislation. Instead, it has rolled out half-measures to quell the public furor while continuing to facilitate, incentivize and profit off the sexual exploitation of children.

YouTube’s automated recommendation engine propels sexually implicit videos of children from obscurity into virality and onto the screens of pedophiles.
Getty/YouTube

Catering To Extreme Tastes

Footage of kids in swimsuits, and similar videos that are innocuous but could be tempting to pedophiles, account for a small fraction of content on YouTube, and often are not in violation of its Community Guidelines. There’s nothing wrong with parents uploading clips from their family trip to the beach. The issue is not the volume, or even the content per se, but how the platform herds users directly to it.

YouTube’s recommendation algorithm, which automatically generates video suggestions and brings users from one clip to the next, drives more than 70% of traffic on the site. It was designed with the goal of keeping people watching for as long as possible by feeding them increasingly “extreme” content to binge, according to one of its architects.

“Increasing users’ watch time is good for YouTube’s business model,” said former Google engineer Guillaume Chaslot, who helped create the recommendation algorithm but is now among its most outspoken critics. The longer people stay on YouTube to watch videos, he explained, the more ads they’ll see, which ultimately results in greater advertising revenue. (YouTube disputes that it steers users toward “extreme” videos; it contends that in general, its recommendations more often lead to mainstream content.)
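To make the incentive concrete, consider a minimal sketch of a watch-time-ranked recommendation step. This is an illustration of the objective Chaslot describes, not YouTube’s actual system; every name and field below is hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    predicted_click_prob: float     # model's estimate that this user clicks
    predicted_watch_seconds: float  # model's estimate of how long they watch

def rank_recommendations(candidates: list[Video], k: int = 20) -> list[Video]:
    """Order candidate videos by expected watch time: the probability of a
    click times the predicted watch duration. A recommender trained on this
    objective keeps surfacing whatever a given user lingers on, regardless
    of what that content is."""
    return sorted(
        candidates,
        key=lambda v: v.predicted_click_prob * v.predicted_watch_seconds,
        reverse=True,
    )[:k]
```

Nothing in such an objective distinguishes a binge on cooking tutorials from a predator’s viewing pattern; maximizing expected watch time rewards both equally.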

Algorithmic promotion would explain why a random channel’s video filming up a toddler’s skirt has 473,000 views. And why a video of two men drowning in a river has 539,000, despite one of the victims’ mothers publicly pleading for it to be removed. It would also explain why a video of a woman engaging in an uncensored sexual act with a pet has more than 113 million views.

Each unique view represents around 10 recommendations from the algorithm, Chaslot said, indicating that YouTube has promoted the video of explicit bestiality more than a billion times: 113 million views at roughly 10 recommendations apiece. The company’s executives don’t specifically aim to push people toward such lurid material, he noted, and it’s certainly not their objective to guide users from sexy, adult-themed videos to kids. Yet they continue to employ a profit-maximizing algorithm that they’re unable — or unwilling — to control.

Chaslot suspects the latter. YouTube vowed last year to reduce the distribution of videos “featuring minors in risky situations” — which would take little effort, he said — but it has continued to do the opposite, often blasting such content to hundreds of thousands of viewers or more. (YouTube declined to explain how it defines “risky” in this context.)

“Until they change the algorithm, the problem won’t go away,” said Chaslot, who stressed that YouTube could also train its artificial intelligence systems to detect predatory behavior by users.

As things stand, he added, it is effectively the case that if the algorithm determines “pedophiles spend more time online than other people, then it’s going to find ways to convert more people into pedophiles.”

Searching YouTube for clips featuring adult nudity quickly led down a rabbit hole of algorithmically recommended videos showing partially clothed children.
YouTube

Breeding Predators

In spite of YouTube’s ban on “gratuitous” nudity, finding erotic content on the site is as easy as typing “sexy” into the search bar. Users who go looking for it, however, may quickly and unintentionally wind up watching videos of children that could be sexually arousing to pedophiles. Clicking on just one or two clips showing partially clothed children can plunge users into an echo chamber comprising little else, HuffPost found.

For the purpose of this investigation, HuffPost reporters created a new Google account, logged into YouTube and played a video with more than 3.7 million views featuring a fully nude man and woman. From there, YouTube’s autoplay tool assembled a lineup of videos with naked adults. Several seemed to be in clear violation of YouTube’s nudity policy: One with 1.9 million views showed a woman rolling around in a thong with her legs spread, briefly exposing her genitals.

It didn’t take long for the algorithm to insert a video with children into the queue. The second suggestion from the top, alongside a YouTube-recommended clip featuring dozens of naked, partying adults, was a family camping video with a thumbnail showing two shirtless kids in a pool.

On its own, that video does not seem predatory. Yet the same cannot be said of its enormous audience: It has 423,000 views and bears a weeks-old, time-coded comment left by the user “ХХХ AМATЕUR SЕХ VIDЕО - CLICK HERE” that takes viewers straight to the moment the children are first shown in their bathing suits.

Recommended alongside the pool clip was a video thumbnail showing a girl who looks to be no older than six lying on her back with her stomach exposed. In replies to a top-rated comment with a timecode leading to that moment in the video, which was preceded by two ads, viewers have written, “mmmmmmmm sexy!” and “She needs a tummy rub.” Yet another video recommended alongside the pool clip showed a young girl being touched by a man who looks to be a doctor. “she looks amazing and then some 😍,” reads one comment. Next up was a segment with two ads and the words “EXTREMELY FLEXIBLE GIRL” in the title, which showed a young girl in her underwear and an open-back top. It has 11.5 million views.

“Parents need to know they can’t trust YouTube to protect their kids.”

- Haley McNamara, National Center on Sexual Exploitation

Within a few clicks, YouTube’s algorithm had taken HuffPost through a progression of recommendations from videos showing adult nudity to those featuring children in potentially sexual contexts. (YouTube claimed this does not represent an average user’s experience and that it could not replicate HuffPost’s results; however, the same process repeated multiple times from new accounts yielded the same pattern of results.)
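The replication HuffPost describes is, in effect, a standard recommendation audit: start a fresh account at a seed video, repeatedly click among the top suggestions, and log where the chain leads. A bare-bones sketch of the procedure is below; `get_recommendations` is a hypothetical stand-in for however the suggestions are collected, which in practice means scraping the watch page or driving a browser session.

```python
import random

def get_recommendations(video_id: str) -> list[str]:
    """Hypothetical stand-in: return the recommended video IDs displayed
    alongside `video_id` for the current account. Implementing it requires
    scraping the watch page or automating a browser."""
    raise NotImplementedError

def recommendation_walk(seed_video_id: str, steps: int = 25) -> list[str]:
    """Follow recommendations for `steps` clicks, picking among the top
    few suggestions at each hop, and return the path of video IDs."""
    path = [seed_video_id]
    current = seed_video_id
    for _ in range(steps):
        recs = get_recommendations(current)
        if not recs:
            break
        current = random.choice(recs[:3])  # mimic a user clicking near the top
        path.append(current)
    return path
```

Running the same walk from several new accounts and the same seed shows whether the drift toward a particular category of content is systematic rather than a one-off.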

Many of the recommended clips with kids had ads and time-coded comments leading to compromising moments, such as a video of a small girl being baptized in the nude. One timecode skipped to her getting undressed; others jumped to scenes where her bottom was exposed. “Why did they strip her naked?” asked a commenter. “I’m not complaining.”

And although YouTube now restricts minors from live-streaming, it still directed HuffPost to live-streamed videos featuring minors, including one with a young girl doing the splits in revealing shorts. Comments had been disabled on that video, but YouTube still allowed a live chat bar, enabling viewers to leave messages in real time such as “She is turning into a beautifuuuuuuuuul little woman” and “omg I love your long sexy legs 😍.”

After guiding HuffPost to a clip that showed two little girls getting changed, YouTube’s algorithm even suggested a playlist containing similar content. Before long, HuffPost’s recommendation bar was chock-full of videos of kids splashing in bathing suits, stretching in spandex, tumbling around in their underwear and getting undressed. If you fall down a rabbit hole on YouTube — intentionally or otherwise — it’s hard to get out.

YouTube told HuffPost that it has “disabled comments and limited recommendations on hundreds of millions of videos containing minors in risky situations” and that it uses machine learning classifiers to identify violative content. It did not explain how or why so many other videos showing vulnerable and partially clothed children are still able to slip through the cracks, drawing in extraordinary viewership and predatory comments.

Once HuffPost reporters had clicked through two dozen YouTube-recommended videos, which started out with content featuring adults and then shifted to children, the YouTube homepage itself had adapted to match, recommending more of the same.

Playlists For Pedophiles

Algorithmic promotion isn’t the only YouTube function that’s helping to put masses of potentially exploitative videos in front of predators who wouldn’t otherwise see them.

YouTube’s playlist tool allows users to assemble public content from other people’s channels without obtaining their permission — or even alerting them. It’s a useful feature for people who might want to watch a bunch of sports highlights grouped together, instead of searching the platform for each individual video. But pedophiles are freely exploiting it by hunting for individually innocuous clips of little kids (often posted to parents’ channels), compiling them into fetish-themed playlists, then dispersing the lists.

Aggregating “innocent content of minors for the purposes of sexual gratification” is prohibited on YouTube, though the platform does not proactively search for violations; instead, it relies on users to report them. As a result, many predatory playlists exist without penalty.

Through simple searches, HuffPost easily found sexually exploitative playlists with titles such as “Little girls,” “wet clothed girls” and “Girls passed out.” A video featured in a playlist called “Sleeping girls” showed a toddler nodding off in her car seat. It was pulled from a channel that appears to belong to the girl’s father, who has just 19 subscribers. The clip has close to 30,000 views. (YouTube removed these playlists after HuffPost flagged them in an email to the company.)

It’s unlikely that a parent would even know if a video of their child has been pulled into another user’s playlist, because YouTube wouldn’t notify them. But if they do somehow become aware, there’s no easy way for them to get the video removed from that person’s playlist — aside from deleting the video altogether or setting it to private mode.

“Regardless of what they keep saying, they’re not optimizing for child safety — they’re optimizing for engagement and profit.”

- Hany Farid, digital forensics expert

The French vlogger behind the popular YouTube channel “Le Roi des Rats” called attention to this issue back in January 2017. In a viral video, he showed a stream of examples of playlists with titles “like on a pornographic website” containing videos featuring prepubescent girls, and revealed how YouTube would recommend such lists.

His exposé made headlines and elicited a tepid response from YouTube: It’s up to users to flag problematic content, the company told Le Parisien. The vlogger later warned of people trading links to sexually exploitative videos and playlists in dedicated off-site forums. But to date, the issue remains largely unaddressed.

YouTube told HuffPost that it is continuously looking into ways to better enforce its playlist aggregation policy at scale, but it declined to say what improvements, if any, it has made so far.

It’s likely that Allie’s content has also received a boost in traffic from playlist viewers who wouldn’t have found her videos otherwise. “Marco,” a person who sends the girl video requests, pulled one of her hypnosis skits into a playlist that consists of dozens of other clips featuring little girls and young women being “hypnotized.” In another one of Marco’s playlists, titled “Petite,” small children can be seen giggling and pretending to lull each other into trances.

“The aggregation of those videos into a playlist — having 100 of those videos in a row — shows a predatory, potentially pedophiliac intent,” said Haley McNamara, the vice president of advocacy and outreach for the National Center on Sexual Exploitation. “[YouTube] has failed to engage with the nuances of grooming and sexual exploitation.”

YouTube has made some progress in combating the sexual exploitation of children on its platform, McNamara said — such as its removal of nearly 1 million videos that violated its child safety policy in the final quarter of 2019 — but its approach has been too limited.

“They will talk about how they do take down many videos, which is great — and they report things to the National Center for Missing and Exploited Children when it’s explicit child sexual abuse materials — but there’s much more sexual abuse than just explicit pornographic images,” she added. “They need to recognize that aggregation can be used in a predatory way as well.”

Until then, she said, “Parents need to know they can’t trust YouTube to protect their kids.”

YouTube vowed last spring to ban comments on videos of minors. But its enforcement has been inconsistent — allowing predators to groom children and to write comments sexualizing their prepubescent bodies.
Illustration: HuffPost; Photos: YouTube

Grooming Minors

YouTube’s algorithm has fostered an environment that caters to — and potentially breeds — child predators. It propels them toward videos that star children and are sometimes even uploaded by the children themselves. This is especially problematic because the site is teeming with unsupervised kids, despite its policy requiring any user under the age of 13 to be accompanied by an adult. In some cases, predators can still have a direct line of communication with minors on YouTube without ever leaving the platform.

For Allie, the situation is cyclical: YouTube drives viewers to the fetish content she unwittingly produces, those viewers leave comments requesting more, and she is incentivized to keep filming.

“I noticed everyone really likes the hypnosis videos. I actually checked [my] first hypnosis video, and it has like 1.3K views! I was so happy to see that,” she said in a post last year. Her most popular video, in which she “hypnotizes” another little girl, has 6,200 views. Before she started filming fetishes on demand, her videos rarely drew even 60 views.

Last year, when Disney, Nestlé, AT&T and other major brands pulled ads from YouTube over concerns about predatory comments on videos of kids, YouTube suddenly jumped into action: It announced that it would disable comment sections on videos featuring minors. This measure is undoubtedly effective in reducing the risk of child grooming — when it’s actually enforced.

Allie is far from the only child starring in YouTube videos that still have comments enabled. In fact, YouTube algorithmically recommended to HuffPost an abundance of videos — old and new — that feature children and that still have comments turned on. In some cases, pedophiles have left comments that objectify and sexualize those children.

“No form of content that endangers minors is acceptable to us.”

- YouTube spokesperson

A video of very young girls walking a runway in bathing suits was littered with lewd comments discussing their prepubescent bodies — including some written as recently as last week. Users left timecodes to guide each other to specific girls, talked about getting a “boner,” masturbating, wanting the “sexy kids” to sit in their laps, and openly wondered how the children’s vaginas might smell. In one comment thread, pedophiles could be seen trading contact information for the purposes of sharing images of a girl’s body. In another, they traded links to other videos of underage girls in revealing attire.

That video has close to 210,000 views and was uploaded in October 2019 — months after the comments ban was announced. YouTube disabled the comments section only after HuffPost brought the video to its attention.

It’s remarkable that YouTube relies on journalists and other outside groups to alert it to such issues, said leading digital forensics expert Hany Farid, who teaches computer science at the University of California, Berkeley.

“YouTube is not doing what they promised they’d be doing,” he said. “Somehow, magically, [reporters] can find these problems, but YouTube can’t, despite the fact that it’s their service, and they have all the money and resources.”
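Farid’s point is borne out by how mechanically simple the pattern is. The predatory behavior reporters keep finding, time-coded comments steering viewers to compromising moments in videos of children, reduces to a searchable signal. The toy sketch below isolates that one signal; it assumes a separate classifier has already tagged videos that prominently feature minors, and it illustrates feasibility rather than anything YouTube is known to run.

```python
import re
from collections import Counter

# Matches timecodes such as "4:37" or "12:05" inside comment text.
TIMECODE = re.compile(r"\b\d{1,2}:\d{2}\b")

def flag_repeat_timecoders(comments, minor_video_ids, threshold=3):
    """Flag accounts that repeatedly leave time-coded comments on videos
    tagged as prominently featuring minors.

    comments:        iterable of (user_id, video_id, text) tuples
    minor_video_ids: set of video IDs tagged by a separate classifier
    threshold:       how many such comments before an account is flagged

    A production system would combine many more signals; this isolates one.
    """
    hits = Counter()
    for user_id, video_id, text in comments:
        if video_id in minor_video_ids and TIMECODE.search(text):
            hits[user_id] += 1
    return {user for user, count in hits.items() if count >= threshold}
```

Combined with signals like the sexualized language quoted above, heuristics of this kind are exactly the sort of detection Chaslot argued YouTube could train its systems to perform.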

Although YouTube no longer has a private messaging feature, there’s nothing stopping predators from writing comments on kids’ videos asking them to share their off-site contact information. Allie typically replies to such requests by providing a link to her public Instagram page and inviting people to message her there.

Her latest videos with comments are flooded with requests from Damien and others asking her to act out “fainting several times” and similar fetishes. In a comment left in January, someone told her to hypnotize another person. “Find someone, anyone: your friends, cousin, sister but it has to be a girl,” the person instructed, to which Allie replied: “I hope to make your videos soon!”

“No form of content that endangers minors is acceptable to us,” YouTube says.
HuffPost/Getty

A History of Broken Promises

Over the years, YouTube has claimed repeatedly that keeping children safe on its platform is a top priority — one it will strive for by any means necessary. But since vowing last spring to reduce the spread of videos of minors in “risky situations,” the company has actually continued to amplify such videos into virality and to specifically steer them toward users seeking sexual content and footage of partially clothed kids.

“This is a feature, it’s not a bug,” said Farid. “Regardless of what they keep saying, they’re not optimizing for child safety — they’re optimizing for engagement and profit.”

YouTube only takes action in response to critical press coverage, fleeing advertisers and threats of regulation, he said, adding that even then, the company’s executives “only ever make changes that are just enough to get everybody off their backs, but not so much to really address the problem.”

Like other tech platforms, YouTube is shielded from liability for users’ content thanks to Section 230 of the Communications Decency Act, a law passed in 1996. This essentially means it cannot be held legally accountable for the videos it hosts and promotes.

Congress has threatened to change that.

“We are dismayed at YouTube’s slow and inadequate response to repeated stories about child exploitation on its platform,” Sens. Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.) wrote in a letter to YouTube’s chief executive officer, Susan Wojcicki, after the Times published its article on the issue. “With YouTube asleep at the wheel, predators have taken the reins.”

The senators urged YouTube to temporarily halt all recommendations of videos primarily featuring minors until it could work out a lasting solution — which it declined to do. Sen. Josh Hawley (R-Mo.) took a more aggressive approach: He introduced a bill that would require YouTube to cut off such recommendations indefinitely or face criminal penalties.

And just this month, a coalition of senators introduced the EARN IT Act, a bipartisan bill that would force tech platforms to “earn” their immunity from liability by adhering to a set of “best practices” for preventing, reducing and responding to child exploitation.

YouTube has lobbied against such legislation, insisting that it’s capable of self-regulation.

“No form of content that endangers minors is acceptable to us. We have explicit policies that prohibit this and while the majority of videos featuring minors on YouTube are not violative and posted innocently, we take an extra cautious approach towards enforcement,” a spokesperson told HuffPost. “We’ve invested heavily in this area, and will continue to work with experts inside and outside of YouTube to provide minors and families the best protections possible.”

Meanwhile, Allie is seeking requests for her next video.
