In the era of fake news and “alternative facts,” social media giants are facing tremendous pressure to keep disinformation off their platforms. Google-owned YouTube vowed this year to crack down on fake news, but wild and sometimes dangerous conspiracy theories ― including many that its algorithm recommends to users ― still run rampant on the site.
When I created a new YouTube account and typed “What is QAnon?” into the search bar, the top result was an MSNBC panel breaking down the far-right conspiracy theory movement that sprang to life in late 2017. As the video ended, YouTube’s autoplay function started a new video, guiding me down a path of recommended content. This one featured a man in a MAGA hat suggesting the late former President George H.W. Bush was a Nazi and child trafficker who was secretly executed to protect his family’s legacy. Next was a rant about a “mysterious envelope” some guests at Bush’s funeral were seen holding. Then came a video that reported “with absolute certainty” that the U.S. killed “a fake Bin Laden.” The next one argued 9/11 was an inside job and claimed President Donald Trump’s tweets contain hidden messages showing he’s known all along. Sixth in the automated queue was a segment with infamous conspiracy theorist Alex Jones, posted to a channel that appears to exclusively host Jones’ videos (which supposedly are banned from YouTube).
YouTube’s algorithm “is extremely biased toward conspiracy theories. It promotes a huge amount of false information ― literally the crazier the better for the algorithm to recommend it,” said Guillaume Chaslot, who worked at Google for three years and helped design YouTube’s recommendation algorithm, but left after he found management to have “very little interest” in developing tools to recommend balanced content.
“It’s all about maximizing watch time,” Chaslot explained. “The more watch time you have, the more ads you can show the user,” which translates to more money for Google. YouTube also incentivizes content creators to keep people watching: Those with at least 1,000 subscribers and 4,000 watch-hours in a one-year period can earn money from ads.
The video giant’s dominance in the media world lends undue legitimacy to conspiracy theorists when it recommends their content, Micah Schaffer, a former YouTube policy analyst and community manager, said at a content moderation conference in November. More than half of adult users say the site is an important source for understanding what’s happening in the world, according to a new Pew Research study, and the number of users who turn to YouTube for news nearly doubled to 38 percent in 2018 from 2013. YouTube’s algorithm, which generates more than 70 percent of user traffic, takes fringe content “and pours gas on it,” Schaffer added.
YouTube was forced to reckon with its conspiracy theory problem after “Pizzagate” in 2016, when an armed man targeted a Washington pizzeria after watching a YouTube video that falsely claimed the eatery was headquarters for a child sex-trafficking ring. The company subsequently committed to hiring additional content review staff, bringing the total to more than 10,000, as well as to adding “authoritative” context to videos covering topics often prone to misinformation. For those topics, YouTube will link to “trusted sources like Wikipedia,” it announced in July, apparently without giving advance notice to Wikipedia. (None of the conspiracy theory videos I watched had such links.)
“While we’ve made good progress, we are still working to improve the news and information-seeking experience on YouTube,” a spokesperson told HuffPost.
To highlight this continued struggle with the promotion and propagation of fake news, HuffPost found five of the wildest conspiracy theories YouTube’s algorithm wanted you to watch in 2018.
Mass Shooting Survivors Are Paid ‘Crisis Actors’
One week after a gunman opened fire in a south Florida high school in February, killing 17 students and staff members, a video falsely accusing a surviving student of being a paid “crisis actor” ― someone trained to portray a disaster victim ― had reached the top spot on YouTube’s trending page, with more than 200,000 views.
“This video should never have appeared in Trending,” a YouTube spokesperson told HuffPost at the time. The company said its system “misclassified” the video because it “contained footage from an authoritative news source.”
Disturbed by a maelstrom of disinformation surrounding the massacre, social media researcher Jonathan Albright traced YouTube’s recommendations from 256 videos on the subject of “crisis actors,” which led him to almost 9,000 conspiracy theory-themed videos with nearly 4 billion combined views, he explained in a Medium post. “The experiences of the least fortunate among us — including tragedy survivors, children and their families — are being used to algorithmically profit from the most impressionable,” he said.
NASA Is Lying To You: The Earth Is Flat
Flat Earth videos have been popular on YouTube for some time now, and this year, the movement held an international conference in Denver. The Daily Beast’s Kelly Weill attended and published a deep dive into YouTube’s echo-chamber effect on users who became Flat Earthers after watching recommended videos on the topic. One man told Weill that autoplay Flat Earth videos “woke him up” to the movement centered around the belief that the earth is, well, flat.
“From the point of view of the algorithm, Flat Earth conspiracy theories are a gold mine,” Chaslot said.
Some theories circulating in YouTube-recommended videos claim NASA has been photoshopping images of a globe-shaped earth to deceive the world. Another, with more than 650,000 views, says there’s an enormous ice wall at the edge of the earth, as witnessed by a guy named David.
Government Laser Beams Started The California Wildfires
California’s Camp Fire was the deadliest and most destructive wildfire in the state’s history, killing at least 86 people in November and December. As the crisis unfolded, YouTube was also ablaze with conspiracy theories suggesting the government used laser technology known as directed-energy weapons to ignite the fires.
When YouTube users started typing “California fire” into the platform’s search bar, the top suggestions were “conspiracy 2018,” “agenda 21,” and “laser beam,” Motherboard’s Caroline Haskins reported in November. The search term “California wildfire” also led YouTube to suggest “lasers,” “directed energy weapon” and the corresponding acronym “DEW.”
YouTube in the past week was still recommending multiple conspiracy theory videos about California fires, according to AlgoTransparency, an algorithm-watchdog website Chaslot created to track YouTube’s video recommendations from more than 1,000 channels. One warned “a massive avalanche event of awakening” is “coming soon”; the video, which runs less than 38 minutes, was interrupted three times by mid-video ads. Another, which also suggests the government has used “silent, invisible” laser beams to ignite fires in California in the past, has 1.3 million views.
QAnon: Uncovering The Liberal Elite’s Deep-State Pedophiles
YouTube users searching for videos on Tom Hanks and Steven Spielberg over the summer were directed at the time to QAnon conspiracy theories calling the Hollywood stars pedophiles, including multiple videos with hundreds of thousands of views, as NBC News reporter Ben Collins pointed out in July. The QAnon movement centers on an anonymous poster known as “Q,” who claims to have high-level government clearance and drops coded messages online that often hint at cover-ups, scandals and corruption among the “liberal elite.” Q’s clues have inspired all kinds of madness, though the main theory is that Trump and special counsel Robert Mueller are in cahoots to expose celebrity child molesters. YouTube, Reddit and other social giants have helped the fringe community gain traction and go mainstream. QAnon followers made appearances at several Trump rallies in 2018.
QAnon followers also spread rumors that Cemex, a Mexican cement company, owned a human trafficking site in Arizona. CNN informed YouTube in August that its top autocomplete result for the search term “Cemex” was “Cemex child trafficking,” prompting the video giant to derank those videos. Other wildly popular YouTube videos have spread conspiracy theories about Q’s identity, with some arguing John F. Kennedy Jr. faked his own death to become the group’s leader.
Hillary Clinton Stars In A Dark-Web ‘Snuff Film’
As insane and horrifying conspiracy theories go, “Frazzledrip” likely takes the top prize this year. The theory claims an “extreme snuff film” of Hillary Clinton and her longtime aide Huma Abedin raping and mutilating a young girl is circulating on the dark web. Code-named Frazzledrip, the film is rumored to show Clinton and Abedin cutting off the girl’s face and wearing it as a mask, and preparing to drink her blood in a satanic ritual. This is, of course, demonstrably false, but that hasn’t stopped people from posting their own twisted takes on YouTube ― or YouTube from promoting them.