In a tweetstorm posted on Friday, Guillaume Chaslot discussed YouTube’s decision to stop recommending content about certain conspiracies, including flat-earth claims.
The news was initially announced in a January blog post. YouTube said it would be “taking a closer look at how we can reduce the spread of content that comes close to ― but doesn’t quite cross the line of ― violating our Community Guidelines.”
“To that end, we’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways,” the post said, noting that the platform would avoid popularizing videos containing erroneous medical advice or false claims about the 9/11 terror attacks.
According to Chaslot, the change involved adjusting how YouTube’s recommendation AI behaves, ensuring that it doesn’t keep suggesting similar content after a user has watched a conspiracy-related clip.
“It’s only the beginning of a more humane technology,” he tweeted. “Technology that empowers all of us, instead of deceiving the most vulnerable.”
However, this wasn’t the first sign that YouTube was rethinking the content it hosts. Perhaps the most notable example came last year, when the platform banned Alex Jones, a far-right conspiracy theorist and radio host who infamously and falsely claimed that the Sandy Hook massacre was fake.