A Google representative told a congressional hearing on the rise of hate crimes and white nationalism Tuesday that hate “has no place on YouTube,” the video platform the company owns, shortly after YouTube users had spent more than an hour leaving vile messages on an official video feed of the event.
Alexandria Walden, a public policy counsel at Google focused on free expression and human rights, was among the witnesses called before the House Judiciary Committee. Shortly before she began speaking, the comments section of the committee’s official video stream of the hearing was shut off after it had been filled with anti-Semitic, Islamophobic, racist and otherwise hateful messages.
“We are deeply troubled by the increase of hate and extremism in the world,” Walden said. “We take these issues seriously and want to be part of the solution.”
After describing how difficult hate speech can be to identify because it so often depends on context, and noting that “borderline” hateful content can be pushed to the sidelines of the platform, Walden stated: “Hate speech and violent speech have no place on YouTube.”
What was happening on the live stream told a different story, however. Commenters were able to leave messages containing slurs against Jewish people and Muslims, while others typed “14,” a reference to a 14-word white nationalist slogan, and “88,” which signals “Heil Hitler” in white nationalist circles.
Committee chairman Jerry Nadler (D-N.Y.) pointed out that the comments had been turned off during the hearing and read a short selection of them. Rep. Louie Gohmert (R-Texas) cut in to ask whether the comments were somehow a “hoax.”
In November, the FBI reported that hate crimes had risen across the U.S. for the third consecutive year.
The committee’s hearing comes just weeks after a horrific mass shooting at two mosques in Christchurch, New Zealand, left 50 worshippers dead.
Many have suggested the massacre was a tragic consequence of the spread of white nationalism around the world, aided by social media. The Christchurch shooter wrote a lengthy manifesto suggesting he had been motivated by white nationalist ideals and live-streamed 17 minutes of the attack to Facebook through a body camera.
YouTube and Facebook were slammed for allowing the shooter’s gruesome video to proliferate across their platforms. Both companies were forced to play a game of whack-a-mole against users who uploaded copies of it en masse, raising questions about the tech giants’ ability to monitor and suppress hateful content.
Facebook Public Policy Director Neil Potts said at the hearing that white nationalist and hateful content was not permitted on the site “under any circumstances.” The social media giant just this week said it would ban white nationalist content.