(Reuters) — Alphabet Inc’s YouTube will lift its suspension on former U.S. President Donald Trump’s channel when it determines the risk of real-world violence has decreased, the company’s CEO, Susan Wojcicki, said on Thursday.
YouTube suspended Trump’s channel for violating policies against inciting violence after the assault on the U.S. Capitol by the former president’s supporters in January.
“The channel remains suspended due to the risk of incitement to violence,” said Wojcicki, speaking in an interview with the head of the Atlantic Council think tank. She said recent warnings by the Capitol police about a potential new attack on Thursday showed that an “elevated violence risk still remains.”
Wojcicki said that YouTube would determine the risk of violence by looking at signals such as government statements and warnings, increased law enforcement around the country and violent rhetoric on the platform itself.
In the aftermath of the Jan. 6 riot, social media companies including Twitter Inc and Facebook Inc banned Trump’s accounts on their platforms. Twitter’s ban is permanent, while Facebook has sent the case to its independent oversight board to decide whether Trump’s accounts should be unblocked.
“We will turn the account back on,” Wojcicki said. “But it will be when we see the reduced law enforcement in capitals in the U.S., if we don’t see different warnings coming out of government agencies; those would all be signals to us that it would be safe to turn the channel back on.”
Trump spokesman Jason Miller did not immediately respond to a request for comment.
Security around the Capitol was tight on Thursday after police warned that a militia group might try to attack it to mark a key date on the calendar of the baseless QAnon conspiracy theory.
Under YouTube’s policies, an account that receives three “strikes” within a 90-day period is terminated. Trump’s channel was suspended for a minimum of a week after receiving its first “strike.” YouTube also indefinitely disabled comments under videos on the channel.
Major social media companies have been under pressure to curb conspiracy theories, violent rhetoric and other abuses on their sites.
The Election Integrity Partnership, a coalition of misinformation researchers, said in a report this week that YouTube provided a space for video misinformation to be shared easily across multiple platforms and that this content functioned to provide “evidence” for misleading narratives.
(Reporting by Elizabeth Culliford in New York and Paresh Dave in Oakland, California. Editing by Matthew Lewis, Alexandra Hudson)