Facebook said Tuesday it would ban all groups, pages and Instagram accounts that openly identify with QAnon, a dramatic escalation in the social media giant’s efforts to rein in the spread of the dangerous conspiracy theory.
The shift strengthens an August decision by the tech company to remove QAnon pages that celebrated violence, although that move was seen by many — including Facebook employees — as too lax to stop the spread of pernicious misinformation.
“Starting today, we will remove any Facebook Pages, Groups and Instagram accounts representing QAnon, even if they contain no violent content,” Facebook said in a statement. “We are starting to enforce this updated policy today and are removing content accordingly, but this work will take time and need to continue in the coming days and weeks.”
The company added: “We’ve been vigilant in enforcing our policy and studying its impact on the platform but we’ve seen several issues that led to today’s update.”
Facebook said that, although it has been active in removing QAnon content that celebrated violence, other material “tied to different forms of real world harm” would no longer be welcome on the platform. Facebook pointed to misinformation campaigns around the cause of devastating wildfires on the West Coast that “diverted attention of local officials from fighting the fires and protecting the public.”
The movement surrounding the QAnon conspiracy theory has also been a prime source of falsehoods about the coronavirus pandemic and measures meant to stop the spread of infections, such as mask-wearing.
“QAnon messaging changes very quickly and we see networks of supporters build an audience with one message and then quickly pivot to another,” Facebook said. “We aim to combat this more effectively with this update that strengthens and expands our enforcement against the conspiracy theory movement.”
Support for QAnon, a baseless, convoluted set of conspiracy theories that at its core paints President Donald Trump as a crusader against a network of Satan-worshipping Democrats who traffic children, has surged since the beginning of the COVID-19 pandemic. The belief has been touted by a growing number of Republican candidates for public office (several elected Republicans have refused to condemn the movement), and hundreds of protests by QAnon supporters erupted across the U.S. earlier this year.
An internal investigation by Facebook released in August found thousands of groups and pages affiliated with the conspiracy theory, with millions of members among them.
Citing the massive spread of QAnon support, some media watchdogs lambasted Tuesday’s move as too little, too late.
“Facebook helped the QAnon community grow exponentially — and refused to take appropriate action earlier this year when it would have mattered. Then, in a publicity stunt a few weeks ago, Facebook announced a crackdown would come, one that gave the QAnon community ample time to change their names, page descriptions and hashtags in order to flout Facebook’s enforcement,” Angelo Carusone, the president of the watchdog group Media Matters for America, said in a statement.
“In effect,” he said, “Facebook has made the problem worse: First by helping the QAnon community grow exponentially, then by helping them hide.”
The New York Times notes that QAnon has adapted quickly to Facebook’s efforts to clamp down on its messaging, renaming groups to evade bans and toning down messaging to be less overt. Reuters pointed to examples of pages referring to the movement as “cue” rather than Q to avoid moderation.
Facebook said Tuesday it would continue to update its policies to adapt to any efforts to bypass such bans.
“We expect renewed attempts to evade our detection, both in behavior and content shared on our platform, so we will continue to study the impact of our efforts and be ready to update our policy and enforcement as necessary,” it said.