Two people were shot and killed in the resulting violence Tuesday, and a third was wounded.
On Wednesday, police arrested 17-year-old Kyle Rittenhouse in his hometown of Antioch, Illinois ― 20 miles from Kenosha ― and charged him with first-degree murder.
Rittenhouse was among hundreds of people who responded to a call to arms on Facebook, where earlier Tuesday a “Kenosha Guard” page had asked members to “take up arms and defend out [sic] City tonight from the evil thugs” who are “Nondoubt [sic] ... currently planning on the next part of the City to burn tonight!”
An associated event page called “Armed Citizens to Protect our Lives and Property” saw more than 300 people RSVP and 2,300 others interested in attending. “[L]aw enforcement is outnumbered and our Mayor has failed,” the event description read. “Take up arms and lets defend our CITY! Meet at civic center at 8PM.”
The invite was also picked up and distributed by the far-right conspiracy website InfoWars.
After the double homicide, Facebook decided the Kenosha Guard page and the event violated community standards and removed them.
But the social media platform had the opportunity to remove the page before the shooting. Two separate Facebook users told The Verge they’d reported the page for inciting violence before the event; in both instances, Facebook moderators determined the reported content didn’t violate the platform’s policies and took no action.
One user told the tech website she reported a specific comment about putting nails in the tires of protesters’ cars, noting the thread was full of people discussing what weapons to bring. Another reported the event itself. In both cases, the users were told the content didn’t violate Facebook policies.
Facebook did not respond to a request for comment from HuffPost.
Facebook’s inaction in this instance is a result of a broader failure at the company. An independent civil rights audit commissioned by Facebook warned in July that the platform is being “weaponized” by outside actors and that its decision-making process is “too reactive and piecemeal.”
Kenosha is far from an isolated incident. In 2019, a different militia group in Texas used Facebook to organize an armed event at a gathering of Muslims in Houston, plotting violence as it had the year before, when its members were outnumbered by counterprotesters.
Facebook reluctantly took the page down after the civil rights organization Muslim Advocates repeatedly highlighted the danger.
“We had to be lucky enough to find the event page, alert Facebook of its presence, watch the media cover it, escalate the matter to senior officials and then still wait 28 hours before they took it down,” Madihha Ahussain, Muslim Advocates’ special counsel for anti-Muslim bigotry, said in an emailed statement.
“That event, that group and that call to arms should never have been on Facebook to begin with, but the company puts the responsibility on victims to anticipate violence against them, search for it and take extraordinary measures to have it removed,” Ahussain said.
In May, a study by the Tech Transparency Project warned that hate groups are “thriving” on Facebook despite the company’s assurances that extremists aren’t allowed on the platform.
Facebook and Instagram finally cracked down on QAnon earlier this month after years of allowing the far-right, increasingly violent conspiracy theory movement to flourish on the platforms ― and therefore also in the real world.
A HuffPost investigation in 2018 found that Instagram, in particular, had long been quietly and aggressively expanding QAnon’s reach by algorithmically amplifying its content.
“What happened in Kenosha was preventable but Facebook chose to look the other way yet again,” Ahussain said.
“Facebook has blood on its hands.”