Study: White Supremacist Groups Are ‘Thriving’ On Facebook, Despite Extremist Ban

With many Americans vulnerable to fascist ideologies during the pandemic, the study warns, Facebook could be fertile ground for recruitment.

A new study has found that white supremacist groups are “thriving” on Facebook, despite repeated assurances from the company that it doesn’t allow extremists on its platform.

The watchdog group Tech Transparency Project released a study Thursday that found more than 100 white supremacist groups had a presence on Facebook.

Project researchers identified 221 white supremacist groups — using information collected by the Southern Poverty Law Center and the Anti-Defamation League, two of America’s most prominent anti-hate organizations — and searched for those groups on Facebook.

About half of those groups were present on the platform, the study said.

Of the 113 white supremacist groups the project found on Facebook, 36% had pages or groups created by active users. The remaining 64% had a page auto-generated by Facebook itself.

“With millions of people now quarantining at home and vulnerable to ideologies that seek to exploit people’s fears and resentments about COVID-19, Facebook’s failure to remove white supremacist groups could give these organizations fertile new ground to attract followers,” TTP’s study said.

A screenshot of a white supremacist group's Facebook page from the study by the Tech Transparency Project.
Tech Transparency Project

The study comes after years of rising white nationalism in the U.S. and heightened scrutiny over social media companies’ role in providing online spaces for hate groups to spread their propaganda, to organize and to recruit.

There was a Facebook event page, after all, for the deadly 2017 white supremacist rally in Charlottesville, Virginia. And a white supremacist gunman live-streamed on Facebook as he massacred 51 people at two New Zealand mosques early last year. (The company later said it had removed 1.5 million videos of the mass shooting.)

Facebook has since taken a more aggressive stance in banning white supremacist activity but has been criticized for what extremism researchers describe as a whack-a-mole approach to hate on the platform. The company often takes down content only after inquiries from journalists.

After HuffPost emailed a Facebook spokesperson about TTP’s report this week, project researchers noticed the company had removed pages for 55 white supremacist groups identified in its report.

“We are making progress keeping this activity off our platform and are reviewing content in this report,” a Facebook spokesperson said in a statement to HuffPost, adding that the company has “banned over 250 white supremacist organizations and removed 4.7 million pieces of content tied to organized hate globally in the first quarter of 2020, over 96% of which we found before someone reported it.”

Facebook has a team of 350 people working to develop and enforce its Dangerous Individuals and Organizations policy, under which hate and terror groups are banned, the spokesperson added.

A screenshot by the Tech Transparency Project shows how Facebook auto-generates pages for white supremacist groups.
Tech Transparency Project

Of the 113 white supremacist organizations that the project found on Facebook, 64% had pages that had been created by Facebook itself. Facebook auto-generates such a page when a user lists an employer in their profile that doesn’t have a corresponding business page. If one or more users list the Universal Aryan Brotherhood Movement as an employer, for example, Facebook creates a page for the neo-Nazi group.

Auto-generated pages don’t have administrators who can use the pages to communicate with followers, Facebook’s spokesperson said.

An anonymous whistleblower filed a complaint with the Securities and Exchange Commission about Facebook auto-generating pages for hate and terror groups. Sometimes such pages earn thousands of likes, offering hate groups a ready-made recruiting pool, the whistleblower argued.

More than 250 people liked the auto-generated page for the Council on Conservative Citizens — whose white supremacist propaganda inspired Dylann Roof to massacre nine black parishioners at a Charleston, South Carolina, church in 2015.

That page, which included a link to the group’s website, appeared to have been removed by Facebook on Friday.

A screenshot showing another auto-generated white supremacist page.
HuffPost

Facebook also appeared Friday to have removed auto-generated pages for the white nationalist website VDare and the violent neo-Nazi group Northwest Front, both of which were mentioned in TTP’s report.

A user-created page for the neo-Nazi group Aryan Nations, which had been active for just over 10 years, was also removed on Friday.

Still, many white supremacist organizations identified in TTP’s report remain on Facebook. One user-created page for Arktos Media — the fascist European publishing house with close ties to American white supremacist Richard Spencer — is still active. More than 42,000 people have liked the page.

A page for the hate group Fight White Genocide, which has more than 1,000 likes, is still active. Three years-old pages and groups called Right Wing Death Squad also remain online.

Facebook has an extensive process — including consultations with academics and various organizations — for developing criteria to determine which groups are considered hate groups, the company’s spokesperson told HuffPost.

But the company does not follow any one organization’s list of hate groups and so may not always agree with hate group identifications made by the Southern Poverty Law Center, the Anti-Defamation League or TTP, the spokesperson added.

A screenshot shows how users can be directed to visit other hate pages.
Tech Transparency Project

TTP’s report raises an alarm about the function of so-called related pages on Facebook, which it said can push users into a white supremacist echo chamber.

“TTP’s investigation found that among the 113 hate groups that had a Facebook presence, 77 of them had Pages that displayed Related Pages, often pointing people to other extremist or right-wing content,” the report said. “In some cases, the Related Pages directed users to additional SPLC- or ADL-designated hate groups.”

The user-created Facebook page for American Freedom Union, for example, included a link to a page for the book “White Identity: Racial Consciousness in the 21st Century,” by Jared Taylor, another prominent U.S.-based white supremacist.

Redirects to Life After Hate surfaced in only 6% of searches for white supremacist content.
Tech Transparency Project

In 2019, Facebook announced that it would redirect users who searched white supremacist terms to a page for a group that works to rehabilitate extremists.

“Searches for terms associated with white supremacy will surface a link to Life After Hate’s page, where people can find support in the form of education, interventions, academic research and outreach,” the company said at the time.

But TTP’s study found that the Life After Hate link surfaced in only 6% of its 221 searches for white supremacist groups.

“One factor may be that not all of the hate groups listed by SPLC and ADL make their ideologies obvious in their names,” the TTP report conceded, a reference to how fascist groups sometimes use coded language to hide their true motives.

But the project said its researchers were rarely redirected to Life After Hate even when searching obvious white supremacist terms.

“Of 25 groups with ‘Ku Klux Klan’ in their official name, only one triggered the link to anti-hate resources,” the report found.
