TikTok, Facebook OK'd Ads With Misinformation About Voting: Report

Watchdog investigators submitted ads targeting multiple battleground states in the midterms such as Arizona, Colorado and Georgia.

Social media platforms Facebook and TikTok failed to enforce their own policies when presented with ads containing “blatant” misinformation about the 2022 midterm elections, a report found Friday.

Stemming from an investigation by watchdog Global Witness and New York University’s Cybersecurity for Democracy team, the new report describes researchers’ efforts to post 20 ads with misinformation to Facebook, TikTok and YouTube.

The ads were written in English and Spanish and targeted multiple battleground states in November’s midterms, such as Arizona, Colorado and Georgia.

They reportedly featured several inaccurate claims about extended voting days and the counting of primary votes in the midterms, among other statements. The groups said they deleted the ads once the platforms had decided whether to accept them.

TikTok approved those ads, the report said, but rejected one ad — falsely claiming voters must be vaccinated against COVID-19 — that Facebook accepted.

TikTok — owned by Chinese company ByteDance — fared the worst in the researchers’ investigation, the report said, as the platform approved 90% of ads with disinformation.

The platform’s reported failure comes three years after TikTok imposed a ban on political ads.

A TikTok spokesperson, in a statement to the groups, stated that the platform prohibits and removes election misinformation along with paid political advertising from the app.

“We value feedback from [nongovernmental organizations] ... academics, and other experts which helps us continually strengthen our processes and policies,” the spokesperson said.

Meta’s Facebook platform approved a “significant” number of the ads: 30% in English and 20% in Spanish during one test, and 20% in English along with 50% in Spanish during another, the report said.

A Meta spokesperson told the groups that their report was based on a very small sample size and doesn’t represent the political ads that the company reviews daily from around the world.

The spokesperson wrote that the platform’s ad review process goes through several layers of analysis and detection, as well.

“We invest significant resources to protect elections, from our industry-leading transparency efforts to our enforcement of strict protocols on ads about social issues, elections, or politics – and we will continue to do so,” the spokesperson said.

Global Witness pointed to its earlier investigations, which found that all of the election misinformation ads it tested in Brazil, and all of the hate speech ads it tested in Kenya, Myanmar and Ethiopia, sailed past Facebook’s ad approval process.

Google-owned YouTube, on the other hand, found and rejected each ad that the researchers submitted to the platform while also suspending a channel used to post ads, according to the report.

Google, in a statement to The Associated Press on Friday, wrote that the company has “developed extensive measures to tackle misinformation” on its platforms, including false claims about elections and voting.

“In 2021, we blocked or removed more than 3.4 billion ads for violating our policies, including 38 million for violating our misrepresentation policy,” Google wrote.

“We know how important it is to protect our users from this type of abuse – particularly ahead of major elections like those in the United States and Brazil.”

Damon McCoy, a co-director of the Cybersecurity for Democracy team, said in a press release that disinformation has had a major impact on elections and argued that YouTube’s performance in the research shows other platforms can do the same.

“All the platforms we studied should have gotten an ‘A’ on this assignment,” McCoy said.

Jon Lloyd, a senior adviser at Global Witness, said social media companies claim to recognize the problem of disinformation but added that the research shows they aren’t doing enough to curb it.

“Coming up with the tech and then washing their hands of the impact is just not responsible behaviour from these massive companies that are raking in the dollars,” Lloyd said in the press release.

“It is high time they got their houses in order and started properly resourcing the detection and prevention of disinformation, before it’s too late. Our democracy rests on their willingness to act.”
