The most popular link viewed on Facebook earlier this year was an article that suggested a Florida doctor may have died from a coronavirus vaccine, according to a new report from the social media giant amid growing concerns that Facebook is enabling the spread of COVID-19 misinformation.
The article, which amassed more than 53.8 million views between Jan. 1 and March 31, suggested that the doctor’s death was “possibly the nation’s first death linked to the vaccine.”
The article originally appeared in the South Florida Sun Sentinel in January, was republished by the Chicago Tribune, and was widely shared on Facebook. The story was later updated after a medical examiner ruled that the vaccine’s role in the 56-year-old man’s death was inconclusive.
The article’s massive popularity was revealed in a first-quarter “Content Transparency Report” that was shared publicly on Saturday by Facebook spokesperson Andy Stone, after The New York Times reported that the document had been quietly shelved over concerns that it would make the company look bad.
Earlier in the week, Facebook released a separate set of findings, “Widely Viewed Content Report: What People See on Facebook,” that covered content from April 1 to June 30. This report, which listed far more harmless content at the top of its popularity lists, had been labeled as a first-quarter report, according to the Times, but now says “Q2 2021” at the top.
Stone shared a link to an “internal copy” of the previously undisclosed “Content Transparency Report” on Twitter. He said this report’s findings hadn’t been released sooner because “there were key fixes to the system we wanted to make.”
“We’re guilty of cleaning up our house a bit before we invited company. We’ve been criticized for that; and again, that’s not unfair,” Stone tweeted on Saturday. He did not go into detail about the “fixes” that allegedly led to the report being withheld, though a source with Facebook told HuffPost that the issues involved “bugs in some of the queries.” The source did not immediately respond to a request for further details.
When asked about the apparent discrepancy in the description of the various reports as first-quarter and second-quarter, Stone told HuffPost on Sunday that the “Widely Viewed Content Report” released on Aug. 18 covers the second quarter of the year, but was the inaugural report in the series.
When Facebook revealed its “Widely Viewed Content Report” last week, the company said it was doing so because “transparency is an important part of everything we do at Facebook.”
“Our goal is to provide clarity around what people see in their Facebook News Feed, the different content types that appear in their Feed and the most-viewed domains, links, Pages and posts on the platform during the quarter,” the company said.
Skepticism remains about the accuracy of the public data, however, including among former Facebook employees.
“You can’t trust a report that is curated by a company and designed to combat a press narrative rather than real meaningful transparency,” Brian Boland, a former vice president of product marketing at Facebook, told the Times. “It’s up to regulators and government officials to bring us that transparency.”
Another former Facebook employee, speaking anonymously to The Washington Post due to a nondisparagement clause, likened the company’s report to “ExxonMobil releasing their own study on climate change.”
“It’s something to counter the independent research and media coverage that tells a different story,” the former employee said.
There have been growing concerns about the spread of vaccine misinformation on social media platforms, with President Joe Biden last month going so far as to say that social media companies are “killing people.”
“We’re dealing with a life-or-death issue here, and so everybody has a role to play in making sure there’s accurate information,” White House press secretary Jen Psaki said of Facebook in July. “They’re a private-sector company. They’re going to make decisions about additional steps they can take. It’s clear there are more that can be taken.”
Facebook has repeatedly pledged to take more action against anti-vaccine information, but critics and skeptics have said the company’s efforts don’t go far enough, and have called for independent access to user activity data.
“It’s defensible on the part of Facebook that they want to protect the data of an everyday person,” Rachel Moran, a researcher studying COVID-19 social media misinformation at the University of Washington, told Recode. “But in trying to understand actually how much misinformation is on Facebook, and how it’s being interacted with on a daily basis, we need to know more.”
U.S. Surgeon General Vivek Murthy, whose office has described health misinformation as a threat to the nation’s COVID-19 response, has also warned of a crucial lack of data from social media companies.
“The data gap means we are flying blind,” Murthy told Recode earlier this month. “We don’t know the extent of the problem. We don’t know what’s working to solve the problem. We don’t know who’s most impacted by the problem.”