Facebook Whistleblower Testifies About Company’s Harms Amid Leak Fallout

Frances Haugen’s claims are backed by tens of thousands of pages of internal research from the social networking giant.

Facebook whistleblower Frances Haugen testified before Congress on Tuesday, telling a Senate subcommittee that the social media giant is putting “profits before people.”

Haugen’s claims are supported by tens of thousands of pages of internal Facebook research, which she previously provided to The Wall Street Journal.

The paper used the documents to publish a series of highly damaging articles about Facebook, alleging that, among other things, the company knew Instagram was “toxic” for teenagers even as it pursued strategies to sign up younger and younger children.

In her opening statement, Haugen urged Congress to regulate Facebook. She compared the company to Big Tobacco, which for decades hid research about its products’ true harm.

“The documents I have provided to Congress prove that Facebook has repeatedly misled the public about what its own research reveals about the safety of children, the efficacy of its artificial intelligence systems, and its role in spreading divisive and extreme messages,” Haugen said. “I came forward because I believe that every human being deserves the dignity of the truth.”

Frances Haugen, a former Facebook employee, arrives on Tuesday to testify in front of a Senate subcommittee. “I came forward because I believe that every human being deserves the dignity of the truth,” she told lawmakers.
Tom Williams via Getty Images

Speaking in a calm, measured voice, Haugen took senators’ questions and made a compelling case for federal data privacy laws. “There’s no one currently holding Mark accountable,” she said, referring to Facebook CEO Mark Zuckerberg.

In particular, she highlighted Facebook’s engagement-based ranking systems, which determine who sees what on the platform. The algorithms give greater weight to content that elicits the strongest reactions, thereby boosting increasingly extreme posts.

Haugen also warned that Facebook relies too much on artificial intelligence to screen out harmful content, the vast majority of which evades detection.

Facebook misses 93% to 95% of the hate speech on the platform, Haugen said. She later added that the company’s anti-vaccine screening efforts, which also rely heavily on artificial intelligence systems, will never catch more than 10% to 20% of the offending content.

“I’ve spent most of my career working on systems like engagement-based ranking,” Haugen told the subcommittee on consumer protection, product safety and data security. “When I come to you and say these things, I’m basically damning 10 years of my own work.”

In a clear attempt to discredit Haugen’s testimony, Facebook spokesperson Andy Stone chimed in via Twitter shortly after the hearing began. The argument: Because Haugen hadn’t worked directly on some of the subjects she was addressing, she wasn’t qualified to comment.

Sen. Marsha Blackburn (R-Tenn.) read the tweet aloud during the hearing, eliciting stifled laughter from Haugen.

Facebook policy communications director Lena Pietsch doubled down on that strategy in a longer statement after the hearing concluded, dismissing Haugen as a junior-level employee.

Notably, Pietsch, like Stone, didn’t actually contest any of the facts Haugen presented.

Fight for the Future, a nonprofit digital advocacy group that monitors Facebook, told HuffPost that engagement-based ranking is harmful because it amplifies misinformation and engenders social division — but also because it suppresses other important content, like posts about climate change.

“The problem with Facebook’s products is not that they host user generated content. It’s that they use machine learning to show us the content that Facebook thinks we want to see in order to keep us on the platform longer and sell more ads,” Evan Greer, FFTF’s director, told HuffPost in an emailed statement. “What Facebook sells is not an online message board where people can express themselves, it’s surveillance-driven algorithmic manipulation that’s maximized for engagement.”
