One year ago, Facebook CEO Mark Zuckerberg stood before an audience at a Ritz-Carlton hotel in California and dismissed the idea that misinformation spread on Facebook could have influenced the outcome of the exhausting, divisive election the U.S. had just experienced.
“Personally, I think the idea that fake news on Facebook ― of which it’s a small amount of content ― influenced the election in any way is a pretty crazy idea,” Zuckerberg said at the time.
What a difference a year makes.
Zuckerberg has since said he regretted his earlier statement, explaining in a Facebook post in September, “This is too important an issue to be dismissive.”
And we’ve learned that the misinformation in question was more than just misguided white lies: It was actively planted on Facebook (and on Twitter, Google, Instagram, YouTube and Reddit ― hell, even on Pokémon Go) by Russia-linked actors looking to sow division.
More troubling still, the people behind those lies knew exactly what buttons to push to have the greatest effect. Odd as the ads were, some had shockingly high response rates ― as much as 24 percent in some cases, according to data released by the House Intelligence Committee.
In the swing states of Michigan and Wisconsin, for instance, we’ve learned that Russia paid Facebook to display highly targeted ads to precise demographics in key areas of each state. The ads didn’t endorse one candidate over the other. Instead, they pushed divisive, issue-based messages intended to outrage Americans and turn them against each other. A Facebook analysis in September specifically mentioned ads that touched on gay rights, race issues, immigration and gun rights.
Donald Trump went on to win Wisconsin (which, prior to 2016, had voted for a Democrat in every presidential election since 1984) by just 22,748 votes. And at 10,704 votes, his margin of victory in Michigan was closer still ― indeed, the closest presidential race in Michigan’s history.
Ultimately, Russia paid Facebook (in rubles, no less) $100,000 to run 3,000 targeted ads around the country, buying an audience of around 10 million people ― a population roughly equivalent to that of the entire state of North Carolina.
We’ve also learned that Russia’s digital reach extended far beyond paid ads alone. On top of the 3,000 ads, an additional 80,000 Facebook posts, tied to 120 actively managed Facebook pages, have so far been attributed to the Russian government.
Those posts were served directly to around 29 million Americans, who helped spread them to at least 126 million people overall. An additional 20 million encountered the content on Instagram.
Thanks to Facebook alone, more than half of all eligible voters in the U.S. were exposed to Russian propaganda between January 2015 and August 2017. So far, Facebook has resisted calls to notify those who were exposed.
To be fair, it’s unclear just how much influence Russia’s content had, especially in light of the Trump campaign’s own substantial digital advertising push. But the sophisticated targeting should have everyone concerned ― not least Facebook, whose services made it possible.
Yet despite these revelations and Facebook’s repeated pledges to do a better job of self-governing, the internet behemoth continues to downplay its frightening power both at home and abroad. (For example, human rights groups have denounced Facebook for failing to take down posts that promote violence against Rohingya Muslims, a persecuted minority in Myanmar.)
This is nothing new for Zuckerberg and company. In 2011, Facebook was outraged when the Federal Election Commission considered implementing rules that would have helped avoid the very scenario that played out in last year’s election. At the time, Facebook lawyer Colin Stretch warned the government not to “stand in the way of innovation.”
As recently as this month, Facebook dispatched Stretch to face scathing questioning from lawmakers who are once again pursuing regulations, this time with a bit more urgency. The legislation in question, known as the “Honest Ads Act,” would force online advertisers like Facebook to adhere to the same disclosure rules for political ads as traditional media companies.
Stretch (and his counterparts at Twitter and Google) conceded that they could have done more to weed out the deliberate falsehoods peddled on their platforms, and they acknowledged that the foreign interference represented “an existential threat to our democracy.” But once again, they stopped short of endorsing government regulation.