When Vice News reported last month that Twitter had been “shadowbanning” conservative voices, Republicans seized on the story as more evidence of their favorite victimhood narrative: that liberal elites in Silicon Valley are secretly manipulating social media platforms to silence conservatives. Many technology reporters attacked the story, both for factual blunders (what Twitter was doing wasn’t shadowbanning; it was based on user behavior, not content; and it affected prominent leftists as well) and for its framing, which The Verge, New York magazine, The New Republic and many others derided as a gift to bad-faith provocateurs.
These critiques referred to a 2016 Gizmodo story that now stands for many as a watershed moment in the right’s campaign to bully tech platforms into submission. That story, by reporter Michael Nuñez, was headlined “Former Facebook Workers: We Routinely Suppressed Conservative News.” In it were two revelations, a smaller one contained within a larger one. The larger one was that Facebook’s now-defunct Trending news module was precisely the opposite of the neutral, algorithmically generated reflection of user interest that Facebook (and its gullible allies in the tech press) had always claimed it to be. The piece relied on the testimony of Facebook “news curators,” who told Nuñez that the human editorial influence brought to bear on the Trending news menu tended to result in stories that were circulating in the right-wing echo chamber being spiked. This was the smaller revelation, and it handed conservative trolls a bloody shirt to wave.
Within days, disingenuous congressional inquisitors had announced an investigation, and Facebook eventually paraded Tucker Carlson, Glenn Beck, Dana Perino and a host of other right-wing commentators into a meeting with Mark Zuckerberg. Facebook was clearly spooked by the attention, and the consensus today among tech journalists is that by activating conservatives, the Nuñez story set the tone for Facebook’s and Twitter’s refusals for years to excise racist and fascist trolls from their ecosystems.
“Just because [the right-wing complaint] is made in bad faith doesn’t mean it can’t be effective,” The Verge’s Casey Newton wrote. “That was one of the lessons of Facebook’s last two years, in which a Gizmodo story that argued Facebook was ‘suppressing conservative news’ led to the company eliminating human editors from its platform. That helped pave the way for the spread of misinformation on the platform that continues today.” Wired similarly argued that the Trending controversy made Facebook “wary of doing anything that might look like stifling conservative news,” setting the stage for a pre-election “summer of deeply partisan rancor and calumny [that] began with Facebook eager to stay out of the fray.”
Here’s the thing: The Gizmodo story was good. It was solidly, accurately reported by Nuñez, who spent months on it as part of a long-term investigation. It was expertly guided by Gizmodo’s then-editor-in-chief, Katie Drummond, and her deputy Alex Dickinson. I should know, because I was their boss. At the time Nuñez was reporting on Facebook, I was the executive editor of Gawker Media, a network of sites that included Gizmodo and whose namesake blog was assassinated by Facebook board member Peter Thiel because he didn’t like its reporting on him and his friends. While I’m glad that his story is widely seen as the first to pull the thread that led to Facebook’s current unwinding, I’m here to defend it against the emerging conventional wisdom that it abetted the right in bringing the platform to heel.
First off, here’s what the story says: A former Facebook news curator, a self-described conservative, told Nuñez (anonymously) that curators would routinely “blacklist” stories that were actually organically trending on the social network but were generated by conservative news sources: “I’d come on shift and I’d discover that CPAC or Mitt Romney or Glenn Beck or popular conservative topics wouldn’t be trending because either the curator didn’t recognize the news topic or it was like they had a bias against Ted Cruz.”
Nuñez pressed the curator for evidence of the claim, and it turns out that the source kept contemporaneous notes of instances when a “conservative” topic was trending on Facebook but had been spiked. Nuñez reviewed them: “Among the deep-sixed or suppressed topics on the list: former IRS official Lois Lerner, who was accused by Republicans of inappropriately scrutinizing conservative groups; Wisconsin Gov. Scott Walker; popular conservative news aggregator the Drudge Report; Chris Kyle, the former Navy SEAL who was murdered in 2013; and former Fox News contributor Steven Crowder.”
Nuñez reached out to other Facebook news curators to try to corroborate what his source, and the contemporaneous notes, were telling him. A different curator confirmed (anonymously) what the first source told him: “It was absolutely bias. We were doing it subjectively. It just depends on who the curator is and what time of day it is.” This second source did not characterize himself or herself to Nuñez as a conservative, but the source described a dynamic wherein the normal, human, subjective judgments made by curators influenced the composition of the Trending module in such a manner as to reduce the visibility of stories of interest to conservatives even when those stories were actually “trending” on Facebook.
This is the rub: Facebook had long claimed that the Trending module was a simple one-to-one reflection of what the Facebook community — in all its glory and depravity — was talking about. “Facebook shows you things in your Trending line-up the same way it shows you things in your News Feed: Algorithms,” reported ReCode in 2014. “Once a topic is identified as trending, it’s approved by an actual human being, who also writes a short description for the story. These people don’t get to pick what Facebook adds to the trending section. That’s done automatically by the algorithm.”
This wasn’t an esoteric distinction. It was core to Facebook’s identity as a neutral platform upon which all the peoples of the world dance together. Facebook couldn’t admit in public that its “community” was logging into Facebook to chew over Fox News memes and racist Daily Caller stories. If it actually showed users what they were really talking about, the Trending news box would be a toilet. So it secretly hired humans to make editorial judgments (and treated them like shit, as Nuñez reported in a different story). That’s called publishing, and Facebook doesn’t want to be a publisher, in part because it doesn’t want to be held responsible for its judgments.
Of course, there’s nothing wrong with using human judgment to dismiss false or unreliable or uninteresting news. That’s what news organizations do every day. And if Facebook’s curators found stories circulating in the right-wing media ecosystem to be false or unreliable or boring, well, that would put them in a league with the vast majority of mainstream news outlets, which have long been accused of “liberal bias” for their general refusal to play ball with the lunatic fringe.
It’s this sort of nuance that critics found missing from Nuñez’s piece. “Vice’s story feels an awful lot like one reported two years ago by Gizmodo, which claimed that Facebook was ‘suppressing conservatives,’” wrote New York’s Brian Feldman. “In reality, Facebook’s editors were making the editorial judgment call that manufactured, misleading, and hyperpartisan stories from conservative outlets — such as ones about Benghazi, in 2016 — were less relevant to Facebook’s users than breaking stories from mainstream outlets.”
But anyone who read to the third paragraph of Nuñez’s story — what editors call the “nut graf,” which explains the general point of the article — would find this:
In other words, Facebook’s news section operates like a traditional newsroom, reflecting the biases of its workers and the institutional imperatives of the corporation. Imposing human editorial values onto the lists of topics an algorithm spits out is by no means a bad thing—but it is in stark contrast to the company’s claims that the trending module simply lists “topics that have recently become popular on Facebook.”
The story’s conclusion hammered the same point:
Facebook’s efforts to play the news game reveal the company to be much like the news outlets it is rapidly driving toward irrelevancy: a select group of professionals with vaguely center-left sensibilities. It just happens to be one that poses as a neutral reflection of the vox populi, has the power to influence what billions of users see, and openly discusses whether it should use that power to influence presidential elections.
I suspect those passages slipped Feldman’s mind because his recollection of the story is dominated by the headline: “Former Facebook Workers: We Routinely Suppressed Conservative News.” It was a headline written for Matt Drudge. It was engineered for direct injection into the veins of the right-wing grievance-mongers, and I knew full well when I wrote it — Nuñez, Drummond, Dickinson, and I all puzzled over it, but I take responsibility — that millions would see it and come to believe its most aggressive interpretation without comprehending the actual reporting. Tapping into the right-wing audience can be a huge traffic boon. Just a few weeks before the story was published, Gawker Media was hit with a $140 million verdict in Hulk Hogan’s Thiel-backed lawsuit; weeks after it was published, it declared bankruptcy. The newsroom needed a win. I didn’t want to stake the story on a more sober headline — something like “Former Facebook Workers Say They Used Editorial Judgment” — if it meant forgoing the traffic and impact that a Drudge hit can bring.
In the summer of 2016, Facebook was a black box. The only understanding about what happened inside was delivered to news outlets by Facebook’s PR shop, which cut off the flow of information to any reporter who deviated materially from the company line. Critical or adversarial coverage was rare, and Nuñez’s success in getting former news curators to speak out, even anonymously, was unprecedented on the beat. As such, we had a circumscribed perspective of the company and its ecosystem. We had a camera obscura’s view of Facebook: through a pinhole, inverted.
So we saw hypocrisy and falsehood in the story Facebook told about its news curation module because it hired a group of journalists to do what journalists do — make judgments, which were informed by their professional sensibilities. But we missed what was happening in the far more consequential News Feed, where an entirely different set of judgments was brought to bear on an incomprehensibly more vast terrain of news and opinions. Those judgments were: Open it up to the highest bidder and close your eyes.
For that system to work the way it was designed to, Facebook had to maintain a veneer of neutrality — i.e., non-complicity in the uses to which bad actors put Facebook’s engine — which is why you saw Zuckerberg recently trying to thread a needle on Holocaust denial. He wants to profit from its popularity on his platform without feeling bad about it.
The news curation story struck such a nerve both for the company and for its users because it put the lie to that posture of non-intervention. If people realized that Facebook did intervene in what stories it felt were worthy of a spot in the Trending module, by using editors, then perhaps they might begin to interrogate the quieter interventions, too, the ones happening by way of the News Feed’s algorithm, which was privileging divisive, hateful and propagandistic content. The Trending module was public, and as such, it needed to be handcrafted in order to reflect the values that the company wanted to project. The News Feed was a private flow, where Facebook’s actual values could be found in the sewage. Hiring editors to moderate that sewage in the Trending module was the closest Facebook came in this whole mess to a noble act.
That’s the irony: This small, self-interested gesture at information hygiene alone rendered Facebook vulnerable to the right-wing outrage cycle. Not because Facebook sought to stifle conservative speech — it is by far the most extensive publisher and amplifier of Trumpist propaganda on the planet — but because the Fox News- and Breitbart-driven grievance brigades have been so successful that the mere imposition of value-based editorial standards is in itself an act of, ahem, suppression. Indeed, so successful that that vulnerability — the way that conservatives would inevitably seize on it, had already seized on it, within the organization — was part of what made the whole thing newsworthy to begin with. And so successful that a left-of-center tech site, in packaging its report, couldn’t resist trying to have it both ways by characterizing it as suppression in the headline and as editing in the story.
Nuñez’s story was part of an ad hoc series — a campaign of posts about Facebook that began small, gathered sources and culminated in the coup de grâce. One of the early posts in that campaign was a tidbit he picked up from a company all-hands with Zuckerberg and Sheryl Sandberg: an internal poll of employee questions for the pair had surfaced this one among the most popular: “What responsibility does Facebook have to help prevent President Trump in 2017?”
We played it as a potentially dystopian dispatch from within a company that had acquired the power — and perhaps the inclination — to throw a presidential election.
Naturally, we prioritized democratic principles over the inarguable benefit to humanity were Facebook to have actually intervened against Trump. What would happen, we asked, if Facebook allowed the political values of its employees and management to influence what news it shows us?

Asking that stupid question is my biggest regret from Gizmodo’s Facebook coverage. We never asked what would happen if it allowed greed to determine what news it shows us. Now we know.
John Cook was the final executive editor of Gawker Media.