Facebook wants you, me and the federal government to believe that the selection and prominence of content on Facebook, particularly in its "trending stories" feature, has nothing to do with human judgment or choice.
This has long been Facebook's public posture, and it was its initial knee-jerk response this week to a news story, appearing in Gizmodo, that was sourced to anonymous, and apparently disaffected, contractors hired by Facebook to monitor the selection of news items identified on Facebook as "trending."
The story, claiming that the contractors spiked "conservative" news stories while overriding the company's algorithms to insert "liberal" items in the feed, soon got the attention of Republican members of Congress (who, plainly, have way too much time on their hands).
By Tuesday, Senator John Thune, Republican of South Dakota and chairman of the Senate Commerce Committee, had fired off a letter to Facebook demanding to know "if there's any level of subjectivity associated with" Facebook's trending news stories. Facebook responded, defensively, that it was "continuing to investigate whether" there were "any violations" of its neutrality policy and that it looked forward to addressing the Senator's questions.
Facebook is about to make a big mistake.
Rather than deny that human fingerprints are on the stories it publishes or links to, it should vigorously defend its right to include in its "trending" section any news stories it chooses. While disputing the critique that its story choices skew left, it should make loud and clear its claim of exclusive authority to decide whether, where, how and when to publish content.
As a first step, Facebook should tell Congress, as diplomatically as it wishes, to f**k off.
Congress has no more power to inquire into Facebook's decision-making in this area than to second-guess the editorial judgments of the New York Times or Medium.com. Congress can kick and scream all it wants (free speech being a two-way street), but it can't interrogate Facebook, much less dictate its own editorial preferences. The First Amendment vests all editorial authority in Facebook.
Facebook wants to be everybody's friend. Like other highly successful tech companies, it is, by instinct and culture, averse to acknowledging a human or subjective dimension to the choices behind the content that its users see. That stance has served Facebook well through its long run of exponential growth. But its success now threatens competitors, who have tapped their Washington lawyers and lobbyists to put Facebook in regulators' sights.
Facebook's nightmare scenario: To have the government treat it as a public utility to be circumscribed and defanged, rather than as a high-tech innovator to be feted and encouraged. Sound familiar? Think Microsoft in 1998, when the Justice Department, lobbied by the company's competitors in Silicon Valley, was persuaded of the urgent need to use the antitrust laws to deflate Microsoft's lofty ambitions.
Facebook faces a similar inflection point.
To avoid the death by a thousand regulatory lashes that was Microsoft's fate for nearly two decades, Facebook should embrace the protections of the First Amendment. Facebook is not an empty platform to be filled by others. It is a community, a "social network," that is rich in third-party content. And, crucially, that third-party content is curated by Facebook.
Whether the curatorial function is performed by computers or editors in a smoke-filled newsroom doesn't matter. The point is that Facebook is making editorial choices; ergo, Facebook enjoys the highest degree of First Amendment protection.
Peter Scheer, a lawyer and journalist, is executive director of the First Amendment Coalition. The views expressed here do not necessarily reflect the position of FAC's Board of Directors.