Madrigal's Quilt of Horrors, or, How Did Facebook Fail Democracy?

If it weren't a social network, I'd take Facebook for a sociopath. Creeping in the shadows. Taking more than it gives. Beating down our self-esteem. And what happened during the 2016 election cycle — unbelievable.

The media was deep in its own hubris; we know that now. But here's a word of advice from someone who knows next to nothing about politics: Dear Zuck and crew, lay off the emojis, sparkles, and pop-ups for a minute, tighten your process, and tell us what's up. Seriously. We're trying to figure out what the hell just happened to democracy.

Last week, Alexis Madrigal dropped a bomb in The Atlantic so big that we're looking at the social network we've given our lives to for a decade and wondering: can we ever trust Facebook again? As Madrigal tweeted, "What Facebook Did to American Democracy" is "many threads of a huge story" woven together. Now, it's time to unravel this quilt of horrors.

Facebook can swing an election. As of June, it has 2 billion monthly users and more than 20,000 employees. Last month, it handed over 3,000 Russian-linked ads to Congress for an investigation of Russian interference in the 2016 presidential race. The social network claims political neutrality, maintaining that "civic participation is a core value of democracy." One spokesperson says, "We have not and will not use our products in a way that attempts to influence how people vote." Straightforward enough.

But what's scary about this tone-deaf statement is how Facebook seems to know very little about how digital information works. Or maybe it knows too much. What’s clear post-2016 is that we need a Facebook watchdog. Or a dozen. Maybe 50, or more — one for each state and two for each party. Facebook played a part in hacking the U.S. election. We've been told it won't happen again, but where does that leave the voters?

In effect, the social network fails miserably at "bringing the world closer together." A 2013 report found Facebook users half as likely as others to share their political views in face-to-face settings. This "spiral of silence" has mutated the social network into an ideological echo chamber: because people tend to favor information that confirms what they already believe, they grow less likely to connect with those who don't share their views. Given the predictability of online behavior, Facebook is left with a majority of members who are disconnected from the world around them. D+ at best.

Facebook likes to manipulate, and it creeps in the dark — a very bad combo. The social network has engaged in a "hostile takeover" of the news-media ecosystem, anonymously pumping sites with traffic and working up a kind of maladaptive dependency, with editors as the addicts coming back for more. In 2012, the social network ran its "emotional contagion" study on nearly 700,000 users without their knowledge, an Orwellian attempt to manipulate their moods by selectively filtering the positive or negative posts appearing in their feeds to make them feel happy or sad. Facebook, depression is actually a thing. Back off.

Facebook buys information from data brokers. Facebook denies it (of course), but we can see the glitch in The Matrix. Just ask Kashmir Hill, a Special Projects Desk reporter who had a long-lost family member suggested to her by Facebook's seemingly benign but creepy-as-hell-when-you-think-about-it People You May Know feature.

After confirming that the suggestion couldn't have happened organically (none of the people who could have connected them were Facebook users), she asked how the company made the match. Facebook wouldn't tell her. Hill writes, "Facebook gets people to hand over information about themselves all the time; by what principle would it be unreasonable to sometimes hand some of that information back?" A spokesperson later told Hill that at least 100 signals go into friend recommendations, only a handful of which were actually shared with her. If Facebook can access this kind of data on any user, imagine what it can piece together about a person's political history, a family's medical history, a criminal history, and so on.

I can't tell you how many times I've heard someone say they're dropping a friend on Facebook because of a political post, so this is important. When talking about Facebook and the election, it's crucial to look past the "conservative vs. liberal" distraction. I may know next to nothing about politics, but I know a few things about communication, civil society, digital rhetoric, cultural theory, and how social movements work.

Though it's clear how conservatives used social media to knock the liberals from the top spot, the manipulation of Facebook by foreign powers (ironic, and it would be funny if it weren't so tragic) isn't about Democrats or Republicans. And it isn't about Trump, who has enough to worry about. Madrigal says, "The real problem — for all political stripes — is understanding the set of conditions that led to [the president's] victory. The informational underpinnings of democracy have eroded, and no one has explained precisely how."

What's easy to see, though, despite a presumption of good intentions, is how the social media Godzilla keeps trampling citizens and leveling cities without ever looking down. The 2010 SCOTUS ruling in Citizens United fundamentally altered the flow of money between the rich and elected officials. Declaring that corporations are people and granting them the rights of citizens diminishes the body politic and flat-out cheapens the political process.

With "one person, one vote" ripped from the electorate, American democracy is wide open to more insidious forms of corrosion. Could Facebook be one? Madrigal says:

Add everything up. The chaos of a billion-person platform that competitively dominated media distribution. The known electoral efficacy of Facebook. The wild fake news and misinformation rampaging across the internet generally and Facebook specifically. The Russian info operations. All of these things were known. And yet no one could quite put it all together: The dominant social network had altered the information and persuasion environment of the election beyond recognition while taking a very big chunk of the estimated $1.4 billion worth of digital advertising purchased during the election.

Facebook's "algorithmic trouble" set off an avalanche of misinformation in 2016. What made it worse was an untraceable surge of so-called "dark ads" used to amplify fake news sites set up overseas. Some of those sites were run by teens who found they could profit from propping up the Manchurian candidate. By November, it was too late. An onslaught of false information on Facebook had already swayed large swaths of voters. The damage was done.

A year later, Facebook still doesn't feel right. Maybe it's the nature of its crimes — the "emotional contagion" study, the intentional masking of referrers, the Beacon program. Even Zuck's "cavalier attitude toward privacy" makes me seriously question his motive for "making the world more open and connected."

We're told that Facebook is on the road to recovery. The company's News Feed team says it has launched a "company-wide effort to improve the integrity of information on our service." But at what cost, and at whose expense? I wonder.
