By now we've all heard about it. Facebook has a fake news problem, a rampant epidemic of phony and outrageous headlines in which a fraction of a penny per click gets traded for lies. The problem is that Facebook CEO Mark Zuckerberg doesn't believe these accusations, or that Facebook may have unfairly influenced the 2016 U.S. presidential election. He thinks it's a "pretty crazy idea." According to him, more than 99 percent of what people see in their news feed is authentic. But it is not. Just ask the people who work at Facebook, who formed a secret task force to investigate where their employer's actions went wrong. The bottom line is that truth doesn't matter here. And by not doing enough, Facebook has become a silent partner in the incentivizing of lies and ill-gotten gains.
Why these phony sites came to circulate invented news isn't the issue. We get that they put food on the table for people overseas working to support themselves or their families. According to BuzzFeed News, one Macedonian town alone has 140 U.S. political websites. These sites aren't pro-Trump so much as they follow the action. They learned that Trump supporters crave sensationalist headlines that confirm their theories and beliefs. In other words, they want to hear what they want to hear and will look for proof in support of it. Democrats apparently don't take the same bait. According to Gizmodo, 38 percent of right-leaning news stories on Facebook contained inaccuracies or falsehoods, compared to 19 percent of left-leaning stories. Even worse, those numbers skyrocketed for Trump during the lead-up to the presidential election.
One interesting point worth mentioning here is how Facebook even knows, in the first place, whether you are a liberal or a conservative. Yes, that's right, it categorizes you this way. In certain instances, people identify themselves as such. But in most cases, the platform infers your political leanings from the pages you like, the topics you discuss, and your interests. It's another piece of what I call your permanent record, that nasty trail of thousands of bits of information about you, aggregated into your permanent data packet. That information then gets plugged into an algorithm that floods you with content in alignment with your perceived beliefs, skewing your perception of the world.
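To make the mechanism concrete, here is a deliberately simplified sketch, in Python, of how a platform might score political leaning from page likes. The page names, weights, and cutoffs are hypothetical illustrations invented for this example; Facebook's actual model is proprietary and operates at a vastly larger scale, with weights learned from data rather than hand-assigned.

```python
# Toy illustration of inferring political leaning from page likes.
# All page names and weights below are invented for illustration only.

PAGE_WEIGHTS = {
    "Liberal Daily": -1.0,        # hypothetical left-leaning page
    "Progressive Voices": -0.8,
    "Conservative Tribune": 1.0,  # hypothetical right-leaning page
    "Patriot News Now": 0.9,
    "Local Gardening Club": 0.0,  # apolitical pages carry no signal
}

def infer_leaning(liked_pages):
    """Average the weights of a user's liked pages into a leaning label."""
    scores = [PAGE_WEIGHTS.get(page, 0.0) for page in liked_pages]
    avg = sum(scores) / len(scores) if scores else 0.0
    if avg <= -0.25:
        return "liberal"
    if avg >= 0.25:
        return "conservative"
    return "moderate"

print(infer_leaning(["Conservative Tribune", "Patriot News Now"]))
```

The label produced by a classifier like this would then steer what the ranking algorithm shows you, which is exactly how the filter bubble gets built.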
If only a small percentage of readers used Facebook as their news source, the impact would be minimal. But 62 percent of Americans get their news from social media, and 44 percent of adults get it from Facebook. Seventy-nine percent of American adults who use the internet are on Facebook: more than 1.8 billion people worldwide in total. The end result is that we've opened the door to swapping responsible journalism for salacious gossip, truth for consequences, honesty for a dollar sign.
Facebook profits by selling ad space inside its news feed and by brokering deals between advertisers and other online companies. Such actions reveal a winning hand for its shareholders but a loss for the American public. This time it was our election; next time, perhaps, something on a world scale. The simple truth is that cryptic algorithms focused on the volume of clicks rather than the veracity of the words are bad for society. Flooding users with lies about subjects that matter to them plays on their vulnerabilities and subjectivity, driving them to fall prey to deception. It also makes us unwitting co-conspirators who share lies with our friends. This doesn't bring us together, as Zuckerberg wants Facebook to do, but rather polarizes us at opposite ends of the spectrum, comfortable in our filter bubbles, wound tightly to our own beliefs.
This presidential election had enough 'truthiness' issues from the candidates themselves. Back in October, PolitiFact estimated that 26 percent of Hillary Clinton's statements were mostly false, as were a whopping 70 percent of Trump's proclamations. Simple math shows that if you take Trump's abundance of falsities and combine it with Facebook's false headlines, the truth becomes an anomaly lost in a forest of fiction.
When Facebook promotes false headlines, it does so knowing that it monitors not truth but clicks, engagement, and profit. The rest is up to you and me. At the end of the day, how are we to know what is and is not true? That's why we have the news. Or so we thought. That's also the criticism leveled at Fox News: preying on people's belief that news on TV must be true in order to advance its own agenda.
In Zuckerberg's mind, Facebook is about connections. It's the flow of information. The last thing he wants his company labeled is a media corporation. If it were one, the company would face a whole different level of regulation, like that governing Comcast or Time Warner. Being a media company would put the onus on Facebook to monitor and filter fact from fiction. Facebook doesn't want that responsibility. Even though it should naturally follow such practices, it would rather they remain a voluntary choice than a legally mandated one.
Studies show that roughly 20 percent of American social media users have changed their views on a political or social issue because of something they saw on social media. One in five may not seem like a lot, but in an election in which Trump won battleground states such as Michigan and Wisconsin by fewer than 30,000 votes each, it opens a Pandora's box that changes politics. Furthermore, according to a study published in Nature, almost 340,000 extra people turned out to vote in the 2010 U.S. congressional elections because of a single Election Day Facebook message. That's more of an immediate impact than any TV ad, other than perhaps Willie Horton or Daisy Girl, could muster.
Back in May we learned that trending news on Facebook was curated and edited by people, not algorithms. This led to an uproar among conservatives when they learned that stories in line with their beliefs were being suppressed. Facebook fired the trending team to appease the conservative outcry, leaving its algorithms to run the show unsupervised. Ironically, that team was also responsible for ensuring that the trending news was, at the very least, real. The end result of all of this may very well have been to elect a president.
Google and Facebook, per usual, have already responded with supposed actions to help rein in the issue. Google pledged to bar fake news sites from its AdSense advertising network. Facebook updated its policy to state clearly that its advertising ban applies to fake news. Yet both still profit when fake news appears. It's akin to Google's click-fraud problem: it's hard to seriously rein in something lucrative when it feeds your bottom line and your true mission is money, not service. How many times can we be duped by these giants, whose interests are defined by data, algorithmic manipulation, and money? Their words and actions rarely align. Facebook, moreover, has a documented history of trouble with experiments on members, clickbait, and hoaxes, further calling into question its ability and willingness to regulate itself.
Neither company has really done much to attack the problem at its source. The real fix is taking down the lies and keeping them from trending as current news. Just as Facebook would never tolerate lies about its own personnel or institutions, neither should it tolerate the sharing of such lies about the foundations of our democracy.
At the end of the day, fake or real news can be distributed on any platform. Yet promoting the Pope as a Trump endorser when nothing could be further from the truth, and allowing that misinformation to reach millions of people, makes us look like Chinese and Russian news manipulators. A powerful argument can be made that the words and actions of FBI Director James Comey caused Hillary Clinton's loss; it is equally compelling that Facebook had just as strong an influence on voters in electing Trump, perhaps stronger.
Social media sites have a responsibility to the public not to tolerate lies and, most important, not to profit from them. It's a vulnerability that will need to be addressed without impairing freedom of speech. Perhaps it is time for us to shake off Facebook's manipulating grip and take our friends to a better place.