How Facebook and Google's Algorithms Are Affecting Our Political Viewpoints

Plenty of users take what they read online at face value, as social experiments have shown. The average user often doesn't check facts or consider whether a source is credible.

"You look at a Wikipedia article and assume that it all must be true," said Christo Wilson, a computer science professor at Northeastern University who researched algorithms and personalization extensively. "Or you search for something on Google and think the results are subjective and correct off the bat."

And then there are the algorithms layered on top of every social network and search engine, serving users personalized, and ultimately skewed, results. Those algorithms are largely a mystery to researchers: considered trade secrets, they are kept confidential and changed constantly so companies can maintain a competitive advantage. Researchers have a general understanding of how they work, but they have no way of knowing for sure exactly how Google, Facebook, Twitter, Instagram or any other online platform ranks what we see.

"News-filtering algorithms narrow what we know, surrounding us in information that tends to support what we already believe," Eli Pariser, CEO of Upworthy, wrote on Medium, referencing his experience clicking links and then subsequently seeing related content in his feed.

Now that the 2016 presidential election is a regular part of the news cycle and users are sharing their opinions daily, let's dive into all the ways, subtle and overt, that the most-used social network, Facebook, and most-used search engine, Google, affect our political beliefs.

Facebook algorithm research

Facebook confirmed in a paper in Science that it often shows users news from users with similar political beliefs, and that, on average, you're about 6 percent less likely to see content favored by the other political side. This means that who you're friends with and their political beliefs influence what you see more than the algorithm does.

The social networking giant has also admitted to carrying out research experiments months before the 2012 presidential election. In one experiment, Facebook increased the number of hard news stories at the top of the feeds of 1.9 million users. According to one Facebook data scientist, that change -- which users were not alerted to -- measurably increased civic engagement and voter turnout, Mother Jones reported. The company has only released select details on these experiments.

In general, Facebook says the average user has access to about 1,500 posts per day but only looks at 300, according to Time. To ensure that those posts are as relevant as possible to a particular user, Facebook says it weighs thousands of factors to determine what shows up in any individual user's feed.

There are many nuances. For example, if you comment on someone's wall, you're more likely to see a post from them than if you "Like" a post, according to researchers in the study "Uncovering Algorithms: Looking Inside the Facebook News Feed." And if you go to a person's timeline, you're more likely to see content from them later.
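To make that concrete, here is a minimal, hypothetical sketch of how interaction signals like these could be combined into a feed score. The weights, signal names and the 300-post cutoff below are invented for illustration; Facebook's actual News Feed algorithm is not public.

```python
# Hypothetical feed-ranking sketch -- the weights and signals are invented for
# illustration and are not Facebook's actual News Feed algorithm.
from dataclasses import dataclass

@dataclass
class Candidate:
    author: str
    age_hours: float
    comments_on_author: int   # times the user commented on this author's wall
    likes_on_author: int      # times the user "Liked" this author's posts
    visited_timeline: bool    # whether the user visited this author's timeline

def score(c: Candidate) -> float:
    s = 3.0 * c.comments_on_author + 1.0 * c.likes_on_author  # comments weigh more than Likes
    if c.visited_timeline:
        s += 2.0                                               # timeline visits boost later posts
    return s / (1.0 + c.age_hours / 24.0)                      # older posts decay

def build_feed(candidates: list[Candidate], limit: int = 300) -> list[Candidate]:
    """Keep the top-scoring ~300 of the ~1,500 posts a user could see in a day."""
    return sorted(candidates, key=score, reverse=True)[:limit]
```

Even in a toy model like this, the handful of friends you interact with most would quickly dominate the top of the feed.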

Most people aren't aware that these kinds of subtleties feed the algorithms that determine what they see. The majority of users, 62 percent, don't even know that the Facebook News Feed is automatically curated, according to the study.

So if you've got a few friends with a penchant for right- or left-wing politics and you engage with them in certain ways on Facebook, those posts are likely to be prominent whenever you log on. American users spend nearly as much time on the site per day (39 minutes) as they do socializing with people face-to-face (43 minutes), according to Time. You can see how a person's perception of what their friends think can be warped by what they see online.

Users have some control, with Facebook rolling out curation tools that allow you to "See More" updates from certain users and hide others.

But these user capabilities aren't always easy to find.

"While the levers exist, people don't use them because they're hidden," said Karrie Karahalios, a University of Illinois Urbana-Champaign researcher who co-authored the "Uncovering Algorithms" paper. "And then there are people who don't want to control their stuff; they want it to be intuitive. But if people don't control their feeds, their feeds might control them."

Google's algorithms

Google is uniquely positioned to influence how users think, as its algorithms boost certain sources over others based on how much content a site produces, the length of its articles and when a piece was published, among other factors. Users are inclined to click only on the first few results, so ultimately, what we see and read on Google can affect the way we think.
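As a rough illustration of how factors like these could be combined, here is a short, hypothetical scoring function. The factors are the ones named above; the weights and the log and decay shapes are assumptions, not Google's actual ranking formula.

```python
# Hypothetical search-ranking sketch -- illustrative only, not Google's algorithm.
import math

def rank_score(site_articles_per_month: int, article_words: int,
               days_since_published: float) -> float:
    prolific = math.log1p(site_articles_per_month)  # how much content the site produces
    depth = math.log1p(article_words)               # length of the article
    freshness = 1.0 / (1.0 + days_since_published)  # when the piece was published
    return 0.4 * prolific + 0.3 * depth + 0.3 * freshness

# Two made-up results for the same query; the higher score is listed first,
# and users tend to click only the first few results.
results = {
    "high-volume outlet, short fresh piece": rank_score(5000, 800, 0.5),
    "small outlet, long older piece": rank_score(150, 3000, 6.0),
}
print(sorted(results.items(), key=lambda kv: kv[1], reverse=True))
```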

Google announced in 2012 that it connects 1 billion users to its aggregated news content. Compare that with the roughly 40 million unique views per month that the New York Times and Huffington Post sites received that year.

"Search engines are probably the most important way we get our news," said Ben Edelman, a Harvard Business School professor whose research concluded that Google has violated antitrust laws because it compels usage of its own services and products over competitors.

"I think it's very troubling," he said of Google's monopoly over search. "Whatever Google does matters a lot more than other search engines because so many people are using it."

Google's search algorithm can shift the voting preferences of undecided voters by 20 percent or more -- up to 80 percent in some demographic groups -- with virtually no one knowing they are being manipulated, according to experiments by researchers Robert Epstein and Ronald E. Robertson.

Every undecided vote counts, with elections typically won by small margins, under 7.6 percent. In 2012, the margin was only 3.9 percent.

And as with Facebook, it's not always obvious to users that Google's results are personalized based on many factors. Google considers 57 signals when delivering search results, including details such as which kind of computer and browser the person is using as well as where the user is sitting.

Pariser asked two friends to Google the word "Egypt" in 2011 and to send him screen shots of the results. One of his friends' top results were more news-oriented, about the protests and journalist Lara Logan, who had been assaulted in Cairo. The other friend's top results featured the CIA World Factbook entry about the nation and content on travel and vacations, Pariser said in a TED talk.
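A toy model of that kind of divergence might look like the following. The single signal used here (recent click history) and the re-ranking rule are invented; the sketch only illustrates how two people issuing the same query could see different orderings.

```python
# Hypothetical personalization sketch -- the 57 signals Google reportedly uses
# are not public, so this invents one (recent click history) to show how the
# same query can return differently ordered results for two users.

def personalize(query: str, recent_clicks: list[str]) -> list[str]:
    base_results = {
        "egypt": [
            "CIA World Factbook: Egypt",
            "Egypt travel and vacation guide",
            "News: protests in Cairo",
            "News: Lara Logan attacked in Cairo",
        ],
    }
    results = list(base_results.get(query.lower(), []))
    if any("news" in click.lower() for click in recent_clicks):
        # A user who has been clicking news links gets news-oriented results first.
        results.sort(key=lambda r: not r.startswith("News:"))
    return results

print(personalize("Egypt", recent_clicks=["cnn.com world news story"]))
print(personalize("Egypt", recent_clicks=["expedia.com hotel deals"]))
```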

Despite the discrepancy, Google's algorithms aren't as manipulative as you'd think, according to Wilson.

"We spent a lot of time looking at Google, and what affects users most is their location," he said, referring to Google's ability to provide targeted recommendations for restaurants and other services based on where users are. "We just finished a study, and we found that searching different politicians' names or controversial words like 'gun control' has no impact on search results."

If Google did manipulate users in a significant way, researchers and politically savvy people would notice, according to Ryan Kennedy, a political science professor at the University of Houston who's currently doing research on how Facebook affects voting.

"It's not trivial to do something like that, and people would notice pretty quickly, especially if they're politically savvy," he said. "Google wouldn't want to get in trouble for something like this. They have enough bad press with their privacy policies."

CEOs of tech companies have long defended personalization as necessary, a cornerstone of a relevant user experience. Whether you agree or disagree, it's undeniable that services like Google and Facebook, by personalizing what we see, bear significant responsibility for shaping users' opinions.

Not everyone likes how Google and Facebook are handling this responsibility.

As Pariser said on Medium, "The Internet is showing us what it thinks we want to see, but not necessarily what we need to see."
