You May Have Been A Lab Rat In A Huge Facebook Experiment

A newly published paper reveals that scientists at Facebook conducted a massive psychological experiment on hundreds of thousands of users by tweaking their feeds and measuring how the emotional tone of their own posts changed afterward.

In other words, Facebook decided to try to manipulate some people's emotional states -- for science.

The research involved Facebook's News Feed -- the stream of status updates, photos and news articles that appears when you first fire up the site. For a week in January 2012, a group of researchers, variously affiliated with Facebook, Cornell University and the University of California, San Francisco, altered the algorithm that determines what shows up in News Feed for 689,003 people. One group was shown fewer posts containing words thought to evoke positive emotions, such as "love," "nice" and "sweet," while another group was shown fewer posts with negative words, like "hurt," "ugly" and "nasty." The findings were published this week in the Proceedings of the National Academy of Sciences, a scientific journal.

The researchers were studying a phenomenon called "emotional contagion," a fancy psychological term for something you've almost certainly experienced: If you spend more time with a happy-go-lucky friend, you end up being more of a ray of sunshine yourself. (Same goes for sadness: Hang with a Debbie Downer, and you likewise become a vector for gloom.) Researchers have found that emotions can be contagious during face-to-face interactions, when a friend's laugh or smile might lift your spirits. But what happens online? Facebook was trying to figure that out.

It turns out that, yes, the Internet is just like real life in this way. People who were shown fewer positive words on Facebook tended to turn around and write posts of their own that contained fewer positive words (and more negative words). And people who were shown fewer negative words tended, in turn, to write posts with fewer negative words and more positive words.
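To make the design concrete, here is a minimal toy sketch in Python. It is not the researchers' or Facebook's actual code: the word lists, the 50 percent removal rate, and the function names (post_tone, filtered_feed, fraction_positive) are hypothetical stand-ins, and the "outcome" is simply the share of positive-sounding posts a user goes on to write.

```python
# Toy illustration (not Facebook's actual system) of the experiment described
# above: posts are labeled with small, hypothetical word lists, each user is
# randomly assigned to a condition, matching posts are dropped from the feed,
# and the tone of the posts the user writes afterward is the outcome measure.
import random
from statistics import mean

POSITIVE_WORDS = {"love", "nice", "sweet"}   # hypothetical stand-ins for the study's word lists
NEGATIVE_WORDS = {"hurt", "ugly", "nasty"}

def post_tone(text):
    """Label a post by whether it contains any positive or negative words."""
    words = set(text.lower().split())
    if words & POSITIVE_WORDS:
        return "positive"
    if words & NEGATIVE_WORDS:
        return "negative"
    return "neutral"

def filtered_feed(posts, condition, removal_rate=0.5, rng=random):
    """Drop a share of posts whose tone matches the user's assigned condition."""
    target = "positive" if condition == "fewer_positive" else "negative"
    return [p for p in posts if post_tone(p) != target or rng.random() > removal_rate]

def fraction_positive(posts):
    """Outcome measure: share of a user's own posts that read as positive."""
    return mean(post_tone(p) == "positive" for p in posts) if posts else 0.0

if __name__ == "__main__":
    # Fabricated example posts, purely for illustration.
    user_posts = ["what a nice day", "this is ugly news", "meeting at noon"]
    condition = random.choice(["fewer_positive", "fewer_negative"])
    print(condition, filtered_feed(user_posts, condition))
    print("share positive:", fraction_positive(user_posts))
```

The essential ingredient is the random assignment to a condition; as discussed further below, that is what makes this an experiment rather than an observational study.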

Hypothesis: supported.

In the PNAS article, lead researcher Adam Kramer and his team note that "the effect sizes from the manipulations are small." And in a statement to The Huffington Post, Facebook offered justification for doing the research.

"This research was conducted for a single week in 2012 and none of the data used was associated with a specific person's Facebook account," a company spokesperson told The Huffington Post. "We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it's positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process."

Still, reaction against Facebook was swift when the story was picked up by Animal and then the A.V. Club:

Get off Facebook. Get your family off Facebook. If you work there, quit. They're fucking awful.

— Erin Kissane (@kissane) June 28, 2014

The researchers' findings aren't exactly trivial. If positivity begets more positivity online, we may be overblowing the whole idea of "F.O.M.O.," or "fear of missing out" -- the idea that pixel-perfect beach pictures and other evidence of fun fill Facebook friends with jealousy, not joy.

Facebook employs a group of data scientists to study user activity and publish their findings, often pegged to events like Valentine's Day and national elections. But until now, the research has mostly fallen into the category of "observational studies" -- that is, research that involves someone poring over existing data and trying to draw conclusions from it.

The News Feed manipulation, though, is a different beast. It's an experiment, in which scientists create the data by tweaking one variable to see if it affects another. That's what's disconcerting: The "things" being manipulated in this case are people on Facebook -- i.e., basically everyone with an Internet connection.

If you don't remember agreeing to be a Facebook guinea pig, well, you must not have read all of the site's mind-bogglingly complex terms of service when you set up your account. Within those TOS is language specifying that Facebook members consent to having information about them used for "internal operations, including troubleshooting, data analysis, testing, research and service improvement."

Even though this research was not illegal, Susan Fiske, the Princeton University psychology professor who edited the study for PNAS, was queasy about it. Fiske told The Atlantic:

I was concerned until I queried the authors and they said their local institutional review board had approved it -- and apparently on the grounds that Facebook apparently manipulates people's News Feeds all the time.

"Facebook apparently manipulates people's News Feeds all the time." That's comforting.
