Facebook recently published the results of its 2012 study, entitled "Experimental evidence of massive-scale emotional contagion through social networks," in the journal Proceedings of the National Academy of Sciences. In this experiment, conducted on roughly 700,000 of its users, Facebook:
test[ed] whether emotional contagion occurs outside of in-person interaction between individuals by reducing the amount of emotional content in the News Feed. When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred.
Facebook apologized, claiming the test was product research that had been "poorly communicated" to the users and that they "want to do better in the future and are improving [their] process based on this feedback." It's disappointing that Facebook did not take full responsibility, state that all such experiments would no longer be permitted, and clearly apologize for this violation of trust and emotional manipulation of nearly three quarters of a million users. But no matter how Facebook responded, millions would still have walked away feeling deeply troubled by a company with so much access to its users' personal data and such casual disregard for their emotions and privacy.
Considering the powerful and sustained backlash Facebook received from the public, regulatory agencies and the press, one would think most social media companies would rush to reassure their users that they were not performing such experiments. We would expect promises that they respected their customers and never tampered with data -- especially for corporate gain or experimentation. It is only logical that similar companies would want to distance themselves from Facebook's scandal by communicating their respect for their users' privacy and providing assurance that they take every step to deliver their services ethically. After all, these companies need users to remain relevant, and a user base can vanish in a moment (as seen with MySpace).
In a surprising and seemingly unprovoked admission, OkCupid cofounder Christian Rudder smugly revealed wholesale manipulation of a portion of his company's users' data: intentionally misleading people about their compatibility percentages. Instead of apologizing or quietly ceasing his company's user experiments, Rudder brashly justified the manipulations and mocked those with concerns as naive and misinformed, saying, "you're the subject of hundreds of experiments at any given time, on every site."
Looking for love
For users coming to a site like OkCupid, expectation, emotion and self-perception play a huge part in how they interact with the site and with other users. Tinkering with match results does more than just skew behavior; it violates an assumption of trust. OkCupid's experiments were callous and manipulative, affecting far more than what people post online. To be clear, Facebook manipulated people's emotions. But these were not fake posts from fake friends. They were real posts about real things happening in users' friends' lives. While these posts affected people's moods, I do not believe the effect was profound, long-lasting or based on false information.
But OkCupid simply lied, falsifying its results and intentionally mismatching people. This manipulation invariably led to countless terrible dates, wasted money, increased frustration and, quite likely, questions as to why these users could not find the love they were seeking. OkCupid cruelly disregarded the one thing most people protect above all else.
Analytics, A/B testing and implied consent
Almost all providers of internet services use tools like Google Analytics or Omniture to monitor user activity: how long we stay on a page, which ads or content we click on, or when we leave for another site. With A/B testing, companies can learn which of their product's value propositions resonate most with a group or with consumers in general. It is how advertising adapts and how products remain relevant. I'm sure it would come as no shock to any of us to read a research paper showing how negative or positive ad language influences click-through rates and ROI. Not only is this okay, it is a vital part of good corporate citizenship.
One could think of A/B testing as manipulation since advertisers are presenting different messaging to see which gets more clicks. This would be true except for one major difference: we know the ads are selling us something. We are informed, and being informed we consent to the ads and understand their messaging should be treated with scrutiny.
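The mechanics of a standard A/B test are simple and transparent: show two variants, count the clicks, and check whether the difference is statistically meaningful. A minimal sketch, using hypothetical traffic numbers and a textbook two-proportion z-test (not any specific vendor's tool):

```python
import math

def two_proportion_z(clicks_a, views_a, clicks_b, views_b):
    """Two-proportion z-test: is variant B's click-through rate
    significantly different from variant A's?"""
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    # Pooled click-through rate under the null hypothesis (no difference)
    p = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(p * (1 - p) * (1 / views_a + 1 / views_b))
    return (p_b - p_a) / se

# Hypothetical results: variant B's copy draws more clicks
z = two_proportion_z(clicks_a=200, views_a=5000, clicks_b=260, views_b=5000)
print(round(z, 2))  # prints 2.86; |z| > 1.96 means significant at the 5% level
```

Nothing in this process requires deceiving anyone: both variants are honest messages, and the only thing measured is which one performs better.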
Analytics and advertising research are what OkCupid's Christian Rudder is trying to invoke when he talks about these "experiments" that "everyone else is doing." But OkCupid's behavior could not be further from that kind of ethical research. Proper analytics research does not involve misleading the user. OkCupid's users had no way of knowing or suspecting this deception, so how could they appropriately filter the content and provide any form of consent?
The result is a manipulation of people's lives, time, money and emotions, and an abuse of the company's own customer base.
Corporate citizenship, ethics and the cost of manipulation
Rudder's glib, staggeringly arrogant response on the matter floored us here at BiTE. We understand OkCupid is a business with a product and as such will conduct research to improve its offerings, but this absolute lack of respect for OkCupid's customers is neither ethical nor acceptable, and it is a breach of the principles of corporate citizenship.
Joseph Farrell is EVP of Operations, BiTE interactive