Why Facebook Should Follow Ethical Standards -- Like Everybody Else

It is not clear that Facebook broke the law -- the regulations technically apply only to federally funded research, but they have been universally adopted by researchers as the standard. Facebook should agree to follow these guidelines as well.

On July 2, 2014, Facebook's Chief Operating Officer, Sheryl Sandberg, defended the company's recent controversial experiment, which manipulated users' newsfeeds to change their moods. But in so doing, she raised more concerns than she answered.

She argued that the company and other social media sites regularly engage in research, and that the practice is thus acceptable. Facebook defenders say that the company could instead simply conduct its experiments secretly and not publish the results.

But past errors do not justify future ones.

Social science can help us all, but depends on trust -- which can be fragile. Social media companies can now strengthen users' trust by agreeing to follow ethical standards.

The company seeks to portray all of its experiments as utterly benign, but that may not always be the case. Playing with people's emotions is very different from asking them how, for instance, they might vote in an upcoming election.

Research suggests that the information individuals receive through social media can significantly affect their moods. Altered mood can in turn affect drug use, weight and appetite, school and work performance, and suicidal thoughts. Depressed teens can be very fragile. It may not take much to push them further than one would like. Facebook has apparently conducted hundreds of experiments. We simply don't know how far these studies have gone -- whether any have tried manipulating users' moods over a longer period.

Facebook's researchers at first said that the subjects in its experiment, conducted in 2012, consented when they signed up for Facebook; but it turns out that the company added the possibility of research to its data use agreement only afterward. Even the current agreement may not be sufficient, depending on the experiment.

At present, researchers at almost all universities and pharmaceutical and biotech companies have agreed to follow the federal regulations governing research as a standard in conducting experiments on human beings. We do not allow pharmaceutical or automobile companies to alter their products on their own, and then see whether any consumers get hurt. Internet companies -- which regularly conduct studies on us all -- do not follow accepted ethical standards of research, and have never publicly made any effort to do so. But they should.

Facebook says its employees review its own studies, but it has nowhere indicated what criteria they use. We don't know, for instance, whether they have ever rejected or altered experiments as unethical, and if so, what types of studies. Moreover, these employees have a conflict of interest in assessing their own company's experiments. Federal regulations require that institutions conducting experiments have research ethics committees, known as Institutional Review Boards (or IRBs), review all studies, following clear ethical guidelines. These boards must, for instance, have an unaffiliated member, to try to avoid conflicts of interest.

Much of the company's research will no doubt be minimal risk, and judged to be ethical and unproblematic. But standard practice is not to allow researchers to make these determinations about their own studies. At times, egregious violations have occurred when social scientists and other researchers have done so. In the Stanford prison experiment, for instance, the psychologist Philip Zimbardo randomly assigned students to play the roles of "guards" and "prisoners" in a mock penitentiary. He assumed the study would be minimal risk. But the guards soon began physically abusing the prisoners.

Social media company researchers are presumably trained social scientists -- psychologists, sociologists and anthropologists. These fields all have established codes of professional ethical standards and conduct that should be followed, and that include stipulations that researchers obtain appropriate informed consent.

It is not clear that Facebook broke the law -- the regulations technically apply only to federally funded research -- but they have been universally adopted by researchers as the standard.

Facebook should agree to follow these guidelines as well. Doing so need not be onerous. The company could simply submit its studies to an established independent IRB for review. Facebook could also, for instance, exclude children from mood-manipulation experiments, which could easily be done, since users indicate their age.

Much more discussion of these issues is essential. Indeed, the British government is now investigating the company's experiment. I don't think our federal government needs to get involved, provided companies adopt these standards on their own.

But all of us elsewhere -- whether we use Facebook or not -- deserve a bit more. Isn't our trust worth it?
