But she stopped short of actually apologizing for the study itself.
“This was part of ongoing research companies do to test different products, and that was what it was; it was poorly communicated,” Facebook COO Sheryl Sandberg said during an appearance in New Delhi. “And for that communication we apologize. We never meant to upset you.”
Sandberg’s statement is the first public comment from a Facebook executive on the study, which is now under investigation by a British data watchdog for potentially violating data-protection laws.
A Facebook spokesman told The Huffington Post Wednesday that Sandberg was not apologizing for the study itself, and that she was simply acknowledging that the company is investigating its internal review process for research following the public outcry.
Sandberg also sat down for a televised interview Wednesday on NDTV, a news network in India, during which she declined to call the study a “mistake” when asked by a reporter.
“This was one week and it was a small experiment,” she said. “It has been communicated as an experience to shift emotions, it’s not exactly what it was. It was an experiment in showing people different things to see -- to see how it worked. Again, what really matters here is that we take people’s privacy incredibly seriously and we will continue to do that.”
The study, recently published in the Proceedings of the National Academy of Sciences, revealed that for one week in 2012, Facebook altered the algorithm that determines what people see in their feeds. Some people would see primarily positive posts, while others would see mostly negative posts. Researchers then measured the effect of these posts on users’ moods. The study was conducted to evaluate a phenomenon known as “emotional contagion,” or the idea that you can “catch” a friend’s sunny disposition or negative outlook.
Facebook originally defended the research, saying it was part of the company’s ongoing effort to improve its website. But news of the study prompted a flurry of outraged comments on social media. Critics have raised questions about the study's ethics, blasting Facebook for not getting the informed consent of its participants. Facebook also admitted Monday that the study may have included minors, another ethical red flag.
Lead author Adam Kramer, a data scientist at Facebook, suggested in a Facebook post that the study may not have been worth the public outcry. “In hindsight, the research benefits of the paper may not have justified all of this anxiety,” he wrote.