A group of civil society organizations in Myanmar blamed Facebook for the spread of misinformation that may have put thousands of Rohingya people in danger, saying the company is far from the paragon of moderation its CEO claims it to be.
Earlier this week, Chief Executive Mark Zuckerberg told Vox’s Ezra Klein that Facebook’s systems detected two chain letters that spread across the country last year as the military was forcibly and violently removing nearly 700,000 people from the country’s Rakhine State. One of the fake letters warned of an impending attack by Muslim groups and used a racial slur; the other warned Muslims that Buddhists were planning violence themselves.
“The Myanmar issues have, I think, gotten a lot of focus inside the company,” Zuckerberg said in the interview, part of a media blitz meant to counter growing outrage over the company’s use of user data. “I think it is clear that people were trying to use our tools in order to incite real harm. Now, in that case, our systems detect that that’s going on. We stop those messages from going through.”
But in the letter sent this week, six groups said that Facebook took more than four days to address the spread of misinformation and that it detected the letters only after the groups alerted the company themselves.
“In your interview, you refer to your detection ‘systems’. We believe your system, in this case, was us ― and we were far from systematic,” the groups wrote. By the time Facebook addressed the concerns, “thousands, if not hundreds of thousands,” had seen the messages, they said.
“This is not quick enough and highlights inherent flaws in your ability to respond to emergencies,” the letter continues. “Your reporting tools, for one, do not provide options for users to flag content as priority. As far as we know, there are no Burmese speaking Facebook staff to whom Myanmar monitors can directly raise such cases.”
The messages, the groups allege, led to at least three violent incidents. Last month, investigators from the United Nations said Facebook had played a “determining role” in the Rohingya crisis.
A Facebook spokesperson said the company didn’t want the site “to be used to spread hatred and incite violence” and apologized for Zuckerberg’s characterization, saying the groups were indeed the first to report the issues.
“We took their reports very seriously and immediately investigated ways to help prevent the spread of this content,” the spokesperson said in a statement. “We should have been faster and are working hard to improve our technology and tools to detect and prevent abusive, hateful or false content.”
Facebook said it had added more Burmese-language reviewers to help handle reports in the country.
In some developing countries, including Myanmar, Facebook can provide vital communication networks for many people, and many use the site as their only source of news, according to The New York Times. But the social network has also become rife with misinformation and hate speech, and Facebook does not have an office in the country.
Facebook has been contending with an outpouring of frustration in recent weeks amid the fallout from the Cambridge Analytica scandal. The company has been pummeled with questions over how it handles users’ private data, and Zuckerberg is slated to appear before Congress next week.
Thursday’s letter adds to a growing tide that may force a reckoning at the company, and Zuckerberg himself has said he’d be open to the right kind of government regulation. But the groups that sent the letter said the way the company operates now is not enough and could very well lead to more danger in Myanmar.
“The risk of Facebook content sparking open violence is arguably nowhere higher right now than in Myanmar,” the letter concludes. “If you are serious about making Facebook better, however, we urge you to invest more into moderation ... and ― perhaps most importantly ― we urge you to be more transparent about your processes, progress and the performance of your interventions, so as to enable us to work more effectively together.”
Read the whole letter here.
This article has been updated to include comment from a Facebook spokesperson.