Facebook has reversed its prior ban on videos showing graphic content: It announced that depictions of violence, including decapitations, would be permitted in cases where it is condemned, not celebrated.
The decision prompted immediate backlash from critics who highlighted the hypocrisy of Facebook allowing violent content, while banning breasts. (Facebook routinely censors breastfeeding photos.) Others objected to making gruesome videos available to Facebook’s young users. UK Prime Minister David Cameron called Facebook’s decision “irresponsible.” Tech commentator Graham Cluley complained, “The only boob that I can see is the twit at Facebook who believes it’s worse for someone to see a picture of a naked female breast than to see a video of a person being beheaded.”
Yet the only thing worse than seeing a video of a person being beheaded is not being able to see it at all.
Facebook has, perhaps counterintuitively, done a good thing by permitting awful things to circulate on its site. The change is a boost for free speech and political discourse on a site that's been frustratingly inconsistent in how it handles controversial content. Though the decision is likely to bring serious headaches for Facebook, it also promises to make the social network a more valuable communication platform for political groups and activists.
“Facebook has been criticized for stifling political speech in the past, and this is a step in the right direction,” said Zeynep Tufekci, an assistant professor at the University of North Carolina, Chapel Hill, whose research focuses on the intersection of technology, sociology and politics.
Activists -- particularly those operating in countries without a free press -- have come to rely on social media, including Twitter, YouTube and Facebook, as indispensable ways to document and spread awareness about human rights abuses and atrocities. Yet these groups not only have to contend with censorship by the government, but they also have to worry about the very site they’re trying to use to bypass such censorship: Facebook.
Syrian activist and rebel groups, for example, frequently use Facebook to publicize war atrocities in the hopes of rallying support domestically and abroad. Yet Facebook’s prior ban on graphic content could often get them barred for doing so, Syria observers note. Liwa al-Islam, Farouq Battalions and an opposition group in Damascus have all had their pages removed, though they quickly reappear under different names. Sometimes, the violent videos they posted really did run afoul of Facebook’s rules. Other times, political enemies have used Facebook’s policy to have opposition pages silenced and posts removed.
A Syrian man cries in a hallway of the Dar El Shifa hospital in Aleppo, Syria, after his daughter was hit during a Syrian Air Force strike on a school where hundreds of refugees had taken shelter. (Thursday, Oct. 4, 2012. AP Photo/Manu Brabo)
Jillian York of the Electronic Frontier Foundation noted, “Generally, activist groups can operate well on Facebook -- so long as they do not have enemies there to report them.”
Though the social network's overall policy is certainly hypocritical, it's worth noting how much is at stake for some of these groups. When Facebook bans activists' accounts, it removes their way of getting information out of their countries, and it erases a crucial historical record of what happened.
“We revolutionaries in Syria, we can deliver humanitarian violations only through Facebook and YouTube,” wrote Nezar, who manages the Facebook and YouTube accounts belonging to Local Council of the city of Daraya, an opposition group based in the Damascus suburbs, in a Skype chat. “Showing videos of decapitation is not acceptable, but Facebook’s management has deleted a lot of Facebook pages showing human rights violations.”
Nezar noted that Facebook had threatened to block the Daraya Local Council's account because of complaints by other users, but the group avoided getting shut down by "delet[ing] all the pictures documenting human rights violations."
Facebook’s policy change is likely to make the social network a more reliable platform on which political groups can operate, and a more valuable source of information for researchers, journalists and governments trying to gain insight into otherwise closed-off countries. Already, reporters and think tanks have used tweets, Facebook posts and YouTube videos to glean critical insights. In its report on the alleged chemical attacks in Syria in August, Human Rights Watch relied heavily on evidence collected from YouTube and Facebook. Even Barack Obama referred to "social media accounts" in his speech on the toxic sarin gas attack in Syria.
Thanks to Facebook's policy update, activist groups should be able to “create narratives that are much more powerful and show the full extent of the atrocities taking place,” predicted Cliff Lampe, an associate professor at the School of Information at the University of Michigan.
Certainly, there are risks to allowing graphic videos -- even those that condemn violence -- to have a presence on Facebook. They could help terrorists terrorize, shock vulnerable viewers and leave lasting psychological scars. But there may be a way to mitigate the dangers. For example, YouTube, which has allowed violent videos of “news value,” includes a warning on graphic videos. A Facebook spokeswoman told the BBC it was considering a similar move and could alert members "that the image they are about to see contains graphic content." On Tuesday evening, the social network announced it was tightening its policies to more closely monitor and restrict which violent videos were allowed to remain on the site. Facebook wrote it would "take a more holistic look at the context surrounding a violent image or video" and would consider whether the individual had shared it "responsibly," such as by "accompanying the video or image with a warning."
Experts also suggest Facebook should rethink its process for reviewing controversial content: A more thorough moderation system might help ensure graphic videos that glorify violence aren't allowed. Facebook’s moderators often lack the knowledge of a region’s politics that is necessary to arbitrate, or may lack the language skills to determine the context for a graphic video, analysts note. Others say Facebook's staff reviews reports too quickly, meaning that innocent posts are deleted erroneously.
“Facebook’s moderation has been more of a shotgun than a scalpel,” Lampe said.
There may be selfish motivations underpinning Facebook’s relaxed policy on violent content as well. After all, it's unlikely Facebook would make itself a target for backlash without something to show for it.
“My concern,” noted York, “is that it’s all about money as opposed to free speech."
Facing competition from sites like Twitter and YouTube, Facebook has been steadily trying to position itself as a source for breaking news. It introduced hashtags earlier this year, made it possible to embed Facebook posts, rolled out verified profiles and introduced new tools for news organizations, all in a push to prove Facebook is the place where “conversations are happening.”
Yet newsy content is frequently violent and disturbing, and requires a more open platform on which to circulate -- crucially, one that doesn't flinch at documentation of decapitations or shootings. Permitting beheading videos suggests an effort to make Facebook the go-to social network on which to share important, albeit graphic, events. Even if Facebook has its own interests at heart, however, others may benefit in turn.
This article has been updated to include Facebook's comment.