From Cursing To Murder, Facebook Opens Up About The Site's 'Community Standards'

For years, Facebook had “community standards” for what people can post, but only a relatively brief and general version was publicly available.

MENLO PARK, Calif. (Reuters) - Facebook Inc on Tuesday released a rule book for the types of posts it allows on its social network, giving far more detail than ever before on what is permitted on subjects ranging from drug use and sex work to bullying, hate speech and inciting violence.

Facebook for years has had “community standards” for what people can post. But only a relatively brief and general version was publicly available, while it had a far more detailed internal document to decide when individual posts or accounts should be removed.

Now, the company is providing the longer document on its website to clear up confusion and be more open about its operations, said Monika Bickert, Facebook’s vice president of product policy and counter-terrorism.

“You should, when you come to Facebook, understand where we draw these lines and what’s OK and what’s not OK,” Bickert told reporters in a briefing at Facebook’s headquarters.

Facebook has faced fierce criticism from governments and rights groups in many countries for failing to do enough to stem hate speech and prevent the service from being used to promote terrorism, stir sectarian violence and broadcast acts including murder and suicide.

At the same time, the company has also been accused of doing the bidding of repressive regimes by aggressively removing content that crosses governments and providing too little information on why certain posts and accounts are removed.

New policies will, for the first time, allow people to appeal a decision to take down an individual piece of content. Previously, only the removal of accounts, Groups and Pages could be appealed.

Facebook is also beginning to tell users the specific reason their content was taken down in a wider variety of situations.

Facebook, the world’s largest social network, has become a dominant source of information in many countries around the world. It uses both automated software and an army of moderators that now numbers 7,500 to take down text, pictures and videos that violate its rules. Under pressure from several governments, it has been beefing up its moderator ranks since last year.

Bickert told Reuters in an interview that the standards are constantly evolving, based in part on feedback from more than 100 outside organizations and experts in areas such as counter-terrorism and child exploitation.

“Everybody should expect that these will be updated frequently,” she said.

The company considers changes to its content policy every two weeks at a meeting called the “Content Standards Forum,” led by Bickert. A small group of reporters was allowed to observe the meeting last week on the condition that they could describe process, but not substance.

At the April 17 meeting, about 25 employees sat around a conference table while others joined by video from New York, Dublin, Mexico City, Washington and elsewhere.

Attendees included people who specialize in public policy, legal matters, product development, communication and other areas. They heard reports from smaller working groups, relayed feedback they had gotten from civil rights groups and other outsiders and suggested ways that a policy or product could go wrong in the future. There was little mention of what competitors such as Alphabet Inc’s Google do in similar situations.

Bickert, a former U.S. federal prosecutor, posed questions, provided background and kept the discussion moving. The meeting lasted about an hour.

Facebook is planning a series of public forums in May and June in different countries to get more feedback on its rules, said Mary deBree, Facebook’s head of content policy.

FROM CURSING TO MURDER

The longer version of the community standards document, some 8,000 words, covers a wide array of words and images that Facebook sometimes censors, with detailed discussion of each category.

Videos of people wounded by cannibalism are not permitted, for instance, but such imagery is allowed with a warning screen if it is “in a medical setting.”

Facebook has long made clear that it does not allow people to buy and sell prescription drugs, marijuana or firearms on the social network, but the newly published document details what other speech on those subjects is permitted.

Content in which someone “admits to personal use of non-medical drugs” should not be posted on Facebook, the rule book says.

The document elaborates on harassment and bullying, barring for example “cursing at a minor.” It also prohibits content that comes from a hacked source, “except in limited cases of newsworthiness.”

The new community standards do not incorporate separate procedures under which governments can demand the removal of content that violates local law.

In those cases, Bickert said, formal written requests are required and are reviewed by Facebook’s legal team and outside attorneys. Content deemed to be permissible under community standards but in violation of local law - such as a prohibition in Thailand on disparaging the royal family - is then blocked in that country, but not globally.

The community standards also do not address false information - Facebook does not prohibit it but it does try to reduce its distribution - or other contentious issues such as use of personal data.



10 Ways Facebook Messes With Your Life
It Can Mess With Your Sleep (1 of 10)
Heavy social media use can upset sleep patterns, studies have found. And not getting enough sleep can cause you to check Facebook compulsively.

The result is an exhausting feedback loop that could leave you fried.
It Can Make You Depressed (2 of 10)
Spending too much time on Facebook could stir up feelings of envy, according to a study published in 2015. Envy, in turn, could make you depressed.

“We found that if Facebook users experience envy of the activities and lifestyles of their friends on Facebook, they are much more likely to report feelings of depression,” study co-author Dr. Margaret Duffy, a University of Missouri journalism professor, said in a press release.

But simply being aware that people are presenting their best selves -- and not necessarily their real selves -- on social media could help you feel less envious.
It Can Drain Your Smartphone Battery (3 of 10)
Facebook's Android and iPhone apps are real battery hogs. Facebook has said it's addressing the problem. In the meantime, deleting the app from your smartphone could boost your battery life by up to 20 percent.
It Can Sap Your Focus (4 of 10)
The average attention span is decreasing, according to research. Constant distractions created by our "digital lifestyles" could be changing our brain chemistry and sapping our focus. Yikes!
It Can Ruin Your Relationship (5 of 10)
Social networks bring people together, but they can also drive a wedge between married couples, according to psychologists. Constantly checking Facebook can ruin intimate moments, and the ability to connect with old flames online can spark extra-marital trysts.
It Can Make You Socially Awkward (6 of 10)
Our dependence on social media could be making it more difficult to connect with others in person. “I think it’s the death of an actual civilized conversation,” Justine Harman, features editor at Elle.com, told The Huffington Post in an interview in 2014.

What's more, most of your Facebook friends don't really care that much about you.
It Can Be A Huge Waste Of Time (7 of 10)
The more time you spend on Facebook, the worse you feel, according to behavioral science research. That's because Facebook feels to many people like a waste of time.

“It appears that, compared to browsing the Internet, Facebook is judged as less meaningful, less useful, and more of a waste of time, which then leads to a decrease in mood,” Christina Sagioglou and Tobias Greitemeyer, behavioral scientists at the University of Innsbruck in Austria, wrote in a paper published in 2014.

Facebook doesn't always make us feel crummy. But if it does, it's time to do something else.
It Can Create An Echo Chamber (8 of 10)
Critics of social media have long suggested that Facebook's algorithm -- which determines the posts you see based on posts you've clicked -- can create "echo chambers" online. Being exposed only to content you already understand or agree with can insulate you from diverse views, critics argue.

But Facebook disagrees, saying last year that it was not responsible for creating echo chambers. Either way, Facebook still plays a big role in how people consume information online.
It Tracks (And Shapes) Your Behavior (9 of 10)
Facebook uses complex machine learning algorithms to decide what you see on the site. If it notices you like posts related to soccer, for instance, it might surface more soccer posts in your feed. But it doesn't always get this right.

Eventually, it may get better at understanding people's preferences -- so much better that some experts fear how precisely future marketing and political campaigns will be able to target people. We might even come to "question whether we still have free will," Illah Nourbakhsh, a robotics expert at Carnegie Mellon University, told HuffPost in an interview.
It Knows When You Go To Bed At Night (10 of 10)
Turns out, Facebook has enough information about you that it can be used to track when you turn in for the night and when you wake up in the morning. Danish software developer Soren Louv-Jansen developed a tool that used Facebook data to let people observe their friends' sleep patterns.

Though Facebook asked him to take down this tool, the stunt pointed to a larger issue of data privacy: We all reveal a huge amount of personal information online, and we can't always control how others use it.