
Facebook publishes its community standards playbook

It’s also letting you appeal decisions if you think Facebook made a mistake.

Facebook has had community standards for a while now, but it hasn't always been clear about what is or isn't allowed on its platform. This has occasionally led to some serious confusion. Last year, ProPublica unearthed an internal Facebook training document that appeared to prioritize "white men" over "black children," and later discovered that community moderators were often wildly inconsistent about what they considered hate speech. Facebook has since apologized for these errors, and today it's hoping to clear things up even further: It's publishing its internal community enforcement guidelines for the very first time.

To be clear, the community standards themselves have not changed. Instead, Facebook is updating them with more detail on how they're enforced. According to the company, the guidelines published today are exactly the same as the ones used by its 7,500 or so moderators around the world. It's apparently another part of the company's renewed effort to be more transparent with its users.

"We want to give people clarity," said Monica Bickert, Facebook's VP of Global Policy Management. "We think people should know exactly how we apply these policies. If they have content removed for hate speech, they should be able to look at their speech and figure out why it fell under that definition."

"The other reason we're publishing this is to get feedback on these policies," she continued. "[Getting] real world examples or examples on how an issue manifests itself in the community is helpful."


To go along with this announcement, Facebook is also expanding its appeals process. Until now, if you had a specific post or photo removed for violating community guidelines, you didn't have the option to appeal that decision. Now, you do. You'll be given the option to "Request Review," and Facebook's Community Operations team will look at the request within 24 hours. If a mistake has indeed been made, Facebook promises to restore the post or photo.

"We're going to offer appeals for posts and photos not only if we remove the post and photo, but also if you report a photo and post and we don't remove it," said Bickert. "You'll have the opportunity to say hey, 'Take another look at this.'"

The community standards document is a fairly lengthy one, but it essentially covers six distinct categories: violence and criminal behavior, safety, objectionable content, integrity and authenticity, respecting intellectual property, and content-related requests. Some of the guidelines seem fairly straightforward; for example, a credible threat of violence could result in a takedown, and even a report to the authorities. Masquerading as someone else is clearly defined as wrong, as are posts involving child nudity and trafficking in illegal goods.

But other guidelines, like the ones around hate speech, are a lot more nuanced. Facebook states that it doesn't allow hate speech on its platform, and defines it as a "direct attack on people based on what we call protected characteristics -- race, ethnicity, national origin, religious affiliation, sexual orientation, sex, gender, gender identity and serious disability or disease." It also offers some protection for immigrants, and defines an attack as "violent or dehumanizing speech, statements of inferiority, or calls for exclusion or segregation." Examples include violent speech (which would be a Tier 1 offense), expressions of disgust (a Tier 2 offense) or a call to exclude (a Tier 3 offense).


That might sound pretty clear, but the reality is a lot more granular. For example, Facebook says that while you can't attack a person, you can still criticize an organization, a country and even a religion. So you can't say "Scientologists are evil," but you can say "Scientology is evil." If that sounds a little like splitting hairs, well, even Facebook would agree that it sometimes runs into these tricky definitions.

Additionally, while Facebook's hate speech policies cover the above protected characteristics, it doesn't always cover subsets. In a New York Times quiz last year, the paper posited that "Female sports reporters need to be hit in the head with hockey pucks" would not be considered hate speech under Facebook's policies, because while gender is a protected category, occupation is not. Facebook did tell the Times that the statement would likely still be flagged for the violent threat (which is under a different policy), but it's troubling that simply targeting female sports reporters doesn't count.

Still, Facebook took pains to say that these policies are not static. According to the company, the content policy team meets every two weeks with various other internal teams, like engineering and operations. Depending on the issue, it also meets with teams from legal, public policy, diversity, child and women's safety, and government relations, as well as external stakeholders like academics, researchers, counterterrorism experts and hate organization experts. On the issue of abortion, for example, it might meet with both pro-choice and pro-life groups to get a fuller understanding of the topic.

In a recent blog post, Facebook also said that it attempts to protect against human bias with extensive training as part of the onboarding process. "Our reviewers are not working in an empty room; there are quality control mechanisms in place, and management on site, that reviewers can look to for guidance," it states. Facebook also conducts weekly audits to check on moderators' decisions. But even then, mistakes are made. "Even if you have a 99.9 percent accuracy rate, you'll still have made many mistakes every day," said Bickert. To help counter this, Facebook hopes to beef up its safety and security team to 20,000 people this year -- up from 7,500.


One of the biggest issues with Facebook's community standards remains: it's one set of guidelines for the whole world, which doesn't always mesh with local laws. For example, Germany has much stricter laws around hate speech, so a post that would be legal elsewhere in the world would have to be made unavailable in Germany. "Our standards are global," said Bickert. "But there are times when we have to be very local in our application."

"We do think cultural context is important," she continued. "When we are hiring reviewers to cover certain languages, we have native Portugese speakers from Brazil and from Portugal, because of the different ways language is used."

This cultural context is all the more important as Facebook grows in popularity in the developing world, where it's often used as a tool of misinformation and where false rumors can sometimes result in violent riots. Facebook says it's catching up and attempting to rectify the situation, but it's understandably hard to be patient when lives are at stake.

To that end, the company will be holding several public summits around the world in the coming months. The first three forums will be held in England, Paris and Berlin in mid-May, with subsequent summits in India, Singapore and the US. "They're going to be very interactive," said Bickert. "We want to get their feedback and incorporate them, and make sure the team is taking them into account in policy development and updates to community standards."

"There will always be people who will try to post abusive content or engage in abusive behavior," said Bickert. "[Revealing our guidelines] is our way of saying, 'These things are not tolerated.'"