Facebook on Tuesday released information about the types of posts it allows on its social network, giving more details on what is permitted on subjects ranging from drug use and sex work to bullying, hate speech and inciting violence.
The 27-page document governs the behavior of more than 2 billion users, giving Facebook's definitions of hate speech, violent threats, sexual exploitation and more.
Facebook published the policies to help people understand where the company draws the line on nuanced issues, Monika Bickert, vice president of global policy management, said in a blog post. The company will for the first time give people a right to appeal its decisions.
The release of the content policies comes just days after Chief Executive Officer Mark Zuckerberg testified to Congress, where he faced frequent questions about the company's practices. Lawmakers asked whether Facebook unfairly takes down more conservative content than liberal content, and why bad content -- such as fake profiles and posts selling opioid drugs -- stays up even after it has been reported.
The policies are detailed when it comes to specific problems the U.S. has experienced -- especially gun violence. The profiles of mass murderers -- those who killed four or more people at once -- are taken down if the perpetrator was convicted, was identified by law enforcement through images from the crime, took their own life, or was killed at the scene or in the aftermath.
One of Facebook's definitions about harassment involves "claims that a victim of a violent tragedy is lying about being a victim, acting/pretending to be a victim of a verified event, or otherwise is paid or employed to mislead people about their role in the event." That definition applies only when a message is sent directly to a survivor or immediate family member -- as occurred with victims of the Sandy Hook and Parkland shootings.
Facebook's internal guidelines are used to enforce the company's Community Standards -- determining what stays online and what comes down.
The content policy team at Facebook is responsible for developing the company's Community Standards. Facebook has people in 11 offices around the world, including subject matter experts on issues such as hate speech, child safety and terrorism.
To identify potential violations of its standards, Facebook uses a combination of artificial intelligence and reports from users to flag posts, pictures or other content. These reports are reviewed by Facebook's Community Operations team, which works 24/7 in more than 40 languages. Facebook currently has 7,500 content reviewers, more than 40% more than it had at this time last year.
Over the coming year, Facebook will build out the ability for people to appeal its decisions. As a first step, the company is launching appeals for posts that were removed for nudity or sexual activity, hate speech or graphic violence.
Here's how it works:
- If your photo, video or post has been removed because it violates Facebook's Community Standards, you will be notified, and given the option to request additional review.
- This will lead to a review by Facebook's team (always by a person), typically within 24 hours.
- If Facebook has made a mistake, the company will notify you, and your post, photo or video will be restored.
In May, Facebook will launch Facebook Forums: Community Standards, a series of public events in Germany, France, the UK, India, Singapore, the US and other countries where the company will get people's feedback directly.