Google's YouTube announced a series of policy and product changes that update how the company tackles harassment on YouTube.
YouTube has always removed videos that explicitly threaten someone, reveal confidential personal information, or encourage people to harass someone else. Moving forward, YouTube's policies will go a step further and prohibit not only explicit threats but also veiled or implied threats. This includes content simulating violence toward an individual or language suggesting physical violence may occur. Beyond threats, there is also demeaning language that goes too far. To establish consistent criteria for what type of content is not allowed on YouTube, the company is building on the framework it uses for its hate speech policy. YouTube says it will no longer allow content that maliciously insults someone based on protected attributes such as their race, gender expression, or sexual orientation. This applies to everyone, from private individuals to YouTube creators to public officials.
However, YouTube said videos that include harassment language in certain contexts, such as a documentary or a scripted satire, will not be removed. Neither will clips featuring or discussing powerful people "like high-profile government officials or CEOs of major multinational corporations." YouTube will decide whether videos meet these exceptions.
Harassment sometimes takes the shape of a pattern of repeated behavior across multiple videos or comments, even if no individual video crosses YouTube's policy line. To address this, YouTube is tightening its policies for the YouTube Partner Program (YPP) to get even tougher on those who engage in harassing behavior. Channels that repeatedly brush up against YouTube's harassment policy will be suspended from YPP, eliminating their ability to make money on YouTube. YouTube may also remove content from channels that repeatedly harass someone. If the behavior continues, YouTube will take more severe action, including issuing strikes or terminating a channel altogether.
Comment sections are an important place for fans to engage with creators and each other. At the same time, comments are often where creators and viewers encounter harassment. To combat this, YouTube removes comments that clearly violate its policies – over 16 million in the third quarter of this year specifically for harassment. The policy updates will also apply to comments, so YouTube expects this number to increase in future quarters.
Beyond the comments it removes, YouTube also empowers creators to shape the conversation on their channels and offers a variety of tools to help. When YouTube is not sure a comment violates its policies but it seems potentially inappropriate, the company gives creators the option to review it before it's posted on their channel. Earlier this year, YouTube began turning this setting on by default for most creators. Last week, it began enabling the feature by default for YouTube's largest channels with the site's most active comment sections, and it will roll out to most channels by the end of the year.
Creators can opt out, and if they choose to leave the feature enabled, they still have ultimate control over which held comments appear on their videos. Alternatively, creators can ignore held comments altogether if they prefer.