For years Facebook's work has included using photo-matching technology to stop people from sharing known child exploitation images, reporting violations to the National Center for Missing and Exploited Children (NCMEC), requiring children to be at least 13 to use the company's services, and limiting the people that teens can interact with after they sign up.
The company is exploring applying the same technology to its Instagram app.
In addition to photo-matching technology, Facebook says it is using artificial intelligence and machine learning to proactively detect child nudity and previously unknown child exploitative content when it is uploaded.
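The photo-matching approach described above amounts to checking each upload against a database of fingerprints of known violating images. The sketch below is illustrative only and not Facebook's implementation: production systems use perceptual hashes (such as Microsoft's PhotoDNA) that still match after resizing or re-encoding, whereas this simplified version uses an exact cryptographic hash, and `KNOWN_HASHES` is a hypothetical example set.

```python
import hashlib

# Hypothetical database of hashes of known violating images.
# (This entry is the SHA-256 of the bytes b"test", used purely
# as a stand-in for demonstration.)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def image_hash(image_bytes: bytes) -> str:
    """Return a hex digest identifying the image's exact bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_violation(image_bytes: bytes) -> bool:
    """Check an upload's hash against the database of known images."""
    return image_hash(image_bytes) in KNOWN_HASHES
```

A real deployment would swap the exact hash for a perceptual one, so that trivially altered copies of a known image still match.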
Facebook's Community Standards ban child exploitation. To avoid even the potential for abuse, the company says it takes action on nonsexual content as well, like seemingly benign photos of children in the bath. With this approach, in the last quarter alone, Facebook says it removed 8.7 million pieces of content on Facebook that violated the company's child nudity or sexual exploitation of children policies, 99% of which was removed before anyone reported it. Facebook also removes accounts that promote this type of content. Trained teams with backgrounds in law enforcement, online safety, analytics, and forensic investigations review content and report findings to NCMEC. In turn, NCMEC works with law enforcement agencies around the world to help victims, and Facebook is helping the organization develop new software to prioritize the reports it shares with law enforcement so the most serious cases are addressed first.
Facebook also collaborates with other safety experts, NGOs, and companies to disrupt and prevent the sexual exploitation of children across online technologies.
Under pressure from regulators and lawmakers, Facebook has vowed to speed up removal of extremist and illicit material.