Facebook's latest Transparency Report for the first half of 2019 shows that the company removed millions of user posts in the past six months for violating its terms of service regarding child pornography, drug sales and terrorism.
The report initially focused on the nature and extent of government requests Facebook receives for user data.
In the first half of 2019, government requests to Facebook for user data increased by 16% from 110,634 to 128,617. Of the total volume, the US continues to submit the largest number of requests, followed by India, the UK, Germany and France.
In the US, Facebook received 50,741 requests, a 23% increase over the previous half, which is consistent with trends over time. Of all US requests, 66% included a non-disclosure order prohibiting Facebook from notifying the user. In addition, as a result of transparency updates introduced in the 2016 USA Freedom Act, the US government lifted the non-disclosure orders on 11 National Security Letters (NSLs) Facebook received between 2014 and 2018. These requests, along with the US government’s authorization letters, are now available. Lastly, Facebook says it has completed the internal review of its US national security reporting metrics and continues to review its systems to ensure its accounting is consistent.
When content is reported as violating local law, but doesn’t go against Facebook's Community Standards, the company may limit access to that content in the country where it is allegedly illegal. During this reporting period, the volume of content restrictions based on local law decreased globally by 50%, from 35,972 to 17,807. This decrease follows an unusual spike last half, when Facebook restricted 16,600 items in India based on a Delhi High Court order. Of the total volume, 58% of restrictions originated from Pakistan and Mexico.
Facebook also monitors and reports on the number of deliberate internet disruptions caused by governments around the world. During this reporting period, Facebook identified 67 disruptions of its services in 15 countries, compared to 53 disruptions in nine countries in the second half of 2018.
Facebook continues to report on the volume and nature of copyright, trademark and counterfeit reports Facebook receives each half — as well as the amount of content affected by those reports. During this reporting period, Facebook took down 3,234,393 pieces of content based on 568,836 copyright reports, 255,222 pieces of content based on 96,501 trademark reports and 821,727 pieces of content based on 101,582 counterfeit reports.
Other highlights from the report:
- Facebook removed 11.6 million pieces of content related to child pornography in the quarter ended in September. Facebook says its algorithms identified 99% of that content. Instagram removed another 754,000 pieces of content, with an automatic detection rate of just under 95%. By comparison, in the first quarter, Facebook removed just 5.8 million pieces of content related to child porn or exploitation.
- Facebook removed 4.4 million pieces of content related to drug sales in the third quarter, and another 2.3 million related to firearm sales. That was up from 841,000 and 609,000 pieces respectively six months earlier.
- Facebook said that 80% of the hate speech it removed from the service in the third quarter was detected by its software systems. That’s up from 68% of the hate speech removed in the first three months of the year.
- Terrorism content is slightly harder to identify on Instagram than on Facebook. Facebook proactively identified 98.5% of all terrorism content, including 99% of content related to Al Qaeda and ISIS. Instagram removed 92.2% of terrorism content using software algorithms.
- Facebook also said it removed 1.7 billion fake accounts in the third quarter, 500 million fewer than the record 2.2 billion it eliminated in the first quarter. The company attributes the decline to better preventative measures, which it says block “millions of attempts to create fake accounts” every day.
You can see the full report for more information.