Published: 20th November 2020
Only 10 or 11 of every 10,000 content views on Facebook are hate speech: Report
In its Community Standards Enforcement Report for the September 2020 quarter, Facebook said it is including the prevalence of hate speech on its platform globally "for the first time"
Social media giant Facebook has for the first time disclosed the prevalence of hate speech on its platform, saying that out of every 10,000 content views in the third quarter, 10-11 were hate speech.
Facebook, which has 1.82 billion daily users globally, has drawn flak in the past for its handling of hate speech on the platform in India, which is among its biggest markets.
In its Community Standards Enforcement Report for the September 2020 quarter, Facebook said it is including the prevalence of hate speech on its platform globally "for the first time".
"In Q3 2020, hate speech prevalence was 0.10 per cent, 0.11 per cent or 10 to 11 views of hate speech for every 10,000 views of content," it added.
Facebook said that, owing to its investments in artificial intelligence, it has been able to remove more hate speech and find more of it proactively before users report it.
"Our enforcement metrics this quarter, including how much hate speech content we found proactively and how much content we took action on, indicate that we're making progress in catching harmful content," it added.
Prevalence, by contrast, estimates the percentage of content views on its platform in which people see violating content, Facebook explained.
During the third quarter, Facebook took action on 22.1 million pieces of hate speech content, about 95 per cent of which was proactively identified.
On Instagram, the company took action on 6.5 million pieces of hate speech content (up from 3.2 million in June quarter), about 95 per cent of which was proactively identified (up from about 85 per cent in the previous quarter), it added.
The latest Community Standards Enforcement Report provides metrics on how Facebook enforced its policies from July to September, and includes metrics across 12 policies on Facebook and 10 policies on Instagram.
Facebook Vice President (Integrity) Guy Rosen said the company is also updating its Community Standards website to include additional policies that require more context and can't always be applied at scale.
These policies often require specialised teams to gather more information on a given issue in order to make decisions, he added.
The Community Standards Enforcement Report is published in conjunction with Facebook's biannual Transparency Report.
The Transparency Report shares numbers on government requests for user data, content restrictions based on local law, intellectual property takedowns and internet disruptions.
During the first six months of 2020, government requests for user data increased by 23 per cent from 140,875 to 173,592, it said.
Of the total volume, the US continues to submit the largest number of requests, followed by India, Germany, France, and the UK, it added.
During the period, the volume of content restrictions based on local law increased globally by 40 per cent from 15,826 to 22,120.
The increase was partly driven by COVID-19-related restrictions, it said.
Also, in the first half of 2020, the company identified 52 disruptions of Facebook services in nine countries, compared to 45 disruptions in six countries in the second half of 2019.