
Facebook says it removed millions of pieces of content on child nudity, terrorism and more


Facebook revealed that the same detection systems are used on Facebook as well as Instagram to detect and remove harmful content. (Image: Reuters)

Facebook removed millions of pieces of content in Q2 and Q3 2019 for violating its content policies, including its policies on suicide and self-injury, regulated goods, child nudity and the sexual exploitation of children, hate speech, and more.

The move comes as Facebook CEO Mark Zuckerberg faces criticism, including from the company’s own employees, over its political ads policy, which lets politicians lie in advertisements.

Apart from this, Facebook shared for the first time data on how it has enforced its content policies on Instagram. In the November 2019 edition of its Community Standards Enforcement Report, Facebook said the Instagram data covers four main areas: child nudity and child sexual exploitation; regulated goods, which include illicit firearm and drug sales; suicide and self-injury; and terrorist propaganda.

In Q3, Facebook removed close to 11.6 million posts for violating its policy on child nudity and the sexual exploitation of children, while 754,000 pieces of such content were removed from Instagram. Another 512,000 posts had been removed from Instagram in Q2.


In Q2 and Q3, 4.5 million pieces of content related to suicide and self-injury were removed from Facebook, while 1.7 million such posts were removed from Instagram. In the third quarter, Facebook also removed close to 841,000 posts of drug sale content and 2.3 million pieces of firearm sales content. On Instagram, the figures were 1.5 million pieces of drug sale content and 58,600 posts of firearm sales content.

Facebook’s VP of Integrity, Guy Rosen, said in the report that the same detection systems are used on both Facebook and Instagram to detect and remove harmful content, though different metrics are used across the two platforms. The social media company relies mainly on AI to tackle harmful content.

First published on: 14-11-2019 at 12:49:34 pm
