
Explained: Why Apple is delaying its software that scans for child abuse photos

While the move has been welcomed by child protection agencies, advocates of digital privacy and industry peers are raising red flags, suggesting that the technology could have broad-based ramifications for user privacy.

Written by Pranav Mukul, Edited by Explained Desk | New Delhi |
Updated: September 7, 2021 8:44:43 am
In this file photo, the logo of Apple is illuminated at a store in the city center in Munich, Germany (AP)

Following criticism from privacy advocates and industry peers, Apple has delayed the launch of its software that would detect photographs depicting child abuse on iPhones. The programme was announced last month and was slated for launch in the US later this year.

What is Apple’s software and how would it have worked?

Apple last month said it would roll out a two-pronged mechanism that scans photographs on its devices to check for content that could be classified as Child Sexual Abuse Material (CSAM). As part of the mechanism, Apple’s tool neuralMatch would check photos before they are uploaded to iCloud — its cloud storage service — and separately examine the content of messages sent on its end-to-end encrypted iMessage app. “The Messages app will use on-device machine learning to warn about sensitive content while keeping private communications unreadable by Apple,” the company had said.

neuralMatch would compare pictures against a database of known child abuse imagery, and when an image is flagged, Apple staff would manually review it. Once confirmed as child abuse material, the National Center for Missing and Exploited Children (NCMEC) in the US would be notified.
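
To illustrate the basic flow described above, here is a minimal sketch of hash-based matching. It assumes a hypothetical folder of photos and a hypothetical set of known image hashes; Apple's actual NeuralHash is a perceptual hash designed to survive resizing and re-encoding and is not public, so a simple SHA-256 digest is used here purely as a stand-in.

```python
# Illustrative sketch only: SHA-256 stands in for Apple's perceptual NeuralHash,
# and KNOWN_HASHES and the "photos" folder are hypothetical placeholders.
import hashlib
from pathlib import Path

# Hypothetical database of digests of known abuse imagery (in the real system,
# such hashes would be supplied by child-safety organisations like NCMEC).
KNOWN_HASHES = {
    "placeholder_digest_1",
    "placeholder_digest_2",
}

def image_digest(path: Path) -> str:
    """Return a hex digest of the file's raw bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flag_matches(photo_dir: Path) -> list[Path]:
    """Collect photos whose digest appears in the known-hash database."""
    flagged = []
    for photo in photo_dir.glob("*.jpg"):
        if image_digest(photo) in KNOWN_HASHES:
            flagged.append(photo)  # would be escalated for human review
    return flagged

if __name__ == "__main__":
    for match in flag_matches(Path("photos")):
        print(f"flagged for manual review: {match}")
```

Apple's published design adds safeguards this sketch omits, such as matching against a blinded hash database on the device and requiring a threshold of matches before anything is surfaced for human review.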

What were the concerns?

While the move has been welcomed by child protection agencies, advocates of digital privacy and industry peers are raising red flags, suggesting that the technology could have broad-based ramifications for user privacy. It is believed to be nearly impossible to build a client-side scanning system that is used only for sexually explicit images sent or received by children without such software being tweaked for other uses. The announcement had put the spotlight once again on governments and law enforcement authorities seeking a backdoor into encrypted services. Will Cathcart, head of the end-to-end encrypted messaging service WhatsApp, had said: “This is an Apple-built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable.”

Why has Apple backtracked?

In a statement, Apple said it would take more time to collect feedback and improve the proposed child safety features, following criticism of the system on privacy and other grounds from both inside and outside the company.

“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” it said.

According to Reuters, Apple had been playing defense on the plan for weeks and had already offered a series of explanations and documents to show that the risks of false detections were low.
