
Explained: Why Apple is delaying its software that scans for child abuse photos

While the move has been welcomed by child protection agencies, advocates of digital privacy and industry peers are raising red flags, suggesting that the technology could have broad-based ramifications for user privacy.

In this file photo, the logo of Apple is illuminated at a store in the city center in Munich, Germany (AP)

Following criticism from privacy advocates and industry peers, Apple has delayed the launch of its software that would detect photographs depicting child abuse on iPhones. The programme was announced last month and was slated for launch in the US later this year.

What is Apple’s software and how would it have worked?

Apple last month said it would roll out a two-pronged mechanism that scans photographs on its devices to check for content that could be classified as Child Sexual Abuse Material (CSAM). As part of the mechanism, Apple’s tool neuralMatch would check photos before they are uploaded to iCloud — its cloud storage service — and examine the content of messages sent on its end-to-end encrypted iMessage app. “The Messages app will use on-device machine learning to warn about sensitive content while keeping private communications unreadable by Apple,” the company had said.

neuralMatch would compare the pictures with a database of known child abuse imagery, and when an image is flagged, Apple’s staff would manually review it. Once child abuse is confirmed, the National Center for Missing and Exploited Children (NCMEC) in the US would be notified.
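In broad strokes, this kind of client-side detection fingerprints each photo and checks that fingerprint against a list of hashes of known abusive images maintained by a clearing house such as NCMEC, escalating to human review only after matches are found. The short Python sketch below is purely illustrative and is not Apple’s implementation: the function names, the sample hash, the match threshold and the use of a plain cryptographic hash are assumptions made for readability, whereas Apple’s proposal reportedly relied on a perceptual “NeuralHash” and cryptographic protocols designed to keep non-matching photos private.

# Illustrative sketch only — NOT Apple's system. Real deployments use
# perceptual hashes (so resized or re-encoded copies still match) and
# cryptographic matching so the device never learns the hash list and
# the server learns nothing about non-matching photos.
import hashlib
from pathlib import Path

# Hypothetical fingerprints of known abusive images, as they might be
# supplied by a clearing house such as NCMEC (placeholder value).
KNOWN_CSAM_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

MATCH_THRESHOLD = 3  # escalate only after several independent matches


def fingerprint(photo_path: Path) -> str:
    # A real system would use a perceptual hash rather than SHA-256.
    return hashlib.sha256(photo_path.read_bytes()).hexdigest()


def queue_for_human_review(matches: list[Path]) -> None:
    # Stand-in for the manual-review step described in the article.
    print(f"{len(matches)} photo(s) flagged for manual review")


def scan_before_upload(photo_paths: list[Path]) -> None:
    # Check photos queued for cloud upload against the known-image list.
    matches = [p for p in photo_paths if fingerprint(p) in KNOWN_CSAM_HASHES]
    if len(matches) >= MATCH_THRESHOLD:
        queue_for_human_review(matches)

A threshold of the kind sketched above is one common way to reduce the chance that a single false match triggers a report, the sort of false-detection risk Apple’s own documents sought to address.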

What were the concerns?

While the move has been welcomed by child protection agencies, advocates of digital privacy and industry peers are raising red flags, suggesting that the technology could have broad-based ramifications for user privacy. It is believed to be nearly impossible to build a client-side scanning system that is used only for sexually explicit images sent or received by children, without such software eventually being tweaked for other uses. The announcement had put the spotlight once again on governments and law enforcement authorities seeking a backdoor into encrypted services. Will Cathcart, head of the end-to-end encrypted messaging service WhatsApp, had said: “This is an Apple-built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable.”


Why has Apple backtracked?

In a statement, Apple said it would take more time to collect feedback and improve the proposed child safety features, following criticism of the system on privacy and other grounds from both inside and outside the company.

“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” it said.

According to Reuters, Apple had been playing defense on the plan for weeks and had already offered a series of explanations and documents to show that the risks of false detections were low.



First published on: 06-09-2021 at 12:51:20 pm