Apple to check iPhones for child abuse pics; a ‘backdoor’, claim digital privacy bodies

As part of the mechanism, Apple’s tool neuralMatch will check for photos before they are uploaded to iCloud — its cloud storage service — and examine the content of messages sent on its end-to-end encrypted iMessage app.

(Image source: Apple)

Apple is rolling out a two-pronged mechanism that scans photographs on its devices to check for content that could be classified as Child Sexual Abuse Material (CSAM). While the move is being welcomed by child protection agencies, advocates of digital privacy and industry peers are raising red flags, suggesting the technology could have broad ramifications for user privacy.

As part of the mechanism, Apple’s tool neuralMatch will check for photos before they are uploaded to iCloud — its cloud storage service — and examine the content of messages sent on its end-to-end encrypted iMessage app. “The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple,” the company said.

neuralMatch will compare the pictures with a database of known child abuse imagery; when a match is flagged, Apple’s staff will manually review the images. Once the content is confirmed as child abuse imagery, the National Center for Missing and Exploited Children (NCMEC) in the US will be notified. At a briefing on Friday, a day after its initial announcement of the project, the Cupertino-based tech major said it will roll out the system for checking photos for child abuse imagery “on a country-by-country basis, depending on local laws”.
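Very loosely, the matching step described above amounts to computing a fingerprint of each photo and looking it up in a set of fingerprints of known imagery before the upload proceeds. The sketch below is an illustrative assumption only: Apple’s actual system uses a perceptual hash (so visually similar images still match) plus cryptographic threshold techniques, whereas this toy version uses a plain SHA-256 digest and made-up function names.

```python
import hashlib

def image_fingerprint(data: bytes) -> str:
    """Digest of the image bytes (a stand-in for a perceptual hash)."""
    return hashlib.sha256(data).hexdigest()

def check_before_upload(image_bytes: bytes, known_hashes: set) -> bool:
    """Return True if the image matches the database and should be flagged
    for manual review; False means the upload proceeds unflagged."""
    return image_fingerprint(image_bytes) in known_hashes

# Illustrative use: a one-entry "database" built from a sample image.
sample = b"example-image-bytes"
database = {image_fingerprint(sample)}

print(check_before_upload(sample, database))   # match -> flagged for review
print(check_before_upload(b"other", database)) # no match -> not flagged
```

Note that an exact cryptographic hash like SHA-256 fails if a single pixel changes, which is precisely why a real system of this kind relies on perceptual hashing; the point of contention raised by critics is that the same lookup architecture works for any database of fingerprints, not only CSAM.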

However, this move is being seen as building a backdoor into encrypted messages and services. In a blog post, California-based non-profit Electronic Frontier Foundation noted: “Child exploitation is a serious problem, and Apple isn’t the first tech company to bend its privacy-protective stance in an attempt to combat it. But that choice will come at a high price for overall user privacy. Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor”.


The non-profit added that it was “impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children”. “That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change”.

In its statement, Apple has noted that the programme is “ambitious” and “these efforts will evolve and expand over time”.
Apple’s move has once again put the spotlight on governments and law enforcement authorities seeking backdoors into encrypted services, and experts are watching for signs that Apple has fundamentally changed direction from its stance as an upholder of user privacy rights.

Less than a year ago, Reuters had reported that the company was working to make iCloud backups end-to-end encrypted, a move that would have meant the device maker could not turn over readable versions of them to law enforcement. The plan was, however, dropped after the FBI objected. The latest project is being seen as coming almost full circle, with the proposed system potentially setting the stage for the monitoring of different types of content on iPhone handsets.
Criticising Apple’s decision, Will Cathcart, head of Facebook-owned messaging service WhatsApp, said in a tweet: “I read the information Apple put out yesterday and I’m concerned. I think this is the wrong approach and a setback for people’s privacy all over the world. People have asked if we’ll adopt this system for WhatsApp. The answer is no”.


“This is an Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable,” he argued.

Globally, Apple has around 1.3 billion iMessage users, of which 25-30 million are estimated to be in India, while WhatsApp has two billion global users, around 400 million of which are from India.

This also comes in the wake of the Pegasus scandal, in which Israeli private cyber-offensive company NSO Group exploited loopholes in apps such as iMessage and WhatsApp to give its government customers access to the devices of their targets by installing spyware. These targets included human rights activists, journalists, political dissidents, constitutional authorities and even heads of government.


In India, through the IT Intermediary Guidelines, the government has sought traceability of the originator of certain messages or posts on significant social media intermediaries. While companies like WhatsApp have opposed traceability, experts suggest that Apple’s decision could set a precedent for giving the government entry into encrypted communication systems.

First published on: 08-08-2021 at 00:50 IST