
Apple says feature to find child images doesn’t create backdoor

Apple Inc pushed back against concerns about its upcoming child safety features, saying it doesn’t believe its tool for locating child pornographic images on a user’s device creates a backdoor that reduces privacy

By: Bloomberg |
Updated: August 7, 2021 11:46:10 am
Apple said it isn’t breaking end-to-end encryption with a new feature in the Messages app that analyzes photos sent to or from a child’s iPhone for explicit material, nor will the company gain access to user messages. (Image source: REUTERS/Mike Segar/File Photo)

Apple Inc pushed back against concerns about its upcoming child safety features, saying it doesn’t believe its tool for locating child pornographic images on a user’s device creates a backdoor that reduces privacy. The Cupertino, California-based technology giant made the comments in a briefing Friday, a day after revealing new features for iCloud, Messages and Siri to combat the spread of sexually explicit images of children.

The company reiterated that it doesn’t scan a device owner’s entire photo library to look for abusive images, but instead uses cryptography to compare images with a known database provided by the National Center for Missing and Exploited Children.
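In outline, that approach can be sketched as follows. This is purely illustrative: Apple’s actual system uses NeuralHash, a perceptual hashing scheme combined with cryptographic matching techniques, not the SHA-256 stand-in below, and the database contents here are invented placeholders.

```python
import hashlib

# Hypothetical database of hashes of known images (placeholder values;
# the real database comes from the National Center for Missing and
# Exploited Children, and the real hash is perceptual, not SHA-256).
KNOWN_HASHES = {hashlib.sha256(b"known-abusive-image").hexdigest()}

def matches_known_database(image_bytes: bytes) -> bool:
    """Return True only if this image's hash appears in the known database."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

print(matches_known_database(b"known-abusive-image"))    # True: in database
print(matches_known_database(b"family-vacation-photo"))  # False: never flagged
```

The point of the design is visible in the sketch: only images whose hashes match the database register at all; the rest of the photo library is never inspected.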

Some privacy advocates and security researchers were concerned after Apple’s announcement that the company would scan a user’s complete photo collection. Instead, the company uses an on-device algorithm to detect the sexually explicit images. Apple said it would manually review abusive photos from a user’s device only if the algorithm found a certain number of them. The company also said it can adjust the algorithm over time.
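The threshold step Apple describes can be sketched in a few lines. Apple did not disclose the actual threshold at the briefing, so the value below is a made-up placeholder.

```python
# Hypothetical threshold: manual review happens only once the on-device
# algorithm has found a certain number of matches. The real number was
# not disclosed; 30 here is purely illustrative.
REVIEW_THRESHOLD = 30

def should_trigger_review(match_count: int) -> bool:
    """Flag an account for human review only past the match threshold."""
    return match_count >= REVIEW_THRESHOLD

print(should_trigger_review(5))   # False: below threshold, nothing reviewed
print(should_trigger_review(30))  # True: threshold reached
```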

Apple said it isn’t breaking end-to-end encryption with a new feature in the Messages app that analyzes photos sent to or from a child’s iPhone for explicit material, nor will the company gain access to user messages. Asked during the briefing whether the new tools mean the company will add end-to-end encryption to iCloud storage backups, Apple said it wouldn’t comment on future plans. End-to-end encryption, the most stringent form of privacy, lets only the sender and receiver see a message sent between them.

On Thursday, the Electronic Frontier Foundation said Apple is opening a backdoor to its highly touted privacy features for users with the new tools. “It’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children,” the EFF said in a post on its website. “As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses.”

Apple said the system had been in development for years and wasn’t built for governments to monitor citizens. The system is available only in the US, Apple said, and only works if a user has iCloud Photos enabled.

Dan Boneh, a cryptography researcher tapped by Apple to support the project, defended the new tools. “This issue affects many cloud providers,” he said. “Some cloud providers address this problem by scanning photos uploaded to the cloud. Apple chose to invest in a more complex system that provides the same functionality, but does so without having its servers look at every photo.”
