
Apple’s iOS 15.2 beta update adds Child Safety feature for Messages app

iOS 15.2 beta version: The latest update adds a Communication Safety feature for the Messages app, which, as the name implies, is aimed at keeping children safer online.

iOS 15.2 beta update brings a Safety feature for the Messages app (Image source: Apple)

Apple has started rolling out the iOS 15.2 beta update, which brings one of the Child Safety features that the company announced earlier this year, though with a slight modification. The latest update adds a Communication Safety feature for the Messages app, which as the name implies is aimed at keeping children safer online.

The new feature isn’t enabled by default; it has to be turned on manually in the Messages app. Once the feature is enabled, the app can reportedly detect nudity in images that are sent or received by children. If a nude image is sent to or received by a child, it will automatically be blurred and the child will see warnings about the content, as per a report by MacRumors.

The app will also reportedly offer resources and the option to contact someone the child trusts for help. If a child receives a nude image, the app will ask the child not to view the photo. It is worth noting that when Communication Safety was first announced, Apple said that if a child under the age of 13 viewed a nude image in Messages, their parents would get the option to receive a notification.

But Apple appears to have removed this notification option, as it could pose a risk to children caught in situations of parental violence or abuse. Instead, the feature now points children towards guidance from a trusted adult.


The company says that the Messages app analyses image attachments on the device to check for nudity, and that this will not affect user privacy as messages remain end-to-end encrypted. Apple still has no access to the messages.
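To make the described flow concrete, here is a minimal Swift sketch of how an opt-in, on-device blur-and-warn pipeline could look. Every name in it (IncomingImage, detectNudity, presentIncomingImage) is invented for illustration and is not Apple’s actual API; the classifier is a stub standing in for Apple’s private on-device model.

```swift
import Foundation

// Hypothetical sketch only: these types and functions are invented
// for illustration and are not Apple's actual APIs.

struct IncomingImage {
    let data: Data
}

enum SafetyVerdict {
    case safe
    case sensitive
}

// Stub standing in for Apple's private on-device classifier. Because
// the analysis runs locally, the image never leaves the device and the
// end-to-end encryption of the message itself is untouched.
func detectNudity(in image: IncomingImage) -> SafetyVerdict {
    // ...on-device model inference would happen here...
    return .sensitive
}

func presentIncomingImage(_ image: IncomingImage, communicationSafetyEnabled: Bool) {
    // The feature is opt-in: when disabled, images display normally.
    guard communicationSafetyEnabled else {
        print("Showing image as-is")
        return
    }
    switch detectNudity(in: image) {
    case .safe:
        print("Showing image as-is")
    case .sensitive:
        // Blur the image and warn the child before it can be viewed,
        // with a prompt pointing to help from a trusted adult.
        print("Image blurred: this photo may contain sensitive content.")
        print("You don't have to view it. Would you like to message someone you trust?")
    }
}

presentIncomingImage(IncomingImage(data: Data()), communicationSafetyEnabled: true)
```

The key design point, as Apple describes it, is that the decision to blur happens entirely on the child’s device, which is why the messages themselves can stay end-to-end encrypted.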

Apart from this, Apple announced one more safety feature a few months back: anti-CSAM (Child Sexual Abuse Material) detection. This is different from the Communication Safety feature and is expected to be rolled out in the future.

With this feature, the Cupertino giant aims to detect child sexual abuse imagery and trafficking in iCloud Photos. But the launch of this feature was delayed, as Apple said it would first address the complaints filed by privacy advocates. The anti-CSAM feature is designed to find child sexual abuse images by scanning a user’s iCloud Photos against a list of hashes of known CSAM, which raised privacy concerns. If the system detects enough matches, it alerts Apple’s moderators, who can then disable the account and report the images to legal authorities.
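For illustration, the threshold-matching idea can be sketched in a few lines of Swift. This is a loose, hypothetical model of matching uploads against known hashes; Apple’s real design uses its NeuralHash perceptual hash plus cryptographic private set intersection, neither of which is reproduced here, and every name below (PerceptualHash, perceptualHash(of:), MatchScanner) is made up for this example.

```swift
import Foundation

// Hypothetical sketch only, not Apple's NeuralHash or private set
// intersection protocol.

typealias PerceptualHash = UInt64

// Stand-in: a real perceptual hash is derived from image content so
// that visually similar images produce matching hashes.
func perceptualHash(of imageData: Data) -> PerceptualHash {
    PerceptualHash(truncatingIfNeeded: imageData.hashValue)
}

struct MatchScanner {
    let knownHashes: Set<PerceptualHash>  // hashes of known CSAM
    let reviewThreshold: Int              // matches before human review

    // Flags an account for review only once enough uploads match the
    // known-hash list; a single match is never enough on its own.
    func shouldFlagForReview(uploads: [Data]) -> Bool {
        let matches = uploads
            .map(perceptualHash(of:))
            .filter(knownHashes.contains)
            .count
        return matches >= reviewThreshold
    }
}

// Apple publicly described a review threshold of roughly 30 matches.
let scanner = MatchScanner(knownHashes: [0x1234_ABCD], reviewThreshold: 30)
print(scanner.shouldFlagForReview(uploads: [Data()]))  // false: below threshold
```

The threshold is what separates this scheme from per-photo surveillance: no single match is visible to Apple, and human review is triggered only once an account crosses the match count.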

First published on: 12-11-2021 at 11:04:28 am