The pandemic has pushed millions of children and teenagers online like never before. While they are benefitting from the advantages of online classes and interactive sessions at a time when there is still no clarity on when schools will be back to normal, it has also exposed them to the dark side of the world wide web. Now, Google has announced a set of measures to keep users under the age of 18 safe online.
Google is working on multiple fronts to keep children and teenagers safe on its platforms. For one, it is giving minors ‘more controls over their digital footprint’. It is also ‘tailoring product experiences’ for this segment, and has announced advertising changes as well as digital well-being tools.
In a significant move, Google has announced that under its new policy, users under the age of 18, as well as their parents or guardians, will be able to request the removal of their photos from its image results. While this will only remove the image from the search results and not from the website hosting it, it should go a long way in reducing a minor’s exposure online.
On YouTube, any video uploaded by a minor will now be in private mode by default. Along with this, YouTube will bring in more digital well-being content for this segment, while also teaching them about commercial content on the platform.
When it comes to Google Search, ‘SafeSearch’, which filters out explicit results, will be turned on by default for all users under the age of 18. At present, this is the case only for users under 13. It will also be turned on by default for school accounts on Google Workspace.
With Google Assistant, minor users will have ‘default protections’ to prevent them from encountering mature content, including on smart displays.
Moreover, minors will soon find that their location history is turned off by default, with no option to turn it on without parental supervision.
For apps, Google Play will let parents know which apps follow its family policies, making it easier for them to decide whether an app is appropriate for their children.
Also, age-sensitive ads will no longer be shown to teens, and advertisers will no longer be able to target users under the age of 18 on the basis of age, gender or location.
Along with new Digital Wellbeing tools that allow users to filter out content from news, podcasts and the web, Google will also be rolling out easy-to-understand material for minors to figure out the best data practices.
The search giant and other big tech companies have been under pressure to put in place better protection for minor users.
In the US, the 1998 Children’s Online Privacy Protection Act, which restricts the tracking and targeting of children under the age of 13, could soon be updated to cover those under 18. The Google update seems to align with this. Incidentally, in 2019, Google had to pay a $170-million fine for violating this very Act by collecting children’s data without parental consent.
Facebook-owned Instagram has recently brought in changes that make many features private by default for teenagers. Like Google, Instagram too has banned ads targeted at the interests or activity of children, though it still allows targeting by age, gender and location.