
Snapchat is getting a set of new features aimed at protecting teenagers from potential online risks. The company says it will implement a new strike system and detection technologies to remove accounts that market or promote inappropriate content to teens.
As part of these efforts, the upcoming features will protect teens from being contacted by people they haven’t interacted with in real life and will provide a more age-appropriate content viewing experience.
Snap Inc., the parent company of Snapchat, says a pop-up warning will appear when someone a teenager shares no mutual contacts with, or doesn’t know, tries to add them. The app will also offer a quick way for teens to report or block strangers.
The platform already requires 13- to 17-year-olds to have several mutual friends with someone before they can add them. Snapchat is now raising that bar, requiring teenagers to have more friends in common before they can add a stranger, a measure meant to keep them safe from content involving violence, self-harm, misinformation, sexual exploitation, and pornography.
Snapchat’s new ‘Strike System’ immediately removes age-inappropriate content once it is detected or reported. The company is also introducing new in-app content addressing issues like responsible sharing, online safety, and mental health. Developed in partnership with ‘Young Leaders for Active Citizenship’, this content will be featured in the ‘Stories’ section. These features will roll out to everyone in the coming weeks.
For parents, Snapchat has started a YouTube series that explains how their kids can stay safe on the platform. In the last few months, the company has also added several generative AI-powered features, including the My AI chatbot, new lenses, the ability to try on outfits using AR, and more.