The enforcement of a new law to promote online safety of children in the United Kingdom has attracted widespread criticism from politicians, tech companies, digital rights advocacy groups, free-speech campaigners, and content creators, among others.
Certain provisions of the UK’s Online Safety Act (OSA) took effect on July 25. These provisions require companies behind websites that are accessible in the UK to prevent under-18 users from seeing harmful content, including pornography and material related to self-harm, eating disorders, or suicide. They also require platforms to give minors age-appropriate experiences of other types of content, such as bullying and abusive or hateful material.
To comply with these provisions of the OSA and stay online in the country, platforms have implemented measures to verify the ages of users on their services. These include social media platforms Reddit, Bluesky, Discord, and X; porn websites like Pornhub and YouPorn; and streaming services like Spotify, which is requiring users to submit face scans to access explicit content.
In response, VPN (Virtual Private Network) apps have become the most downloaded apps on Apple’s App Store in the UK over the past few weeks. Proton VPN experienced a 1,800 per cent spike in UK daily sign-ups, according to a report by the BBC.
The UK is among the first major democracies, after Australia, to impose such strict content controls on tech companies. It has therefore become a closely watched test case that could influence online safety regulation in other countries, including India.
“Since 25th July, users in the UK have certainly experienced a different version of the internet than the one they were previously used to,” Paige Collings, Senior Speech and Privacy Activist at the Electronic Frontier Foundation (EFF), told The Indian Express.
“The OSA was first introduced in 2017 and politicians debated this legislation for more than four years and under four different Prime Ministers. Throughout this time, experts from across civil society, academia, and the corporate world flagged concerns about the impact of this law on both adults’ and children’s rights, but politicians in the UK decided to push ahead and enact one of the most contentious age verification mandates that we’ve seen,” she added.
In an attempt to make the UK the ‘safest place’ in the world to be online, the Online Safety Act was signed into law in 2023. The sweeping legislation includes provisions that place the burden on social media platforms and search services to take down illegal content as well as adopt transparency and accountability measures.
However, according to the British government’s own website, the most stringent provisions in the OSA are aimed at enhancing the online safety of children.
These provisions apply to any website that “is likely to be accessed by children”, even if the companies that own these sites are located outside the country. Companies had until April 16 to assess and determine if their websites were likely to be accessed by children based on guidance published by the Office of Communications (Ofcom), which is the regulator overseeing the implementation of OSA. The deadline for companies to complete their assessment of the risk of harm to children was July 24, 2025.
Sites that fall within the scope of the Act must take steps to prevent under-18 users from seeing harmful content which is defined in three categories, as per the OSA:
– Primary priority content: pornographic content; content which encourages, promotes, or provides instructions for suicide, self-harm, or an eating disorder (or behaviours associated with an eating disorder).
– Priority content: bullying content; abusive or hateful content; content which depicts or encourages serious violence or injury; content which encourages dangerous stunts and challenges; and content which encourages the ingestion of, inhalation of, or exposure to harmful substances.
– Non-designated content: any type of content that presents a material risk of significant harm to an appreciable number of children in the UK, as long as the harm does not stem from the content’s potential financial impact, the safety or quality of goods featured in the content, or the way in which a service featured in the content may be performed.
Online service providers in scope of the Act can address these risks by implementing a number of measures, which include, but are not limited to:
– Robust age checks: Services must use “highly effective age assurance to protect children from this content. If services have minimum age requirements and are not using highly effective age assurance to prevent children under that age using the service, they should assume that younger children are on their service and take appropriate steps to protect them from harm.”
– Safer algorithms: Services “will be expected to configure their algorithms to ensure children are not presented with the most harmful content and take appropriate action to protect them from other harmful content.”
– Effective moderation: All services “must have content moderation systems in place to take swift action against content harmful to children when they become aware of it.”
The consequences of non-compliance may be severe. Online platforms found in breach of the new rules face penalties of up to £18 million or 10 per cent of global turnover, whichever is greater. In the most serious cases, senior managers may also face criminal liability.
Ofcom on Thursday, July 30, launched probes into the compliance of at least four companies that collectively own and operate 34 pornography sites, according to a report by Reuters.
In March and April, the regulator sent letters of enforcement to three websites operating outside the UK, demanding that these sites conduct risk assessments and provide a report to Ofcom about the presence of CSAM (child sexual abuse material), hate speech, and other types of illegal content on their platforms, as per Politico.
One of these websites is a far-right platform called Gab that has gone completely dark in the UK to avoid financial and criminal penalties. Kiwi Farms, an online forum that has been criticised for enabling harassment and cyberstalking, is also on the list.
But a growing number of people are also turning to VPNs in order to evade the strict content controls and age checks put in place by websites. VPNs encrypt a smartphone or PC’s internet traffic and route it through a server, often in another country, masking the user’s location so that websites cannot tell they are browsing from the UK. Apps offering VPN services made up half of the top 10 most popular free apps on the UK’s App Store for iOS last week.
Since the new age verification rules took effect, Proton has recorded a 1,800 per cent increase in daily sign-ups while Nord said there has been a 1,000 per cent increase in purchases of VPN subscriptions from users in the UK. “We would normally associate these large spikes in sign-ups with major civil unrest. This clearly shows that adults are concerned about the impact universal age verification laws will have on their privacy,” Proton said.
“Until now, kids could easily stumble across porn and other online content that’s harmful to them without even looking for it. Age checks will help prevent that. We’re now assessing compliance to make sure platforms have them in place, and companies that fall short should expect to face enforcement action,” Ofcom said earlier this week. Over eight per cent of children between the ages of 8 and 14 had visited a porn site or app over a month-long period, as per an Ofcom study.
The regulator also acknowledged that age checks were not a silver bullet. Oliver Griffiths, Ofcom group director for online safety, said the age verification rules were not foolproof, comparing them to age restrictions on alcohol, which under-18s are sometimes able to circumvent in UK stores, as per the Financial Times.
“There are opportunities for people to use VPNs, but this is part of a broader system approach,” Griffiths added.
In 2017, 14-year-old Molly Russell died by suicide, and the coroner’s report found that she had died from an act of self-harm while suffering from depression and “the negative effects of online content.” The ruling was the first of its kind, establishing a direct link between online content and a child’s suicide, and is credited with spurring much of the momentum behind the Online Safety Act.
The Molly Rose Foundation, a charity established by the family of the British teenager, has argued that the OSA does not go far enough to prevent more deaths like Molly’s. It has called for stronger measures such as requiring platforms to proactively search for, and take down, depressive and body image-related content.
The codes of practice for protection of children under the OSA have been met with pushback from all quarters. “Since the rollout of age checks this past Friday, we’ve seen a renewed energy of contempt and frustration that UK politicians—with Ofcom—decided to continue on this path,” Collings said.
“As we’ve been saying for many years now, this approach is reckless, short-sighted, and will introduce more harm to the very children that it is supposed to protect. More than 400,000 people in the UK have also spoken up by signing a petition calling for the repeal of the OSA. Internet users and tech policy experts are all saying the same thing, and we urge the UK government to listen,” she said.
Talking about the enterprise cost of complying with the OSA, Collings said that the “situation is certainly worse for smaller and independent providers, who are forced into making a decision on whether to implement age verification measures, block users in the UK, or shut down completely.”
Nigel Farage, leader of the far-right Reform UK party, has also argued that the OSA should be repealed. However, the British government has indicated that it has no plans to do so.
The Wikimedia Foundation, the non-profit entity behind Wikipedia, has filed a legal challenge against certain provisions of the OSA, arguing that the regulations endanger the site and the global community of volunteer contributors who create the information on the site.
Elon Musk-owned X has said that the OSA encourages over-censorship by threatening platforms with fines and setting timetables that are unnecessarily tight. “When lawmakers approved these measures, they made a conscientious decision to increase censorship in the name of ‘online safety’. It is fair to ask if UK citizens were equally aware of the trade-off being made,” the social media platform said in a statement.
Advocating for a balanced approach that protects liberty, encourages innovation and safeguards children, X said, “It’s safe to say that significant changes must take place to achieve these objectives in the UK.”
When asked about a viable path forward beyond age-checks, Collings said, “The scramble to find an effective age verification method shows us that there isn’t one, and the case of safety online is not solved through technology alone. Children deserve a more intentional and holistic approach to protecting their safety and privacy online, where the specific harms have intentional and targeted approaches rather than a catch all legislation that lets an abundance of harms slip through the net.”
Amid US President Donald Trump’s tariff threats, there have also been reports speculating that 10 Downing Street may look to use the online safety rules as a bargaining chip to secure a new digital trade partnership with Washington.
However, UK technology secretary Peter Kyle has insisted that the rules are not up for negotiation.
The UK Department for Science, Innovation and Technology further said it expected the new rules to be “robustly implemented by tech companies.” “Platforms have clear legal obligations and must actively prevent children from circumventing safety measures, including blocking content that promotes VPNs or other workarounds targeting young users,” it added.