The Department for Science, Innovation and Technology has released updated guidance on how new child protection laws under the Online Safety Act will operate, as key provisions officially take effect across the United Kingdom.
The reforms, described by Technology Secretary Peter Kyle as “the most significant step forward in child safety since the internet was created,” impose new legal duties on digital platforms to restrict underage access to harmful content and ensure safer online experiences for children.
Under the new regulations, websites hosting pornography or other high-risk content—including material related to self-harm, suicide, and eating disorders—must now implement robust age verification systems to ensure under-18s cannot view it. Acceptable methods identified in Ofcom's guidance include:

- photo-ID matching
- facial age estimation
- age checks run through mobile network operators
- credit card checks
- digital identity services
- email-based age estimation
- open banking

Simple self-declaration of age is no longer deemed sufficient.
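To illustrate how a platform-side gate consistent with these rules might look, here is a minimal Python sketch; the AgeCheckResult type, the accepted-method list, and may_serve_restricted_content are illustrative assumptions rather than anything specified in the Act or Ofcom's guidance:

```python
# Illustrative age gate: restricted content is served only when a robust,
# third-party age check has confirmed the user is 18 or over. All names
# here are hypothetical; real platforms integrate dedicated age assurance
# providers.
from dataclasses import dataclass
from typing import Optional

@dataclass
class AgeCheckResult:
    method: str    # e.g. "photo_id", "facial_estimation", "credit_card"
    over_18: bool  # the only attribute the platform needs to retain

# A subset of the methods Ofcom treats as capable of being highly effective.
ACCEPTED_METHODS = {"photo_id", "facial_estimation", "mobile_operator",
                    "credit_card", "digital_identity", "open_banking"}

def may_serve_restricted_content(result: Optional[AgeCheckResult]) -> bool:
    """Return True only when an accepted method confirmed the user is 18+."""
    if result is None or result.method not in ACCEPTED_METHODS:
        return False  # no check at all, or a self-declared/unknown method
    return result.over_18

# Self-declared ages no longer unlock restricted material:
assert not may_serve_restricted_content(AgeCheckResult("self_declaration", True))
assert may_serve_restricted_content(AgeCheckResult("photo_id", True))
```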
Research from Ofcom found that children as young as eight have accessed online pornography, and that 16% of teenagers had encountered content promoting disordered eating within the previous month.
Platforms that do not enforce these measures risk enforcement action from Ofcom, including fines of up to £18 million or 10% of global turnover, whichever is greater.
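For a sense of scale, the cap is a simple maximum of the two figures. The short sketch below is a hypothetical helper, not Ofcom's own methodology, which turns on "qualifying worldwide revenue" as defined in the Act:

```python
# Maximum penalty under the Act: the greater of £18 million or 10% of
# worldwide turnover. max_penalty() is an illustrative helper only.
def max_penalty(global_turnover_gbp: float) -> float:
    return max(18_000_000, 0.10 * global_turnover_gbp)

print(f"£{max_penalty(50_000_000):,.0f}")     # £18,000,000 (the floor applies)
print(f"£{max_penalty(1_000_000_000):,.0f}")  # £100,000,000 (10% of turnover)
```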
Beyond content access, the rules also bar digital platforms from enabling direct messaging between adult strangers and children. Recommendation systems must not promote accounts for children to connect with adults they do not know.
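In code, that recommendation rule reduces to filtering a candidate list by the viewer's age and existing relationships. The following Python sketch is illustrative only; the User type and connection model are assumptions, not anything platforms are required to implement this way:

```python
# Illustrative sketch of the connection-recommendation rule: when the
# viewing account belongs to a child, adult accounts with no existing
# relationship to the child are removed from suggestions.
from dataclasses import dataclass, field

@dataclass
class User:
    user_id: str
    is_minor: bool
    connections: set = field(default_factory=set)  # ids of known contacts

def filter_suggestions(viewer: User, candidates: list) -> list:
    """Drop unknown adults from a child's 'accounts to follow' feed."""
    if not viewer.is_minor:
        return candidates
    return [c for c in candidates
            if c.is_minor or c.user_id in viewer.connections]

child = User("c1", is_minor=True, connections={"a2"})
adults = [User("a1", is_minor=False), User("a2", is_minor=False)]
# Only the adult the child already knows ("a2") survives the filter.
print([u.user_id for u in filter_suggestions(child, adults)])  # ['a2']
```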
To address privacy concerns, the government emphasised that age verification must comply with UK data protection law, including minimising data collection and avoiding long-term storage of personal information.
Many age estimation tools in use today can confirm a user's age without identifying the user or retaining personal data, according to the Information Commissioner's Office (ICO) and industry groups such as the Age Verification Providers Association (AVPA). The AVPA reported an additional 5 million age checks per day since the new provisions took effect.
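The underlying principle is that the platform learns a single yes/no attribute, not the user's identity. Here is a minimal sketch of that data-minimisation pattern, with a hypothetical check_over_18 helper; the date of birth is used transiently and never stored:

```python
# Data-minimisation sketch: identity evidence is inspected once, in local
# scope, and the caller receives only a boolean. check_over_18 is a
# hypothetical helper, not a real provider API.
from datetime import date
from typing import Optional

def check_over_18(dob: date, today: Optional[date] = None) -> bool:
    """Return whether the user is 18+, discarding the date of birth."""
    today = today or date.today()
    age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
    return age >= 18

print(check_over_18(date(2010, 6, 1), today=date(2025, 7, 25)))  # False
print(check_over_18(date(2000, 6, 1), today=date(2025, 7, 25)))  # True
```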
The government acknowledged that while Virtual Private Networks (VPNs) remain legal in the UK, platforms must not facilitate their use to evade age checks. This includes taking down or blocking content that promotes VPNs to children. Sites that do not act to prevent circumvention may face regulatory consequences.
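What "acting to prevent circumvention" looks like in practice is left to platforms; one crude illustration is a moderation pass that withholds posts coaching under-18s to evade checks. The keyword heuristic below is purely a sketch, far simpler than the classifiers real systems would use:

```python
# Toy moderation pass consistent with the circumvention rule: content
# that promotes VPN use as a way around age checks is withheld from
# child-facing surfaces. The phrase list is purely illustrative.
CIRCUMVENTION_PHRASES = ("use a vpn", "bypass age verification",
                         "get around age checks")

def visible_to_children(post_text: str) -> bool:
    """Hide posts coaching children to evade age assurance."""
    lowered = post_text.lower()
    return not any(phrase in lowered for phrase in CIRCUMVENTION_PHRASES)

print(visible_to_children("Use a VPN to bypass age verification"))  # False
print(visible_to_children("VPNs can protect you on public Wi-Fi"))  # True
```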
The Act does not ban legal adult content, including pornography. Rather, it mandates age-gating and risk assessments to prevent minors from accessing such material. Content that poses no high risk to children, including political discourse or discussions about sexuality, is not subject to automatic removal or restriction.
Platforms are required to take a “risk-based and proportionate” approach to child protection, while also upholding obligations to protect freedom of expression, the government said.
Statements of support from key child protection and online safety organisations accompanied the government's announcement:
Chris Sherwood, Chief Executive of the NSPCC, said the Act “can be a vehicle for significant and lasting change,” helping to curb the spread of harmful content and improve online safety algorithms.
Lynne Perry, CEO of Barnardo’s, called for “robust enforcement” to ensure platforms meet their obligations.
Internet Matters, an advocacy group, pointed to new research indicating that three in four children aged 9–17 have encountered harm online, reinforcing the urgency of the new law.
Ofcom will take the lead on enforcing the Act, including the new age assurance duties and child protection measures. The regulator is also empowered to investigate violations and impose penalties where appropriate.
Additional information and full guidance on the changes are available at gov.uk.