Ireland’s Online Safety Code officially took effect this week, ushering in a new era of digital regulation aimed at protecting children from harmful online content, including pornography and violence. Under the new rules, platforms headquartered in Ireland must implement robust age assurance measures beyond simple self-declarations to restrict access to adult material.
Finalized in October 2024 by Coimisiún na Meán, Ireland’s media regulator, the code now applies to designated video-sharing platforms (VSPs) with EU headquarters in Ireland. It mandates compliance with a broad set of safety obligations, particularly the section titled “Age Assurance and Adult-Only Video Content,” which requires platforms to ensure that adult content “cannot normally be seen by children.”
Notably, age verification methods based solely on user-declared ages are deemed insufficient. Instead, platforms are encouraged to adopt techniques such as facial age estimation, ID verification, or cognitive age tests, provided they are privacy-respecting and do not retain sensitive data longer than necessary.
Failure to comply may result in fines of up to €20 million or 10% of global annual turnover, whichever is greater.
“This marks a significant shift from a self-regulatory model, which in many respects hasn’t worked,” said John Evans, Commissioner of Digital Services. “Holding regulated entities to account is at the core of our mandate.”
The code applies to platforms designated by Coimisiún na Meán, including Facebook, Instagram, YouTube, TikTok, LinkedIn, X, Pinterest, Tumblr, and Udemy. Reddit was initially designated but was de-designated in May 2025 after relocating its EU operations to the Netherlands, placing it beyond the code's reach. Platforms such as Snapchat, which do not have their EU headquarters in Ireland, are not directly covered; cross-border enforcement will instead be coordinated with other EU regulators under the Digital Services Act (DSA).
The Online Safety Code is part of Ireland’s broader digital governance strategy. It will run in parallel with the EU’s Digital Services Act (DSA), which also includes age assurance provisions and has already led to separate enforcement proceedings against several major adult sites for alleged non-compliance.
While the code addresses a broad range of online harms, including cyberbullying, the promotion of self-harm, and dangerous challenges, it does not currently regulate recommender algorithms, which determine the content users see. Coimisiún na Meán has indicated that it will address these concerns through DSA enforcement mechanisms.
The online safety charity CyberSafeKids welcomed the new rules but flagged key gaps, particularly for gaming platforms like Roblox, which fall outside the code’s scope despite their popularity among young users.
“Parts of the code remain too vague and lack specific timeframes for handling harmful content,” a spokesperson said. “There’s an urgent need for broader coverage to ensure safety across all digital environments.”