New regulations adopted this week by Australia’s eSafety Commissioner will soon require major search engines to implement mandatory age verification measures, making the country one of the first to introduce such broad requirements for general web searches.
The regulations, set to take effect at the end of 2025, stem from three industry self-regulatory codes that have now been formally registered by the Office of the eSafety Commissioner. Grouped under a “Schedule 3” code of conduct, the rules will apply primarily to major platforms like Google and Microsoft’s Bing, which dominate search engine use in Australia.
Under the new rules, search engines that provide user accounts must “implement appropriate age assurance measures” and enable safety features such as “Safe Search” by default for users identified as children. Those seeking to turn off Safe Search or access certain materials will need to confirm their age through additional verification steps.
The goal is to limit access to material deemed “harmful to minors,” such as pornography and high-impact violence. If a user is not logged into an account or their age cannot be verified, search engines will be required to blur sensitive images by default.
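The obligations described above amount to a simple decision flow: verified adults can relax filtering, accounts identified as children keep Safe Search switched on, and logged-out or unverified sessions fall back to blurred imagery. The sketch below is purely illustrative; the names used (SearchSession, age_assured, resolve_policy and so on) are hypothetical, and the codes do not prescribe any particular implementation.

```python
from dataclasses import dataclass
from enum import Enum, auto


class ContentPolicy(Enum):
    """Treatment applied to a search session under the described defaults."""
    UNRESTRICTED = auto()    # age-assured adult who has opted out of Safe Search
    SAFE_SEARCH_ON = auto()  # account holder identified as a child
    BLUR_SENSITIVE = auto()  # logged out, or age could not be established


@dataclass
class SearchSession:
    logged_in: bool
    age_assured: bool           # passed an age assurance check
    is_child: bool              # identified as a child by that check
    safe_search_opt_out: bool   # user has asked to turn Safe Search off


def resolve_policy(session: SearchSession) -> ContentPolicy:
    """Map a session to the default treatment the codes describe (illustrative only)."""
    if not session.logged_in or not session.age_assured:
        # No account, or age cannot be verified: blur sensitive images by default.
        return ContentPolicy.BLUR_SENSITIVE
    if session.is_child:
        # Accounts identified as children keep Safe Search enabled.
        return ContentPolicy.SAFE_SEARCH_ON
    if session.safe_search_opt_out:
        # Verified adults may switch Safe Search off after additional verification.
        return ContentPolicy.UNRESTRICTED
    # Assumption: Safe Search stays on unless a verified adult opts out.
    return ContentPolicy.SAFE_SEARCH_ON
```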
Julie Inman Grant, Australia’s eSafety Commissioner, emphasized that these measures are part of a “layered safety approach” aimed at protecting minors at multiple digital access points. “It’s critical to ensure the layered safety approach, which also places responsibility and accountability at critical chokepoints in the tech stack,” she said, citing app stores and device-level controls as key enforcement points.
The rules coincide with broader child safety efforts, including the under-16 social media ban passed in November 2024 and likewise set to take effect in late 2025. Together, the measures mark a comprehensive shift in digital policy aimed at reducing young people’s exposure to harmful online content.
While some parents and child safety advocates have welcomed the regulation, critics warn that such measures could have unintended consequences. Privacy experts argue that age assurance methods risk compromising personal data, and civil liberties organizations have voiced concerns over digital surveillance and censorship.
Critics also question the technical feasibility of enforcing the rules. VPNs, anonymous browsing, and international hosting of explicit content make it challenging to prevent access solely through age verification systems. Some sites not based in Australia may choose not to comply with the new standards altogether.
Still, regulators maintain that the move is a necessary step. “We need industry to be building in guardrails that prevent their chatbots from engaging in this type of behavior with children,” Inman Grant added, pointing to broader concerns about AI chatbots and other unmoderated content online.
As the countdown to implementation begins, search engine companies are now expected to develop and test systems capable of complying with the age assurance mandate before the end-of-2025 deadline.