Tech Companies Propose ‘Safety Codes’ for Online Child Protection in Australia

A coalition of Australian tech industry groups has unveiled a draft set of “safety codes” aimed at protecting children from exposure to online pornography and other potentially harmful content. The proposal, released on Monday, could become enforceable under Australia’s Online Safety Act if approved by eSafety Commissioner Julie Inman Grant.

The draft codes were developed by the Digital Industry Group Inc. (DIGI), the Australian Mobile Telecommunications Alliance, the Consumer Electronics Suppliers Association, and the Interactive Games and Entertainment Association. DIGI, a not-for-profit industry association advocating for digital safety, privacy, and cybersecurity, is spearheading the initiative.

DIGI represents major global tech companies, including Apple, Google, Meta, Microsoft, TikTok, X, and others, all of which have significant investments in Australia’s digital economy. The proposed safety codes are part of a broader effort to increase online safety, particularly for minors, in response to directives under the Online Safety Act passed in 2021.

The draft safety codes set out measures to block or restrict access to material deemed harmful to minors. If adopted, they would apply to content categorized as “Class 1C and Class 2,” which includes pornography, simulated gambling in video games, and other “high-impact material,” such as content depicting nudity, violence, drug use, crime, or racism. These types of content typically fall under R18+, X18+, or RC (Refused Classification) ratings in Australia.

The codes outline requirements for digital platforms to:

  • Implement age assurance measures, ensuring that children are not exposed to age-inappropriate content.
  • Develop tools allowing users to block explicit images from being sent to them.
  • Provide clear mechanisms for reporting breaches, including instances of sexual extortion, grooming, or non-consensual sharing of intimate images.
  • Establish terms and conditions to prohibit harmful behavior.

DIGI’s Director of Regulatory Affairs and Research Policy, Jennifer Duxbury, described the codes as a “critical step forward” in enhancing online safety for children.

“These draft safety codes represent a joint effort between companies and the government to strengthen safeguards for minors,” she stated. “The key principles of these codes are to protect and support children, give parents more control over what their children access, and ensure privacy and online safety.”

The Online Safety Act of 2021 granted the eSafety Commissioner significant authority as Australia’s main regulator of online content. Julie Inman Grant, who currently holds the position, has actively pushed for stronger measures to protect minors online. The proposed codes reflect Inman Grant’s directive to digital platforms to prevent children’s access to harmful content and to implement tools that block explicit images.

The codes also align with efforts to combat broader online safety issues, such as sexual extortion, grooming, and the sharing of non-consensual images. As part of her regulatory mandate, Inman Grant has engaged with global tech companies, civil society groups, and even religious organizations like the National Center on Sexual Exploitation (NCOSE), formerly known as Morality in Media, to enhance child safety on digital platforms.

Public consultation on the draft safety codes is open until November 22, 2024, with feedback being collected through OnlineSafety.org.au. DIGI is actively encouraging input from stakeholders, including consumer organizations, civil society, academics, parents, and community members.

“We welcome feedback from all sectors to ensure that these codes are effective and reflective of community expectations,” Duxbury said. “Once finalized, the safety codes will make an invaluable contribution to protecting children from online pornography and other harmful content.”

If approved, the safety codes could have significant implications for how digital platforms operate in Australia. The codes would impose stricter obligations on tech companies to protect minors, aligning with ongoing international efforts to regulate online content and enhance child safety.

The proposal follows a global trend toward increasing accountability for tech companies in regulating harmful content. While supporters see the draft codes as essential for child protection, critics may raise concerns about privacy, potential overreach, and enforcement challenges. As the November 22 consultation deadline approaches, a diverse range of voices is expected to shape the final version of the codes.