New social media minimum age laws: What enforcement will really look like

The Comtrac Team

Dec 9, 2025

Australia’s Online Safety Amendment (Social Media Minimum Age) Bill 2024 comes into force on 10 December, introducing a mandatory minimum age of 16 for certain social media platforms. While much of the public conversation has centred on age-verification technologies and impacts on families, the real test of this reform will be how breaches are investigated and enforced.

This legislation places the entire compliance burden on major platforms—not on children or parents. The government has made clear: there will be no penalties for young people or families who access restricted platforms. Instead, the focus is on ensuring that platforms take reasonable, proactive, and continuously improving measures to keep under-16s out. 

So how will authorities enforce these rules? And what will investigations look like in practice? 

A new compliance landscape for social media platforms 

From 10 December 2025, designated platforms must: 

  • Take “reasonable steps” to verify user ages - The rules don’t actually require platforms to prove someone’s age; they just need to take “reasonable steps” to make a well-informed assessment. Age verification is the most accurate tool available, and in many cases it involves providing government-issued ID such as a driver’s licence. But the government has been very clear: ID checks can’t be the only option. The law specifically prevents platforms from relying solely on government ID as the mandatory method for confirming a user’s age.

  • Deactivate existing underage accounts - Platforms are required to identify and deactivate any accounts belonging to users under 16. This isn’t limited to new sign-ups; it also applies to accounts that were created before the law came into effect.

  • Demonstrate ongoing compliance through documentation, reporting, and cooperation with the Office of the eSafety Commissioner. 

Failure to meet these obligations can attract fines of up to $49.5 million.

While full enforcement approaches are not publicly detailed, our understanding — based on how similar “reasonable steps” obligations are investigated in areas like privacy and data protection — is that simply having the technology will not be enough. 

We expect that investigators will focus not just on whether platforms possess age-verification tools, but on how those tools are actually implemented, monitored, and evidenced. In other regulatory contexts, authorities routinely look for audit trails, internal decision-making records, system performance logs, and proof of ongoing governance. It’s reasonable to assume a similar approach here. 

In other words, it’s not just about having the capability. It’s about demonstrating that the capability was used in a way that meets the “reasonable steps” standard. This is likely to be central to both investigations and any eventual enforcement action.

No penalties for kids or families, but that doesn’t remove risk for platforms

Importantly, the law avoids criminalising children or forcing parents to police access. The government’s position is clear: the obligation lies with the platform, not the family. The law only requires that platforms take “reasonable steps” to prevent under-16s from holding an account. If individuals bypass the restriction, that is not a finable offence.

But the absence of individual penalties makes platform compliance even more critical. If children can freely bypass weak verification systems, that weakness becomes a compliance failure—and a potential multi-million-dollar liability. 

Enforcement will focus on systemic failures, not isolated incidents

The upcoming minimum-age rules represent one of the most significant shifts in Australia’s online safety landscape. And while public debate continues around privacy, practicality, and the role of parents, regulators are preparing for a very different challenge:

How to ensure that online services can actually prevent under-16s from gaining access.

Success will hinge on rigorous investigations, transparent processes, and the capacity to demonstrate, beyond simple policy statements, that platforms took reasonable, measurable, and auditable steps to protect under-16s.


Discover how Comtrac can streamline your regulatory investigations and enhance enforcement outcomes. Book a demo today to see how Comtrac can help regulatory agencies protect the community and ensure compliance with critical standards.