Australia Investigates Major Tech Firms for Compliance with Under-16 Social Media Ban
Australia’s online safety regulator has opened a formal inquiry into Meta, Snap, TikTok, and Google-owned YouTube. The investigation stems from concerns that these companies may not be complying with a new law barring children under 16 from holding social media accounts.
The eSafety Commissioner’s office released its first compliance report, citing “serious concerns” about enforcement of the groundbreaking legislation, which took effect on December 10, 2025. The report highlighted notable deficiencies in how the major platforms prevent underage users from signing up, pointing out that existing mechanisms allow children to circumvent age restrictions simply by attempting verification multiple times.
This assessment marks the first detailed evaluation of how the tech giants are responding to Australia’s ambitious measures to shield young people from online harms. As the legislation draws international attention, other jurisdictions may consider similar protections for minors.
The report indicated that some platforms allow children to retry age verification even after they have previously self-identified as minors, giving them repeated chances to pass. This gap has enabled a considerable number of children to keep using social media despite the ban. While the eSafety Commissioner noted a decline in accounts held by users under 16 over the past four months, she emphasized that many children still manage to access these sites.
eSafety Commissioner Julie Inman Grant remarked, “These platforms still have the ability to be compliant today, and we ensure that businesses operating in Australia follow safety laws. They can choose to comply or face severe consequences, including significant reputational damage globally.”
In response to the inquiry, Meta, which owns Facebook and Instagram, reiterated its commitment to the ban. A spokesperson stated, “We are dedicated to following Australia’s social media regulations and working constructively with eSafety and the government. Identifying age online poses challenges across the industry, especially at the crucial 16-year-old mark, where there’s an inherent ‘natural margin of error.’ We believe effective age verification and parental approval via app stores is essential to protect our youth, not just on the main platforms, but across the multitude of apps available, many of which may lack robust safeguards.”
Google has yet to comment on the matter.
The investigation arrives at a delicate moment for social media companies. A recent U.S. lawsuit ended with a jury finding Meta and Google liable for damages related to a 20-year-old woman’s mental health problems, which she attributed to social media addiction. The ruling has intensified debate over whether these tech firms are approaching a “Big Tobacco” moment, in which they could lose long-standing legal protections covering the content and user experiences they offer.
The eSafety investigation covers Facebook, Instagram, Snapchat, TikTok, and YouTube. For the regulatory effort to succeed, officials must demonstrate that these platforms failed to take adequate measures to prevent under-16s from maintaining accounts. The outcome could shape future enforcement of Australia’s social media ban and set compliance standards for global tech companies operating in the country.
