Australia’s social media age restrictions are already working — and they haven’t even started yet

ABC Opinion

Timothy Koskie

Posted 7 Nov 2025, updated 11 Nov 2025

We can’t be sure what will happen when Australia’s social media age restrictions finally come into effect, but they have already begun to change the online media landscape.

When we consider Australia’s predilection for self-regulation — from home building to advertising standards — it is perhaps understandable that some would baulk at attempts by the federal government to impose guardrails on social media companies. But submissions to the Joint Select Committee on Social Media and Australian Society indicated an appetite for action commensurate with the extent to which social media platforms saturate our lives.

Critics of the proposed age restrictions rightly point to the risk areas that this policy fails to cover — such as at-risk youth who use social media to escape domestic violence, or the accessibility of misinformation and other problematic content even when users are not signed in — to say nothing of the numerous ways young users can bypass the age restrictions altogether.

The government’s social media ban is also inconsistent in what it chooses to exclude — such as Roblox, a platform that is popular with young people and poses a demonstrable risk. Furthermore, the American Psychological Association has pointed out that social media use can offer certain benefits to young users, from forms of social connection to opportunities for self-expression and education.

Those in favour of the restrictions counter that risks arise not from the intentions of young users but from the misinformation, cyberbullying and other malicious content that can be directed at them through social media platforms — which is precisely the kind of content the eSafety Commissioner is seeking to mitigate.

But whatever headaches and hopes we associate with the policy itself, it is worth noting that the prospect of the ban has already had an effect on what these social media platforms are offering well before its implementation next month, and not just here in Australia. I believe this demonstrates the way that even imperfect policies can result in potentially effective changes far beyond their scope.

The link between legislation and self-regulation

Self-regulation has often been a tempting option for governments that are both politically inclined to adopt industry-based solutions and struggling to regulate companies whose international reach makes their legal liability harder to pin down. Social media platforms and civil liberties advocates alike have opposed regulation, pointing to the tendency of authoritarian states to limit citizens’ access to the internet.

And yet there is a great deal of evidence to suggest that self-regulation on its own can result in little more than what could be called “symbolic policy”. This is when a great deal of visible activity and public discussion goes into policies that don’t ultimately arrive, don’t do what they are supposed to do or are easily violated without consequence.

A prime example of this is when the Digital Industry Group Inc (DIGI), which represents social media platforms and other digital firms, presented its voluntary misinformation code to restrict and respond to growing threats of misinformation online — only to have X, one of the chief culprits of online misinformation, publicise its decision not to adhere to it.

Ultimately, what appears to drive self-regulation is the credibility of the threat of government action. For instance, after the passage of the Cyber Security Act 2024, the Australian Institute of Company Directors updated its “governance principles” to reflect the risk of further regulation. By contrast, an ongoing discussion over regulation appears to have done little to compel Tabcorp to adhere to its own policies on gambling ads.

The platforms’ rapid response

While the submissions made by the tech companies themselves to the Joint Select Committee on Social Media and Australian Society pointed to unforeseen risks and likely consequences in an effort to forestall regulations such as age restrictions, citizens were considerably less reticent. They expressed it in different ways, but a unifying theme of their submissions was a clear call to action. To quote one citizen’s submission: “Let’s start moving towards healing our families and protecting the generations ahead, any action is better than nothing.”

The federal government’s response has been both muscular and maximalist in its determination to include a wide range of platforms under the age restrictions. And this approach has already achieved results. One of the first companies to respond was Meta, which announced “teen accounts” for Instagram in September 2024 before the restrictions were passed in Parliament.

TikTok and Snapchat have likewise expanded their age-related account controls, with TikTok using a “teen safety center” to highlight and consolidate its initiatives. While YouTube already had a child-focused version of the app, it further restricted access to streaming for teens in July, months before the age restrictions were set to begin.

Roblox, meanwhile, has so far managed to avoid being included in these age restrictions by adding extensive restrictions of its own. Underage users are limited by default to private accounts with no social features (such as local and private chat), and Roblox has expanded its moderation efforts to isolate and restrict problematic content.

Policy or precedent?

What we sometimes fail to notice when scrutinising new policies is that the guardrails and penalties they establish are only a part of their overall effect. They also influence the kinds of behaviours that get incentivised. They can highlight or reflect a broader culture change that is underway — such as growing social unease with the ways social media platforms are intervening in our lives.

In this way, the Australian government’s age restrictions appear to be of a piece with more general global sentiment. France implemented its own age verification and parental consent requirements in 2023, and in the same year the UK passed its Online Safety Act — though this is largely focused on online content itself rather than the safety of users.

While the Trump administration may threaten allies that attempt to rein in platforms like X, individual US states are implementing age restrictions that are similar to our own.

Australia’s age restrictions may not prove a perfect fit for our media and social environment, but they have already made their mark on platforms and the public — both here and around the world.

Timothy Koskie is a Post-Doctoral Research Associate with the Centre for AI Trust and Governance at the University of Sydney.
