Australia's Social Media Restrictions: Is More Needed for Gaming?
- Ben Akroyd
- Dec 7
Australia’s Online Safety Amendment (Social Media Minimum Age) Act 2024 is a landmark piece of legislation. From its commencement on December 10, 2025, it imposes a mandatory minimum age of 16 for account holders on major social media platforms. With non-compliant platforms facing civil penalties of up to A$49.5 million, the Act sends a clear message to Big Tech about accountability for youth mental health and online safety.
The government’s rationale is centered on mitigating the harms of predatory algorithms, cyberbullying, and inappropriate content on large social networking services. Crucially, the legislation defines an "age-restricted social media platform" by function: a service whose significant purpose is to enable online social interaction and that allows users to post material. This functional test is why services like Twitch and Kick, which are intrinsically linked to the gaming and livestreaming ecosystem, have been included in the ban.
The Critical Disconnect: Gaming’s Exclusion
Despite the inclusion of streaming services, a significant and concerning exemption exists in the subsidiary legislative rules: online games are explicitly excluded from the definition of an age-restricted social media platform.
This decision effectively immunizes platforms like Roblox, Fortnite, Minecraft, and other highly social digital worlds from the immediate and stringent age-assurance and compliance burden now placed on Meta, TikTok, and YouTube.
For the esports and gaming industry, this creates a profound regulatory double standard. While the Australian government correctly identifies the functional similarities between social media and modern streaming platforms (Twitch, Kick), it fails to apply this same logic to social gaming environments, which possess:
Pervasive Social Interaction: Games are now critical social hubs, featuring direct messaging, voice chat, and large-scale, persistent public lobbies. For many minors, these platforms represent their primary digital social sphere.
Unstructured User-Generated Content (UGC): Platforms like Roblox thrive on UGC, which, while highly creative, presents significant moderation challenges and exposes young users to unvetted content and community-driven harm—risks identical to those cited for traditional social media.

An Inadequate Regulatory Framework for Gaming
The decision to exempt online games leaves the sector primarily governed by the outdated National Classification Scheme. This scheme, designed for physical media sales, is structurally incapable of regulating the dynamic, live, and socially rich nature of modern online gaming.
The key points of failure are:
Classification vs. Conduct: The classification system addresses content ratings (M, MA 15+), not real-time user-to-user interaction and conduct, which is the core concern of the social media ban.
Age-Assurance Gap: While social media platforms must now invest heavily in complex, privacy-preserving age-verification technology, gaming platforms are under no equivalent mandate, leading to a massive disparity in platform governance and youth protection standards across the digital ecosystem.
Financial Risk: The classification system does not adequately address the risks associated with simulated gambling mechanics, loot boxes, and aggressive microtransaction marketing—all areas where children are financially and psychologically vulnerable.
The Looming Compliance Challenge
The government has flagged a review of the legislation within two years, with a mandate to consider extending the ban to online games.
For major gaming companies and ecosystem stakeholders, this provides a short window of regulatory reprieve, but it is accompanied by a shadow of future compliance risk. The exclusion of online games now looks less like a permanent policy stance and more like a tactical omission to manage the initial legislative complexity.
Our takeaway for the industry is clear:
Proactive Risk Mitigation: Gaming companies must not wait for the two-year review. They should be proactively investing in robust, best-practice age-assurance technologies and enhancing in-game moderation tools.
Unified Safety Standards: The legal inclusion of Twitch and Kick sets a precedent. The entire gaming ecosystem should align its platform safety standards with the emerging expectations of the eSafety Commissioner, preparing for a future in which a unified minimum age for all interactive digital social services is mandated.
Failure to address the regulatory fault line between social media and social gaming leaves children exposed and puts the industry at risk of a far more disruptive and rushed regulatory overhaul in the near future.
This current exemption is not a 'free pass'; it’s a temporary stay before the inevitable expansion of Australia’s pioneering online safety regime.