Social media platform Snapchat announced on Monday that it has disabled 415,000 accounts in Australia believed to belong to users under the age of 16, in order to comply with the country's stringent age-restriction law for social media. The move comes as regulators worldwide increasingly scrutinize how platforms protect minors online, with countries such as France reportedly considering similar legislation.
The new Australian legislation, which took effect on December 10, requires platforms such as Snapchat, Meta, and TikTok to take reasonable steps to prevent underage users from holding accounts, with fines of up to Aus$49.5 million for non-compliance. Snapchat said that while it is blocking accounts daily, its current method relies on imperfect age-estimation technology.
Snapchat said in an online release that its age-estimation technology is accurate only to within two to three years. That limitation means some users under 16 may retain access, while others just over the threshold could be wrongly barred from the service. The eSafety regulator previously reported that tech firms had already removed 4.7 million accounts by the end of January.
Joining Meta, Snapchat urged Australian authorities to mandate age verification at the app store level rather than relying solely on checks within individual platforms. The company argued that a centralized verification system would provide more consistent safeguards and make circumvention significantly harder.
However, Snapchat expressed reservations about the overall legislative approach, arguing that blocking access for younger users who rely on the platform to connect with close friends and family does not inherently enhance their online safety. The company maintains that cutting off communication channels may not align with objectives for user well-being.
Australia's world-first regulation is attracting attention from other jurisdictions weighing tougher digital safety measures for minors. France's National Assembly recently advanced legislation that would ban children under 15 from using social media platforms, reflecting a growing international consensus on addressing online risks.
For Snapchat, the immediate challenge remains balancing regulatory demands with the functional reality of its messaging-focused user base. The coming months will reveal the effectiveness of these self-imposed and mandated technological barriers against determined attempts to bypass age gates.