SYDNEY: Meta, ByteDance and Snap told Australian lawmakers on Tuesday they will comply with a new law that bans users under the age of 16 from holding social-media accounts, but warned the measure will be difficult to police and may have unintended consequences.
The legislation, due to take effect on 10 December, requires platforms to take “reasonable steps” to block or deactivate accounts for people aged under 16 or face fines of up to A$49.5 million.
Tech firms at a parliamentary hearing urged caution over enforcement and the potential for the law to push young people to less regulated corners of the internet.
Firms to deactivate accounts, contact users
Mia Garlick, Meta’s policy director for Australia and New Zealand, told the hearing the company would begin contacting users it had confirmed were under 16 — about 450,000 across Facebook and Instagram — to give them options for deleting content or having it stored until they turn 16.
“On paper, the goal from our perspective… would be to remove those under 16,” Garlick said, while acknowledging the company was still solving “numerous challenges” and faced “significant new engineering and age assurance challenges”.
Ella Woods-Joyce, TikTok’s public policy lead for Australia, said TikTok would “comply with the law and meet our legislative obligations” and was on track to do so.
TikTok told the hearing it had about 200,000 under-16 accounts in Australia and would offer affected users a choice to delete their photos and data or have their accounts deactivated and restored when they turn 16.
Snap’s Jennifer Stout, senior vice-president of global policy and platform operations, said the company did not agree with the policy but would abide by it. “We don’t agree, but we accept and we will abide by the law,” she said.
Snap estimates it has about 440,000 under-16 accounts in Australia and will also contact those account holders to prepare them for the change.
Companies indicated they would use automated behaviour-tracking tools — often described as “age assurance” mechanisms — to flag accounts registered to users who claim to be 16 or older but whose behaviour suggests they are younger.
Enforcement and unintended harms
While the platforms say they will comply, several warned the ban is hard to enforce and could be counterproductive.
“Experts believe a ban will push younger people into darker corners of the internet where protections don’t exist,” TikTok’s Woods-Joyce told senators, calling the measure “blunt”.
Meta and TikTok both said identifying and removing under-16 accounts presented difficult technical and privacy trade-offs.
The companies stressed the law requires “reasonable steps” rather than blanket age verification of all users, a point government officials have previously emphasised.
YouTube, which also falls under the new rules, cautioned that Australia’s approach was “well intentioned but poorly thought through”.
Rachel Lord, YouTube’s local spokeswoman, said the legislation would be extremely difficult to enforce and “does not fulfil its promise of making kids safer online”.
Industry witnesses and some experts at the hearing described the law as “vague”, “rushed” and likely to pose operational problems for international platforms.
The Online Safety Act’s designers and Australia’s e-safety regulator have said platforms will not be required to verify the age of every user, but must take reasonable steps to detect and deactivate underage accounts. Companies that fail to comply face the A$49.5 million penalty.
Australia’s online watchdog has suggested that a broader set of services, including messaging app WhatsApp, streaming platform Twitch and gaming site Roblox, could also be subject to the ban, raising questions about where lines will be drawn between social networking, messaging and gaming.
There is international interest in the Australian move, which officials describe as among the strictest age restrictions worldwide.