Surprise! Big Tech has been a bit rubbish at enforcing Australia’s kids’ social media ban

Regulator ‘moving into an enforcement stance’ and investigating Meta, YouTube, TikTok and Snapchat as millions continue to doomscroll

The Register

Australia’s eSafety Commissioner is “moving into an enforcement stance” after finding that Meta, YouTube, TikTok and Snapchat haven’t done enough to comply with the nation’s social media minimum age (SMMA) obligation, which bans social media outfits from providing their services to children under 16 years of age.

eSafety today delivered its first report on compliance with the ban, which finds that regulated social media operators “have taken some steps to comply with the SMMA obligation” and have blocked around five million accounts.

However, eSafety also surveyed 898 Australian parents and found "around 7 in 10" reported that their child still had an account. The regulator also “observed poor practices by some platforms” and listed the following four examples of their bad behaviour:

  • Messaging to children aged under 16 on some platforms has encouraged them to attempt age assurance even where their declared age prior to 10 December 2025 was under 16.
  • In some cases, platforms have enabled children aged under 16 to repeatedly attempt the same age assurance method to ultimately obtain a 16+ outcome.
  • Pathways for reporting age-restricted accounts have generally not been accessible and effective, particularly for parents.
  • Some platforms appear not to have done enough to prevent children aged under 16 from having accounts. However, eSafety is continuing its investigations to enable it to form a concluded view as to whether any platform has not taken reasonable steps to comply with the SMMA obligation.

The report offers examples of how social media platforms have failed to meet their obligations.

The one that caught The Register’s eye describes a child who signed up for a social media account two years ago, when they were 12, falsely claiming to be 14. Two years on, the platform therefore believes the user is 16, when they are really only 14.

In discussions with a parent, the child said the platform had not attempted to verify their age. The parent therefore requested the platform close their child’s account. The platform responded by asking for a legal letter to prove the parent’s status, a costly exercise the parent chose not to pursue.

The 14-year-old remains able to use their ill-gotten social media account.

That sort of scenario has led eSafety to investigate what it describes as “potential non-compliance by five platforms – Snap, TikTok, Facebook, Instagram and YouTube.” The regulator wants to wrap up those probes and make a decision about any enforcement action “by the middle of 2026.”

“These investigations will require giving further legally enforceable information-gathering notices to assess whether the steps taken by platforms are reasonable, identifying gaps, and assessing the totality of all steps taken by a platform,” the report states. “eSafety will not hesitate to take enforcement action where it has sufficient evidence of non-compliance. This will include assessing whether the evidence provides reasonable grounds for commencing civil penalty proceedings.”

Australia’s government made the social media ban a key plank of its policy platform, and other nations have treated it as a model for their own policies. Indonesia, the world’s fourth-most-populous country, this week enacted a similar ban, and several more nations are drafting comparable regulations.

eSafety’s report will therefore likely be widely read, as it shows how social media operators dodge bans. One passage may interest regulators around the world, in which eSafety notes it observed expected “short-term increases in downloads of some emerging apps, but we have not seen any significant migration to non-compliant platforms or other online services that are not required to comply with the SMMA obligation.”

That’s noteworthy because opponents of social media bans have sometimes argued such regulations could push kids toward social services run from jurisdictions beyond regulatory reach. eSafety’s evidence suggests that’s not happening, because “the profusion of online services young people may be migrating to do not have a critical mass of their peers established on these smaller, less entrenched services.” ®