The world's biggest social media companies are not doing enough to keep children in Australia off their platforms, according to the country's internet regulator. The legislation, which bars anyone under the age of 16 from using ten popular platforms, has come under scrutiny from the eSafety Commission, which has raised significant compliance concerns about Facebook, Instagram, Snapchat, TikTok, and YouTube.
Advocates have defended the ban as a necessary measure to protect minors from harmful content and addictive algorithms, and it has prompted other countries, including the UK, to consider similar regulations. In its first report since the law came into force late last year, the eSafety Commission flagged multiple poor practices across the five key platforms, such as allowing minors repeated attempts at age verification and failing to take sufficient steps to stop under-16s from creating new accounts.
Since the law took effect on December 10, the regulator noted, 4.7 million accounts have been restricted or removed, yet concerns persist that many children still access these platforms despite the ban. Many students at a Sydney school said they had bypassed age verification checks, suggesting gaps in enforcement and in compliance among the major social media platforms.
While parents largely support the ban, critics argue that the focus should shift towards educating children about digital risks rather than outright prohibitions. Moreover, the ban potentially marginalizes minority groups who could benefit from community-building online. The eSafety Commissioner indicated that while some steps had been taken, a substantial overhaul of entrenched social media practices will take time and community involvement to achieve lasting change.