A report by Australia’s eSafety regulator reveals that more than 80% of children aged 12 and under have used social media or messaging services intended for users aged 13 and over. The finding raises significant online safety concerns, particularly as the government prepares to introduce a social media ban for those under 16 by the end of the year.
The report highlights the prevalence of platforms such as YouTube, TikTok, and Snapchat among underage users. While these companies set a minimum age of 13 for account creation, the findings point to a troubling gap in effective age verification. The survey of children aged 8 to 12 found that 84% had used at least one social media platform over the past year. More than half of these children accessed the services through a parent or carer’s account, while a third had profiles of their own, often set up with parental assistance.
Despite this apparent breach of age restrictions, only 13% of the children with their own accounts had an account shut down for not meeting the age requirement. eSafety commissioner Julie Inman Grant emphasized that online safety is a shared responsibility involving social media companies, tech manufacturers, and caregivers.
The report criticizes the inconsistency in how social media platforms assess user ages, particularly at sign-up. Without rigorous verification, children can easily bypass restrictions by providing false information, exposing themselves to potential risks online. Some platforms, including Snapchat, TikTok, Twitch, and YouTube, said they use various tools to identify underage users after account creation; this reactive approach, however, means children may remain at risk during their initial use of a service.
As Australia grapples with these pressing concerns, the upcoming legislative changes aim to mitigate the risks associated with underage social media use, potentially reshaping how younger generations engage with digital platforms.