**Growing Concern: Widespread Social Media Use Among Young Australian Children**

A recent study by Australia's eSafety regulator reveals alarming statistics about underage social media use, prompting discussions on impending regulations.
Recent findings from a report by Australia’s eSafety regulator have sparked significant discussion about the social media habits of young children in the country. The study found that more than 80% of Australian children aged 12 and under used social media or messaging services last year, despite those services typically being intended for users aged 13 and older. The most frequented platforms included YouTube, TikTok, and Snapchat. These findings come as Australia prepares to introduce a potential social media ban for those under the age of 16, anticipated by the end of this year.
The report scrutinized several leading platforms, including Discord, Facebook and Instagram (both operated by Meta), Reddit, Snapchat, TikTok, Twitch, and YouTube, though the companies did not immediately respond to inquiries about the findings. These services generally require users to be at least 13 years old to create an account, although exceptions exist. YouTube, for instance, offers a Family Link feature that permits access for younger users under guardian supervision, alongside its child-specific platform, YouTube Kids, which was not covered by the report’s analysis.
eSafety Commissioner Julie Inman Grant stated that the insights from the report would play a crucial role in shaping forthcoming measures. She emphasized that ensuring online safety for children is a collective responsibility involving social media firms, device manufacturers, educators, parents, and legislators alike.
In surveying over 1,500 Australian children aged eight to 12, researchers found that 84% reported having used at least one social media or messaging service. More than half of these children accessed such services through a parent’s or guardian’s account, while a third had accounts of their own, often set up with parental help. Tellingly, only 13% of those with their own accounts reported having them shut down by the platform for being underage, suggesting that enforcement remains rare.
The report highlighted troubling inconsistencies in how the examined services enforce their age requirements. Effective checks were largely absent at the account sign-up stage, allowing underage users to simply misstate their age and create accounts. The platforms’ responses about their age-assurance strategies also revealed varied approaches: while Snapchat, TikTok, Twitch, and YouTube said they use tools to detect underage users, these measures typically rely on monitoring user activity after sign-up, meaning children may be exposed to online risks before they are ever flagged.
In conclusion, the eSafety report underscores the urgent need for stronger safeguards to protect young users and foster safer online experiences across Australia’s evolving social media landscape.