Australia's Government Warns Social Media Firms as Minor Usage Surges

Implementing effective age restrictions on social media platforms has proven challenging in Australia, a nation leading efforts to limit access for minors under 16. Although a law enacted on December 10 requires social media platforms to verify user ages, a recent report from the safety regulator, eSafety, reveals that a “significant percentage” of minors continue to use these services.

In the initial weeks following the law's implementation, platforms reported deleting or blocking 4.7 million accounts belonging to minors. That figure, however, pales against the overall statistics: nearly 70% of minors remain active on sites like Facebook, Instagram, Snapchat, and TikTok, and almost half still maintain accounts on YouTube. While the share of minors on social media dropped from 49.7% to 31.3%, eSafety cautions that many users have yet to undergo proper age verification, raising concerns about the regulation's effectiveness.

Global Efforts to Safeguard Minors

Australia's pioneering approach to social media regulation has inspired other nations to consider similar measures. The driving force behind these actions is alarming data about minors' mental health. A study commissioned by the Australian government found that 96% of children aged 10 to 15 use social networks, with 70% encountering harmful content ranging from misogynistic messages to videos promoting eating disorders and suicidal ideation. This exposure is largely attributed to platform designs that encourage prolonged screen time and steer users toward detrimental content.

Challenges in Implementation

Despite the clear motivation for these regulations, enforcement remains less straightforward. The onus has largely been placed on the social media platforms themselves to ensure that users under 16 do not gain access. Following the law's enactment, Meta, the parent company of Facebook, Instagram, and Threads, began closing the accounts of teenagers. Meta has stated that users removed in error can appeal by providing official identification or a selfie video for age verification.

However, the eSafety report identifies systemic flaws in the current verification processes: users can alter their declared ages, retry verification repeatedly, and register with false information, creating loopholes that minors can exploit to circumvent the rules.

International Trends and Responses

Julie Inman Grant, head of eSafety, has indicated that the organization is compiling evidence for potential legal actions against the platforms. She emphasized the need for these companies to demonstrate they have implemented “reasonable” measures to prevent minors under 16 from registering accounts.

In a parallel effort, Indonesia has recently enacted similar regulations but has accused major technology firms of inadequate cooperation in enforcing the age restrictions. The Indonesian government has called on companies such as Meta Platforms and Google to improve their systems for detecting and disabling accounts belonging to minors under 16. Communication and Digital Affairs Minister Meutya Hafid noted that these platforms have not adequately communicated their compliance strategies, prompting formal notices to TikTok and Roblox as well.

As Australia and Indonesia lead the charge, other countries are also assessing measures to restrict minors' access to social networks. This international trend aims to strengthen age verification processes and mitigate the risks associated with social media use among children and adolescents.