Digital Protection or Digital Exclusion: Balancing Protection, Privacy, and Participation
By Talia Al Hammouri
Topic: Legal Commentary
Disclaimer: Views expressed herein are solely those of the author and do not necessarily reflect the views of other writers or the Law Student Review.

I INTRODUCTION
The rapid growth of social media platforms has significantly transformed the way young individuals interact, engage in public discourse, and participate in contemporary social and political discussions. Nevertheless, rising concerns about the psychological effects of excessive social media use have led governments worldwide to consider implementing stricter regulations on online platforms. In Australia, these concerns resulted in the enactment of the Online Safety Amendment (Social Media Minimum Age) Act 2024 (Cth), which introduces a national minimum age requirement of sixteen for users of certain social media platforms.[1] The legislation amends the Online Safety Act 2021 (Cth)[2] and requires platform providers to take reasonable steps to prevent individuals under the age of sixteen from creating or maintaining social media accounts. While the intent behind the reforms is to diminish potential dangers to younger users, they also raise complex legal and policy challenges regarding privacy, enforceability, and the freedom of communication in a digital context. Although the Act demonstrates a legitimate attempt to safeguard children from online harm, its efficacy and proportionality remain questionable due to the challenges associated with practical enforcement.
II DISPROPORTIONATE RESTRICTION
The Online Safety Amendment (Social Media Minimum Age) Act 2024 (Cth)[3] is framed as a protective measure, yet its primary mechanism is exclusion. By classifying “age-restricted users” as individuals under sixteen and prohibiting their access to certain platforms, the Act does not regulate harmful conduct within digital spaces; rather, it removes a category of users from those spaces altogether. Rather than making social media safer, the legislation sidesteps the issue by restricting participation. Social media platforms serve as essential venues for communication and civic engagement, so restrictions on access may raise issues under the implied freedom of political communication recognised by the High Court.[4]
Further, this approach reflects a broader tendency in digital regulation to prioritise risk elimination over meaningful engagement with harm. Section 63B indicates that the aim of the system is to “reduce the risk of harm” to young users.[5] However, the legislation does not clarify what constitutes “harm”, nor does it differentiate among various levels or forms of risk. As a result, all users under sixteen are treated as equally vulnerable, regardless of their capacity to engage responsibly online. This blanket approach is difficult to justify in a digital environment where young people are not merely passive consumers but active participants in communication and civic engagement. Social media platforms function as contemporary public forums: dynamic environments in which young people actively articulate their identities and perspectives. Research demonstrates that these platforms provide opportunities for adolescents to share ideas, engage in creative expression, and develop a sense of self through the creation of content, including writing, video production, and digital storytelling.[6] Thus, social media empowers young users to “express their identities, share their thoughts, and showcase their creativity” through content created by them, whilst also enabling interaction that contributes to confidence and self-perception.[7]
Beyond individual expression, these platforms play a critical role in enabling social connection and belonging. Young people use social media to engage with peers, participate in communities of shared interest, and access diverse perspectives that may not be available in their offline environments.[8] This is particularly significant for marginalised groups, for whom digital spaces can provide support networks and opportunities for visibility that are otherwise limited.[9] Studies further indicate that social media use is positively associated with creativity and collaboration, reinforcing its role as a space of productive participation rather than mere entertainment.[10]
III ENFORCEABILITY AND THE LIMITS OF “REASONABLE STEPS”
A central issue undermining the effectiveness of the Online Safety Amendment (Social Media Minimum Age) Act is its dependence on the requirement that platform providers take “reasonable steps” to prevent underage access.[11] This obligation is imposed under s 63D, which establishes a civil penalty for non-compliance.[12] The term “reasonable steps”, however, is not defined within the Act, creating significant uncertainty in both interpretation and enforcement.[13] Rather than prescribing specific compliance mechanisms, the Act delegates responsibility to the eSafety Commissioner to formulate guidelines regarding what constitutes reasonable steps.[14] These guidelines, however, are expressly not legislative instruments and therefore lack binding legal force.[15] As a result, platform providers are left to interpret their obligations within a framework that is legally enforceable but substantively ambiguous. This raises concerns regarding regulatory coherence: in the absence of defined statutory benchmarks, compliance becomes dependent on evolving administrative expectations rather than fixed legal obligations.[16] Such uncertainty risks uneven implementation and could encourage minimal compliance tactics designed to evade accountability rather than provide meaningful protection.
The severity of the penalties associated with non-compliance, up to 30,000 penalty units, further complicates this issue.[17] Although these consequences may promote compliance, they do not eliminate the uncertainty surrounding what constitutes adequate conduct. This creates a regulatory paradox: platforms face significant liability exposure without clear guidance on how to meet their obligations.[18] The enforceability problem is further compounded by the global reach of social media platforms, as the practical ability of domestic regulators to ensure compliance is limited by the fact that many providers operate across multiple jurisdictions.[19] Comparative regulatory regimes, such as the data protection law of the European Union, show that effective enforcement in the digital sphere frequently requires institutional capacity and cross-border collaboration.[20]
IV PRIVACY TENSIONS AND REGULATORY CONTRADICTIONS
The Act attempts to reconcile age verification with privacy protections by restricting the collection of certain forms of personal information. Section 63DA prohibits the collection of specified information for the purpose of compliance, while s 63DB further restricts the use of government-issued identification and digital identity services. These provisions reflect a clear legislative intent to minimise privacy intrusions.
However, these protections create tension within the regulatory framework: platforms are required to prevent underage access, yet are simultaneously limited in their ability to verify user age using reliable identification methods. The Act permits the use of “alternative means” of verification, but does not define what these entail or establish minimum standards of effectiveness.[21] Weak age-verification systems are highly vulnerable to circumvention; research has shown that minors often falsify their age in order to gain access to restricted services, especially where verification relies on self-declaration or readily manipulated data.[22] By prohibiting the use of more reliable verification tools, the Act may therefore compromise its own protective goal. Further, privacy obligations arise under s 63F, which requires that personal information collected for age verification be used only for that purpose and subsequently destroyed.[23] While this aligns with data minimisation principles under Australian privacy law,[24] it may also limit the ability of regulators to verify platforms’ compliance after the fact.
V CONCLUSION
The Online Safety Amendment (Social Media Minimum Age) Act 2024 (Cth) reflects a legitimate attempt to address the risks associated with youth engagement in digital environments. However, its reliance on exclusion, combined with uncertainty in enforcement and tensions between privacy and verification, raises substantial concerns regarding its effectiveness and proportionality. A more nuanced regulatory approach, one that addresses harmful conduct directly while preserving access to digital spaces, may better balance the competing interests of protection and participation in the digital age.
VI FOOTNOTES
[1] Online Safety Amendment Act 2024 (Cth) s 5.
[2] Online Safety Act 2021 (Cth).
[3] Online Safety Amendment Act 2024 (Cth).
[4] Lange v Australian Broadcasting Corporation (1997) 189 CLR 520, 560-61.
[5] Online Safety Amendment Act 2024 (Cth) s 63B.
[6] E Zimmermann et al, ‘Using Social Media to Promote Life Skills Among Adolescents’ (2025).
[7] National Academies of Sciences, Engineering, and Medicine, Social Media and Adolescent Health (2023).
[8] Raising Children Network, ‘Social Media: Benefits and Risks for Teenagers’ (2025).
[9] KL McAlister et al, ‘Social Media Use in Adolescents: Bans, Benefits, and Emotion Regulation’ (2024) JMIR Mental Health.
[10] MA Gulzar, ‘Social Media Use and Student Engagement’ (2022) 41 Behaviour & Information Technology 1.
[11] Online Safety Amendment Act 2024 (Cth) s 63D.
[12] Ibid.
[13] Ibid.
[14] Ibid s 27(1)(qa).
[15] Ibid s 27(6).
[16] Julia Black, ‘Decentring Regulation’ (2001) 54 Current Legal Problems 103.
[17] Online Safety Amendment Act 2024 (Cth) s 63D.
[18] Cary Coglianese, ‘The Limits of Performance-Based Regulation’ (2017) 50 University of Michigan Journal of Law Reform 525.
[19] Dan Jerker B Svantesson, Solving the Internet Jurisdiction Puzzle (Oxford University Press, 2017).
[20] Paul M Schwartz, ‘Global Data Privacy: The EU Way’ (2019) 94 NYU Law Review 771.
[21] Online Safety Amendment Act 2024 (Cth) s 63DB.
[22] Sonia Livingstone et al, EU Kids Online (2014).
[23] Online Safety Amendment Act 2024 (Cth) s 63F.
[24] Privacy Act 1988 (Cth) sch 1 cl 6 (‘APP 6’), cl 11.2 (‘APP 11.2’).