On 28 Nov. 2024, the Australian Parliament passed the Online Safety Amendment (Social Media Minimum Age) Bill 2024, amending the Online Safety Act 2021 and giving businesses one year to comply before the law takes effect. The law introduces a legal minimum age of 16 to hold an account with certain social media platforms and requires these platforms to adopt reasonable age-gating practices. It also introduces specific privacy protections limiting the use and retention of personal information collected to comply with the law, backed by robust penalties for noncompliance.
Intended to protect children's safety, health and well-being through corporate responsibility
The new law aims to mitigate the negative impacts and harm children experience through social media by limiting their exposure during critical stages of their development. The explanatory memorandum highlighted that children on social media are vulnerable to harmful products and content; harassment, including by predators; and features that impact sleep, stress levels and attention, such as addictive design elements and persistent notifications and alerts. These harms have negatively affected adolescent health and well-being, including life satisfaction.
The law places responsibility on certain social media platforms to self-regulate and to prevent children from holding accounts or profiles where the government considers these harms likely to occur. The explanatory memorandum notes that major social media platforms already exclude children under the age of 13 in their terms of service and says "setting a minimum age removes ambiguity about when the 'right' time is for their children to engage on social media and establishes a new social norm."
Applies to age-restricted social media platforms
The law applies to "age-restricted social media platforms," which are defined as electronic services for the sole or significant purpose of enabling online social interaction between two or more end users, including linking or posting material for social purposes. Legislative rules can set out further conditions of coverage and specify electronic services that are within scope or exempted. In the explanatory memorandum, the government indicated the law is intended to apply to social media companies like Snapchat and Facebook and not to messaging, online gaming, and services primarily for education and health support, like Google Classroom or YouTube.
Introduction of age-restriction requirements and specific privacy protections
A covered platform must take reasonable steps to prevent age-restricted users, defined as Australian children who have not reached 16 years of age, from having accounts. Because the restriction applies to logged-in account holders, children can continue to access products and content on these platforms in a logged-out state. The law is purposely technology-agnostic to remain relevant as age-verification technology evolves, and it offers flexibility in selecting age-verification and/or age-assurance practices. Government-issued identification material, or an accredited service under the Digital ID Act 2024 such as MyID, may only be used where those means are considered reasonable and alternative means of age assurance are available to users.
Legislative rules can restrict the types of information a covered platform may collect to comply with the law. Existing requirements under the Privacy Act 1988 to collect only personal information reasonably necessary for a business function or activity also apply. Covered platforms are restricted from using personal information for purposes other than compliance with the law unless permitted under the Privacy Act 1988 or the individual's explicit consent is obtained, which must be voluntary, informed, current, specific and unambiguous. The platform must then destroy the information after using or disclosing it for the purposes for which it was collected.
Significant penalties and enforcement
Failure to comply with the age-restriction requirements can result in fines of up to 30,000 penalty units, or AUD9.5 million as of 2024. Noncompliance with the privacy protections will be considered an interference with the privacy of the individual for the purposes of the Privacy Act and will be subject to its penalty scheme, with fines of up to AUD50 million as of 2024. The information commissioner will hold new information-gathering powers and may issue notifications to a platform and publicly release information if satisfied that the platform has contravened the law.
Striking the right balance to achieve its goals?
The controversial law has drawn mixed reviews over whether the strict ban adequately serves its intention to protect children's safety, health and well-being, or instead causes more harm and introduces new risks. Is the law adequately comprehensive, given its specific scope and exclusions outlined above? Does it instead compromise children's autonomy and access to the digital world and all its benefits, including learning and growing through a community?
Do we give too little credit to digitally savvy children who will find alternative avenues to connect online, making the ban futile? Does the law impose significant risks and burdens on social media platforms, which must collect and use personal and sensitive information and establish bespoke age-restriction processes? Are there other ways to protect children online that should be pursued, including through existing privacy and safety laws, age-appropriate accessibility, safety-by-design practices, and public education and advocacy?
The eSafety Commissioner administers the Online Safety Act and provides a suite of educational content and services on children's online safety, including for educators and parents. In a LinkedIn post, Privacy Commissioner Carly Kind highlighted that laws, not bans, make kids safer online. These include privacy laws and the upcoming Children's Online Privacy Code, one piece of the puzzle in shaping an online environment that is a better place for children.
International legal counterparts, notably the U.K. Online Safety Act and the EU Digital Services Act, cover a broader range of entities and introduce a more comprehensive set of privacy and safety requirements beyond age restrictions. Legislative rules will further define the contours of the new law, and information commissioner guidance is likely to follow. The law must also undergo an independent review within two years of coming into effect.
Whether the ban is the right move remains to be seen, and other avenues to achieve its goals must certainly be explored.
Ilana Singer, CIPP/E, CIPM, CIPT, is a senior advisor, privacy and data, at SEEK.