There is global momentum to bolster children's privacy protections — including in Australia. In countries such as the U.K. and the U.S., privacy reforms have sought to advance the best interests of the child, create a safe space for children to thrive online and empower responsible parents and caregivers. Australia's current privacy regulatory landscape requires an overhaul to pursue these goals.
According to the Australian Community Attitudes to Privacy Survey, released in August 2023, protecting their child's personal information is a major concern for 79% of Australian parents, yet only 50% of parents feel they are able to protect their child's privacy. For 91% of parents, privacy is of high importance when deciding whether to give their child access to digital devices and services.
The increased use of artificial intelligence, generative AI technology and social media apps by children presents new privacy risks and exposes them to harms, including economic exploitation, exposure to dark patterns and explicit material, bullying and harassment (such as deepfakes and doxxing), emotional distress, and threats to physical safety.
The use of educational technology in Australian schools has also risen. A recent Human Rights Watch report found 89% of the educational apps and websites it assessed globally, including those used by Australian schools, put children's privacy at risk or directly violated it by using personal information for purposes unrelated to education, including extensive tracking and sharing of personal information with advertising technology companies.
Recent regulatory changes and proposed privacy reforms in Australia intend to address emerging privacy risks and potential harms to children, with a focus on digital harms.
Privacy Act
Australia's Privacy Act 1988 does not currently address children's privacy specifically. Along with state health information laws, the act does require that an individual have capacity to consent where consent is relied on as a lawful basis to collect, use or disclose personal information, which occurs in limited scenarios. Capacity is determined by the organization or agency handling the personal information.
As there is no prescribed age for determining capacity, the Office of the Australian Information Commissioner recommends the following general rules: an individual under the age of 18 has the capacity to consent if they have the maturity to understand what is being proposed; if a child lacks that maturity, a parent or guardian may be able to consent on their behalf; and if it is not practical to assess the capacity of individuals on a case-by-case basis, it can be assumed that an individual aged 15 or over has capacity.
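For an online service that cannot practically assess each user's maturity, the OAIC rule of thumb could be operationalized roughly as sketched below. This is purely illustrative and not OAIC guidance or a legal standard; the assessConsentPath helper, its parameters and the threshold constants are assumptions made for this example only.

```typescript
// Illustrative sketch of the OAIC rule of thumb on capacity to consent.
// Not legal advice: names, thresholds and logic are assumptions for illustration only.

const AGE_OF_MAJORITY = 18;       // proposed reforms would define a child as an individual under 18
const PRESUMED_CAPACITY_AGE = 15; // OAIC: capacity may be assumed at 15+ where individual assessment is impractical

type ConsentPath =
  | "individual_consent"          // the individual can consent for themselves
  | "parent_or_guardian_consent"  // consent should be sought from a parent or guardian
  | "individual_assessment";      // maturity should be assessed case by case

function assessConsentPath(
  age: number,
  canAssessIndividually: boolean,
  hasMaturity?: boolean
): ConsentPath {
  if (age >= AGE_OF_MAJORITY) return "individual_consent";

  if (canAssessIndividually) {
    // General rule: an under-18 with the maturity to understand what is proposed can consent.
    if (hasMaturity === undefined) return "individual_assessment";
    return hasMaturity ? "individual_consent" : "parent_or_guardian_consent";
  }

  // Where case-by-case assessment is impractical, capacity may be assumed at 15 or over.
  return age >= PRESUMED_CAPACITY_AGE ? "individual_consent" : "parent_or_guardian_consent";
}
```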
The Privacy Act contains a permitted exemption allowing organizations providing a health service to collect a child's health information from, or disclose it to, a responsible person, including a parent or guardian. The OAIC recommends, as a general rule, that when deciding whether to disclose health information, a health service provider should also consider the child's degree of autonomy, their understanding of the relevant issues and circumstances, and the nature of the information being handled. It gives the example of a child who explicitly asks for information, such as a pregnancy or mental illness, to be kept confidential as a reason not to disclose that information to a parent. If a child is determined to lack capacity, they may still be able to contribute to decisions and should be involved in the decision-making process to the extent possible.
eSafety Commissioner and the Online Safety Act
The eSafety Commissioner has pioneered Safety by Design to make digital spaces safer and more inclusive, and to protect vulnerable persons such as children. Its broad advocacy and powers relating to online safety intersect with privacy goals to protect children from harms online, though the eSafety Commissioner has highlighted that safety and privacy are related, but distinct, concepts.
Australia's Online Safety Act 2021 gave the eSafety Commissioner new powers to protect the online safety of all Australians, with a particular focus on protecting children from online abuse and exposure to harmful content. Under the act, the eSafety Commissioner can require online service providers to report on how they comply with the Basic Online Safety Expectations. The expectations include protections for children from content that is not age appropriate. Reasonable steps online service providers may take to meet these expectations depend on the nature of the business, but may include ensuring the default privacy and safety settings of services targeted at, or used by, children are robust and set to the most restrictive level.
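For context, "most restrictive by default" might look something like the following for an account belonging to, or likely to be used by, a child. This is a minimal, hypothetical sketch; the setting names and values are assumptions for illustration, not requirements drawn from the Basic Online Safety Expectations.

```typescript
// Hypothetical restrictive-by-default settings for a child's account.
// Setting names and values are illustrative assumptions, not regulatory requirements.

interface ChildAccountDefaults {
  profileVisibility: "private" | "contacts" | "public";
  directMessages: "no_one" | "contacts_only" | "everyone";
  locationSharing: boolean;
  personalisedAdvertising: boolean;
  dataSharingWithThirdParties: boolean;
  discoverableBySearch: boolean;
}

// The most restrictive values are applied by default; a parent, guardian or the user
// (where appropriate) would have to actively opt in to anything less restrictive.
const CHILD_ACCOUNT_DEFAULTS: ChildAccountDefaults = {
  profileVisibility: "private",
  directMessages: "no_one",
  locationSharing: false,
  personalisedAdvertising: false,
  dataSharingWithThirdParties: false,
  discoverableBySearch: false,
};
```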
In March 2023, the eSafety Commissioner submitted a roadmap on age verification to the government, aimed in particular at preventing and mitigating harm to children from online pornography.
Proposed Privacy Act reforms and Children's Online Privacy Code
In 2019, the Australian Competition and Consumer Commission released its Digital Platforms Inquiry Report and in February 2023 the Attorney-General's Department publicly released the Privacy Act Review Report, both of which raised the need to better protect children's privacy online. In response to the attorney-general's report, the Australian government in September 2023 recognized "(c)hildren are particularly vulnerable to online harms. Children increasingly rely on online platforms, social media, mobile applications and other internet connected devices in their everyday lives."
The government agreed to include in its proposed Privacy Act reforms a definition of a child as an individual under 18 and to introduce a Children's Online Privacy Code that applies to online services "likely to be accessed by children." To the extent possible, the scope of the code is to align with the U.K. Age-Appropriate Design Code. The government's intention to create a code that is consistent with, and adapted from, global standards should make compliance more practical for Australian entities with a global presence.
The government also agreed in principle that:
- The act should codify that valid consent must be given with capacity, including in the case of children.
- Collection notices and privacy policies must be clear and understandable to their audience, particularly any information addressed specifically to a child.
- Entities should have regard to the best interests of the child when determining whether the collection, use or disclosure of personal information relating to a child is fair and reasonable in the circumstances.
- A right to de-index online search results containing personal information about a child should be introduced.
- Direct marketing to a child should be prohibited unless the personal information used for direct marketing was collected directly from the child and the direct marketing is in the child's best interests.
- Targeting to a child should be prohibited, with an exception for targeting that is in the child's best interests.
- Trading in the personal information of children should be prohibited.
Keeping the momentum going
The office of the eSafety Commissioner has said it will work with the tech industry to develop codes to help online service providers comply with obligations under the new Online Safety Act, which may include further children's privacy-related requirements. On 20 Feb., Australia and the U.K. signed an Online Safety and Security Memorandum of Understanding to advance online safety, including a commitment to work together to protect children's safety and privacy through regulatory coherence between the two countries.
Australian Prime Minister Anthony Albanese also recently proposed bringing forward legislation in response to the Privacy Act review, including laws to prevent doxxing. A recent open letter from children's privacy advocates to the attorney general, minister for families and social services, and minister for communications voices concern over the "extensive process" for privacy reform in Australia and the urgency of protecting children's privacy in a data-driven economy.
Further consultation and legislative proposals are anticipated, and the government agreed those developing a Children's Online Privacy Code will be required to consult broadly with children, parents, experts, advocates, industry and the eSafety Commissioner. Australian businesses providing services that are likely to be accessed by children should start to evaluate their privacy practices and keep a watch on how local and global standards evolve.