What to Expect from the Review of Australia’s Privacy Act
The author thanks Anna Johnston and Alex Kotova (Salinger Privacy) for their review and comments and Gabriela Zanfir-Fortuna, Josh Lee Kok Thong, Lee Matheson, and Isabella Perera (FPF) for their support with editing this post.
On February 16, 2023, Australia’s Attorney-General’s Department (AGD) released a final report (Review Report) on its multi-year review of Australia’s main privacy law, the Privacy Act 1988 (Privacy Act). The Review Report presents over a hundred concrete proposals to reform the Privacy Act.
A common trend across many of these proposals is the AGD’s desire to bring the protections in the Privacy Act, which was first enacted over 30 years ago, closer to those provided by other major data protection laws internationally, in particular, the EU’s General Data Protection Regulation (GDPR). Notably, the Review Report proposes:
- Broadening the Privacy Act’s definition of “personal information;”
- Introducing the concepts of “controllers” and “processors” into the Act;
- Adding a right to erasure, a right to be delisted from online search engines (a “right to be forgotten”), and a right to object to certain forms of processing, including automated decision-making (ADM), on top of the Act’s existing rights to access and correction; and
- Including a new requirement to conduct a Privacy Impact Assessment (PIA) for high-risk activities.
Other notable proposals in the Review Report include heightened transparency requirements for ADM, and the introduction of a direct right of action for breaches of the APPs and a separate statutory tort for serious invasions of privacy.
The Australian Government will likely translate at least some of the Review Report’s proposals into a Bill to amend the Privacy Act, which may be introduced in Parliament later in 2023.
Australia’s Privacy Act, which was passed by the Australian Parliament in 1988 and took effect in 1989, was one of the world’s first data protection laws.
In the three decades since the Act was passed, it has been regularly updated through legislative amendments. While the Act originally only extended to federal government agencies, major reforms in 2000 expanded the scope of the Act to the private sector for the first time. Further amendments in 2014 introduced a unified set of 13 “Australian Privacy Principles” (APPs) applying to all “APP entities” – a broad term which comprises a wide range of public- and private-sector entities other than small businesses and registered political parties.
The APPs establish rights and obligations regarding:
- Collection, use, and disclosure of personal information;
- Use of personal information for direct marketing;
- Cross-border transfer of personal information;
- Quality and security of personal information; and
- Access to and correction of personal information.
The latest round of amendments to the Privacy Act, which were passed in December 2022 in response to several high-profile data breaches, increased penalties under the Act and expanded the enforcement powers of the privacy authority, the Office of the Australian Information Commissioner (OAIC).
In 2019, the Australian Competition and Consumer Commission (ACCC) published a final report on its three-year “Digital Platforms Inquiry” (DPI Report), which focused on the impact of online search engines, social media platforms, and other digital aggregation platforms on competition in the advertising and media markets. Broadly, the DPI Report recommended amending the Privacy Act to increase protection of personal information, including strengthening notice and consent requirements, enabling individuals to request erasure of their personal information, and providing individuals with a right of action against entities that interfere with their privacy.
In 2020, the AGD initiated a comprehensive review of the Privacy Act in response to the ACCC’s recommendations in the DPI Report. To that end, the AGD held two rounds of public consultation before releasing its final Review Report.
- In October 2020, the AGD released an issues paper, which outlined the present state of Australia’s privacy law and sought feedback on potential reforms across a broad range of issues, including the scope and application of the Privacy Act, the protections afforded by the Privacy Act, and how the Privacy Act is regulated and enforced.
- In October 2021, the AGD released a discussion paper, which presented more detailed proposals for reform to the Privacy Act in 28 distinct areas, based on feedback from the public and private sectors, civil society, and academia in response to the issues paper.
- Between January 2022 and February 2023, the AGD considered feedback on the discussion paper and convened roundtables and consultations with stakeholders on specific issues.
Overview of the Review Report’s Proposals
The Report makes 116 proposals to reform the Privacy Act in 28 key areas. Broadly, these proposals aim to strengthen the protection of personal information and individuals’ control over how their personal information is collected, used, and disclosed.
Several of the most notable proposals from the Review Report include:
- Amending the definition of “personal information” to broaden it and bring it in line with other modern data protection laws, like the GDPR;
- New requirements around de-identified information;
- Removing the “small business exemption” for private-sector entities with an annual turnover below AU$3 million;
- Amending the Privacy Act’s notice, consent, and transparency requirements;
- A new “fair and reasonable” test for all forms of processing of personal information;
- A new requirement to conduct a PIA for high-risk activities;
- New requirements concerning children’s privacy;
- New rights for individuals, such as the rights to erasure of one’s personal information, to be delisted from online search engines, and to object to certain forms of processing, including ADM;
- New transparency requirements for ADM;
- New protections against direct marketing, targeting, and trading in personal information;
- A new controller-processor distinction;
- Refining the Privacy Act’s cross-border data transfer mechanisms;
- Introducing a tiered penalty framework;
- Introducing a direct right of action for breaches of the APPs and a separate statutory tort for serious invasions of privacy; and
- Amending the Privacy Act’s data breach notification requirements.
1. The definition of “personal information” would be broadened (Proposals 4.1-4.4)
The Review Report proposes amending the definition of “personal information” in the Privacy Act so that the term would no longer refer to information “about” an individual but, rather, information that “relates to” an individual. This proposed change would bring the Privacy Act’s definition of “personal information” closer to the definitions of similar terms in other leading data protection laws, such as the GDPR.
The Report also makes several other proposals to clarify the scope of the term “personal information,” including suggesting examples of personal information to be included in the Act and encouraging the OAIC to issue guidance on the relevant factors for determining whether information relates to an individual and how entities should assess whether an individual is “reasonably identifiable.”
2. De-identified information would be defined and included under some of the provisions of the Privacy Act (Proposals 4.5-4.8)
The Review Report proposes:
- Clarifying the definition of de-identified information; and
- Extending the privacy protections in APPs 8 (cross‑border disclosure) and 11 (security) to de-identified information.
In particular, APP entities would be required to take reasonable steps to protect de-identified information from:
- Misuse, interference, and loss; and
- Unauthorized re-identification, access, modification, or disclosure.
The Report proposes that, when disclosing de-identified information overseas, an APP entity would be required to take reasonable steps to ensure that the overseas recipient does not breach the APPs, including ensuring that the recipient does not re-identify the information or further disclose the information in a way that would undermine the effectiveness of de-identification.
The Report also proposes prohibiting re-identification of de-identified information, subject to exceptions, and holding further consultations on creating a criminal offense for malicious re-identification.
3. The small business exemption would be removed (Proposals 6.1-6.2)
The Privacy Act currently applies to private-sector entities only if they have an annual turnover of AU$3 million or above, or if they undertake certain activities, such as providing a health service. The Review Report proposes removing the exemption for small businesses whose annual turnover is below AU$3 million, based on community expectations that entities should protect personal information regardless of their annual turnover, and on the risk posed by serious data breaches.
However, the Review Report also proposes that before the exemption is removed, an impact assessment should be conducted, and other measures should be undertaken, to ensure that small businesses are in a position to comply with the Privacy Act’s requirements.
4. The requirements for notice, consent, and transparency would be toughened (Proposals 10-11)
The current version of the Privacy Act generally requires an entity to provide individuals with a collection notice containing certain prescribed information when the entity collects personal information directly from those individuals (see APP 5).
However, the Act generally only requires the entity to obtain the individuals’ consent in a limited set of circumstances, such as collection of sensitive information, and use or disclosure of personal information for a secondary purpose (see APP 6) or for direct marketing (see APP 7).
The Review Report does not alter these fundamental requirements but instead proposes clarifying them by:
- Introducing an express requirement that notices at the time of collection of personal information should be clear, up-to-date, concise, and understandable, and should implement appropriate accessibility measures;
- Expanding the items of information that must be provided in a collection notice under APP 5 to include:
- Whether the entity processes personal information for a “high risk privacy activity”;
- The existence of individuals’ rights under the Privacy Act; and
- The types of information that may be disclosed to overseas recipients;
- Amending the definition of consent to codify existing OAIC guidance that consent must be voluntary, informed, current, specific, and unambiguous; and
- Expressly requiring that consent must be capable of being withdrawn as easily as the consent was originally given.
The Report also encourages the OAIC to develop guidance on how online services should design consent requests, and on standardized templates and layouts for privacy policies and collection notices, using standardized terminology and icons.
5. A requirement that collection, use, and disclosure of personal information must be “fair and reasonable in the circumstances” would be introduced (Proposal 12)
The Privacy Act currently requires that the collection of personal information must be done by lawful and fair means (APP 3.5). The Review Report proposes replacing this with a much broader requirement that any collection, use, or disclosure of personal information by an APP entity must be “fair and reasonable in the circumstances.” This requirement would apply regardless of whether the entity had obtained consent for collecting, using, or disclosing the personal information in question.
The Report proposes that the fairness and reasonableness of any given processing activity be assessed objectively, based on some or all of the following factors:
- Whether an individual would reasonably expect the personal information to be collected, used, or disclosed in the circumstances;
- The kind, sensitivity, and amount of personal information being collected, used, or disclosed;
- Whether the collection, use, or disclosure is reasonably necessary for, or directly related to, the functions and activities of the entity;
- The risk of unjustified adverse impact or harm;
- Whether the impact on privacy is proportionate to the benefit;
- If the personal information relates to a child, whether the collection, use, or disclosure of the personal information is in the best interests of the child; and
- The objectives of the Privacy Act.
6. A PIA would be required for high-risk activities (Proposal 13.1)
The Review Report proposes introducing a new requirement that entities conduct a PIA prior to commencing any activities that are “likely to have a significant impact on the privacy of individuals,” and provide the PIA to the OAIC on request.
The Report also recommends that the OAIC should issue guidance on the relevant factors for determining whether an activity is likely to have a significant impact on individuals’ privacy and provide examples of such activities, which may include:
- Collection, use, or disclosure of sensitive information or of children’s personal information on a large scale;
- Online tracking, profiling, and delivery of personalized content and advertising to individuals;
- Ongoing or real-time tracking of an individual’s geolocation;
- Use of biometric templates or information for the purposes of verification or identification, or collection of biometric information in publicly accessible spaces;
- Sale of personal information; and
- Collection, use, or disclosure of personal information for the purposes of ADM with legal or other significant effects.
7. The Review Report seeks to enhance children’s privacy (Proposals 16 and 20)
Proposal 16 of the Review Report would introduce several new provisions on children’s privacy into the Privacy Act, including a statutory definition of a “child” as an individual who has not reached 18 years of age.
On children’s capacity to consent to the processing of their personal information, the Review Report proposes introducing an express provision in the Privacy Act stating that consent is valid only if:
- The individual who gave the consent had the capacity to do so; and/or
- It is reasonable to expect that the individual would understand the nature, purpose, and consequences of the processing of their personal data, to which they gave consent.
This provision would also cover circumstances in which it would be appropriate or inappropriate for an entity to obtain consent from a child’s parent or guardian.
In line with the Review Report’s proposal to grant the OAIC the power to determine codes of practice, the Report proposes that the OAIC should issue a “Children’s Online Privacy Code” for online services that are likely to be accessed by children. The Code would align with the UK’s Age Appropriate Design Code, and would provide guidance on how collection notices and privacy policies should be designed with children in mind.
Proposal 20 of the Report would also prohibit:
- Direct marketing to children, unless the personal information used for direct marketing was collected directly from the child and the direct marketing is in the child’s best interests;
- Trading in the personal information of children;
- Targeting children, unless the targeting is in the children’s best interests; and
- Targeting based on sensitive personal information, unless the content is socially beneficial.
8. A right to erasure and a right to be delisted would be introduced (Proposals 18, 19, and 20)
The Privacy Act currently provides individuals with the rights to access personal information about them that is held by an APP entity (see APP 12) and, if the information is inaccurate, out-of-date, incomplete, irrelevant, or misleading, to require the APP entity to correct the information (see APP 13).
The Review Report proposes establishing several new rights for individuals over their personal information in addition to those under APPs 12 and 13. The proposed new rights would include the rights to:
- Require an entity holding the requestor’s personal information to explain:
- The source of that information, if the information was collected indirectly; and
- What the entity has done with the information;
- Require the entity to erase the information and to notify any third parties from whom the entity collected the information, or to whom it disclosed the information, of the erasure request;
- Object to the collection, use, or disclosure of one’s personal information by the entity, and receive a written response from the entity; and
- Request that an online search engine de-index search results containing personal information which is:
- About a child;
- Excessively detailed; or
- Inaccurate, out-of-date, incomplete, irrelevant, or misleading (i.e., a “right to be forgotten”).
These rights would be subject to exceptions for countervailing public interest or legal interests, or where compliance would be impossible, unreasonable, frivolous, or vexatious.
The Review Report also proposes introducing new obligations for entities to help individuals to exercise their rights under the Privacy Act, including obligations to:
- Inform individuals about their rights and how to obtain further information on them, including how to exercise them, at the point of collection of personal information;
- Provide reasonable assistance to individuals in exercising their rights;
- Take reasonable steps to respond to requests from individuals for exercise of their rights within a reasonable timeframe, including acknowledging receipt of the request; and
- If the entity refuses such a request, provide an explanation for the refusal and explain to the individual how to lodge a complaint with the OAIC regarding the refusal.
9. A right to explanation of substantially automated decision-making is proposed (Proposal 19)
The Review Report proposes introducing several new transparency requirements concerning the use of ADM in processing of personal information. In particular:
- Privacy policies would have to set out:
- The types of personal information that would be used in substantially automated decisions that have a legal or similarly significant effect on individuals’ rights (the Privacy Act would also be amended to include high-level indicators of the types of decisions that would meet this standard); and
- How these types of personal information are used in making such decisions; and
- Individuals would have the right to request meaningful information about how substantially automated decisions with legal or similarly significant effect are made.
10. Absolute opt-out from direct marketing and targeting is included in the reform (Proposal 20)
The Review Report proposes expanding the Privacy Act’s provision on direct marketing to:
- “Targeting,” which would be defined as the collection, use, or disclosure of information that relates to an individual to tailor services, content, information, advertisements, or offers provided to or withheld from an individual, either alone or as a member of a group or class; and
- “Trading” in one’s personal information, which would be broadly defined as disclosure of personal information for a benefit, service, or advantage.
The Report proposes that the Privacy Act should grant individuals an unqualified right to opt out of the use and disclosure of their personal information for direct marketing purposes and out of receiving targeted advertising. Entities would also be required to obtain individuals’ consent before trading in their personal information and, more broadly, to provide information about targeting, including their use of algorithms and profiling to recommend content.
11. A controller-processor distinction would be made (Proposal 22)
Currently, the APPs apply to entities that “hold” personal information. This includes both entities that control such information and those that simply possess a record of it. During the review of the Privacy Act, the AGD received feedback that this scope of application presents compliance challenges for APP entities that hold an individual’s personal information but do not have a direct relationship with the individual (e.g., outsourced service providers).
The Review Report, therefore, proposes introducing the concepts of “controllers” and “processors” into the Privacy Act. This would expand the scope of the Privacy Act to non-APP entities that process personal information on behalf of APP entity controllers and would bring the Act closer to other data protection laws that recognize a controller-processor distinction, such as the GDPR and the data protection laws of Brazil, Hong Kong, Japan, New Zealand, Singapore, and South Korea.
If the AGD’s proposal is adopted in its current form, the Privacy Act would be amended to include the new concepts of:
- “APP entity controllers,” which would be subject to all obligations in the APPs; and
- “APP entity processors,” which would be subject only to the obligations to manage personal information in an open and transparent manner (APP 1), secure personal information held by them (APP 11), and report eligible data breaches under the Notifiable Data Breaches (NDB) Scheme.
12. New requirements around cross-border data transfers would be added (Proposal 23)
The Review Report recommends retaining the Privacy Act’s existing framework for cross-border data transfers (see APP 8) but proposes several additions to this framework, including:
- Establishing a mechanism comparable to the EU’s adequacy decisions to assess whether other countries and certification schemes provide substantially similar protection to that provided by the APPs;
- Making standard contractual clauses for cross-border data transfers available to APP entities; and
- Increasing notification requirements around cross-border data transfers by requiring APP entities to specify in their collection notices under APP 5 the types of personal information that may be disclosed to overseas recipients.
Notably, in its 2021 discussion paper, the AGD proposed removing consent as a basis for transferring personal information out of Australia in situations where the entity has not taken reasonable steps to ensure the overseas recipient does not breach the APPs.
As this proposal met resistance from numerous stakeholders, the Review Report instead proposes retaining consent as one of several bases for transferring personal information out of Australia. The AGD also proposes, however, requiring entities that seek to rely on consent for this purpose to consider, and specifically inform individuals of, any privacy risks that may result from cross-border transfers of their personal information.
13. Penalties would be restructured (Proposal 25)
The Review Report proposes replacing the Privacy Act’s existing penalty framework with a three-tiered framework on a scale of severity, comprising:
- Low-tier, covering specific administrative breaches of the Privacy Act, such as:
- Failure to give individuals an option not to identify themselves;
- Failure to make a written note of a use or disclosure of personal information under APP 6.2(e); and
- Failure to deal with correction requests in specified timeframes;
- Mid-tier, covering interferences with privacy that lack a “serious” element; and
- Top-tier, covering serious interferences with privacy, which may include:
- Those involving “sensitive information” or other information of a sensitive nature;
- Those adversely affecting large groups of individuals;
- Those impacting people experiencing vulnerability;
- Repeated breaches;
- Willful misconduct; and
- Serious failures to take proper steps to protect personal information.
As an alternative to issuing low-tier penalties, the Review Report proposes empowering the OAIC to issue infringement notices. The amount payable under an infringement notice is typically 20% or less of the maximum amount of the related civil penalty provision.
14. A direct right of action and a statutory tort for serious invasions of privacy would be introduced (Proposals 26 and 27)
The Review Report proposes introducing:
- A direct right of action, which would allow individuals to seek relief from the courts for an interference with privacy; and
- A statutory tort for serious invasions of privacy, based on a model proposed by the Australian Law Reform Commission in 2014.
The proposed direct right of action would be available to any individual or group of individuals (i.e., in a class action) who have suffered loss or damage due to a privacy interference by an APP entity. Loss or damage would need to be established within the existing meaning of the Act, including injury to the person’s feelings or humiliation.
To exercise the direct right of action, a claimant would first need to make a complaint to the OAIC and have their complaint assessed for conciliation by the OAIC or a recognized External Dispute Resolution scheme. If the complaint is deemed unsuitable for conciliation, or if conciliation is unlikely to resolve the dispute, the complainant would have the option to pursue the matter further in the Federal Court or the Federal Circuit and Family Court of Australia. Available remedies would include any order the Court sees fit, including an award of damages of any amount.
The proposed statutory tort for serious invasions of privacy would require a claimant to prove that there had been an intrusion into seclusion or misuse of the claimant’s private information that was committed intentionally or recklessly, in circumstances where the claimant otherwise had a reasonable expectation of privacy. The claimant would not need to prove that the invasion caused actual damage, as damages could be awarded for emotional distress. However, the claim would be subject to a “balancing exercise” in which the Court would need to be satisfied that the public interest in privacy outweighs any countervailing public interests.
Proposed defenses to the statutory tort would include:
- Lawful authority;
- Conduct incidental to the defense of persons or property;
- Absolute privilege;
- Publication of public documents; and
- Fair reporting of public proceedings.
15. Data breach notification requirements would be clarified (Proposal 28)
The Review Report also proposes amending Section 26WK(2)(b) of the Privacy Act to require an entity to prepare a statement regarding an eligible data breach and give a copy of the statement to the Information Commissioner within 72 hours.
The Report also proposes introducing new requirements that the statement must set out the steps that the entity has taken or intends to take in response to the breach, including, where appropriate, steps to reduce any adverse impacts on the individuals to whom the relevant information relates. This requirement would be subject to exceptions for any disclosure that would require the entity to reveal personal information or where the harm from disclosure would outweigh any benefit.
While there is still some way to go before these proposals are reflected in actual legislation, several observations can be made. First, the proposed changes in the Review Report represent some of the most extensive proposed reforms to the Privacy Act since its enactment. Second, while several of these reforms bring some parts of Australia’s privacy regime closer in line with other global equivalents like the GDPR (such as in the case of the definition of “personal information” and the controller-processor distinction), they also continue to ensure that Australia’s privacy regime remains uniquely Australian (such as the “fair and reasonable” requirement for the processing of personal data). Third, these proposals come at a time when Australia has been rocked by major data breaches, such as the Optus and Medibank data breaches, and more recently, the Latitude Financial data breach in March 2023. These data breaches may supply additional political will to implement these changes to Australia’s privacy regime. With the government’s response to these proposals expected sometime in 2023 or 2024, the FPF APAC office will continue to track these developments closely.