Australia: Privacy Act faces comprehensive review
Australia’s privacy laws
In Australia, privacy is largely regulated at a federal (Commonwealth) level. The key legislation is the Privacy Act 1988 (Cth) (the Privacy Act), which applies to most Commonwealth public sector entities and to private sector entities (including not-for-profits) with an annual turnover of more than A$3 million or that meet other specified criteria, such as businesses that provide health services or are credit reporting bodies. Regulated entities are referred to as ‘APP entities’. The Privacy Act contains 13 Australian Privacy Principles (APPs), a mandatory notifiable data breach scheme and a regime for credit reporting.
In addition to the Privacy Act, a number of other federal statutes regulate privacy in specific contexts:
- The Data Availability and Transparency Act 2022 (DATA), which establishes a regime for the sharing of data by Commonwealth public sector entities and contains privacy protections specific to data sharing under that scheme. The National Data Commissioner, established under the DATA, is responsible for its enforcement.
- Part IVD of the Competition and Consumer Act 2010 (Cth) (CCA), which contains an additional privacy regime applicable to data shared under the Consumer Data Right (CDR). The CDR gives individuals the right to share their data with different service providers in designated sectors, including banking.
- The My Health Records Act 2012 (Cth) (the My Health Records Act) creates a privacy regime for the federal government’s digital health records scheme, My Health Record.
- The Telecommunications Act 1997 (Cth) (the Telecommunications Act) includes additional protections for certain personal information related to telecommunications services.
Other legislation applies in limited cases, such as the Data-matching Program (Assistance and Tax) Act 1990 (Cth) (the Data-matching Act), which applies to certain data matching by government agencies.
All states and territories (other than Western Australia and South Australia) have privacy legislation that applies to the handling of personal information by the relevant state or territory public sector and, in certain cases, private sector health service providers.
Australia’s regime is not modelled on the EU’s General Data Protection Regulation (GDPR) and contains some key differences. For example, the Privacy Act does not distinguish between data controllers and data processors but instead regulates all entities that collect and hold personal information in the same manner. The Privacy Act has more limited rights for individuals – among other limitations, it does not include a right to erasure.
International privacy commitments
Australia is:
- a signatory to the Asia-Pacific Economic Cooperation (APEC) Cross-Border Privacy Rules system, a government-backed data privacy certification scheme that companies may join to demonstrate compliance with internationally recognised data privacy protections;
- a participant in the Global Privacy Assembly’s Global Cross Border Enforcement Cooperation Arrangement (GCBECA), with other participants including Canada, Germany and the UK. This provides a framework for privacy regulators to work together on cross-border enforcement of privacy laws;
- a participant in the APEC Cross-Border Privacy Enforcement Arrangement, which, like the GCBECA, provides a framework for the cross-border enforcement of privacy laws; and
- party to an agreement with the EU on the Processing and Transfer of Passenger Name Record (PNR) Data by air carriers to the Australian Customs and Border Protection Service. This agreement authorises the transfer of PNR data to the Australian Department of Home Affairs from airlines that process PNR data in the EU and allows the Department to provide that data to other Australian and foreign government agencies provided safeguards in the agreement are complied with.
Australia has also entered into non-binding memoranda of understanding with other jurisdictions in relation to data governance and cross-border information flow.
Regulator and enforcement powers
The Australian Information Commissioner and Privacy Commissioner (the Commissioner), appointed under the Australian Information Commissioner Act 2010 (Cth) (the AIC Act), is responsible for the enforcement of the Privacy Act, the My Health Records Act and the CDR privacy regime under the CCA. The Commissioner also has regulatory responsibilities under the Crimes Act 1914 (Cth), the Data-matching Act, the National Health Act 1953 (Cth) and the Telecommunications Act. The Commissioner is supported by the Office of the Australian Information Commissioner (OAIC), which is also established under the AIC Act.
In May 2023, the government announced that it will appoint a stand-alone Privacy Commissioner to tackle the growing threats to data security and the increasing volume and complexity of privacy issues. This appointment will restore the OAIC to the three-commissioner model originally envisaged under the legislation. Currently, the Information Commissioner holds a dual appointment as the Privacy Commissioner.
Recent large-scale data breaches affecting major corporations have led to an increase in class action lawsuits and prompted the government to appoint Australia’s first cybersecurity coordinator to assist in managing such breaches.
The Privacy Act regime encourages individuals to complain directly to the relevant APP entity if they believe that entity has interfered with their privacy. In most cases, the Commissioner will not commence investigation of a complaint made under section 36 of the Privacy Act unless the individual has first complained to the relevant APP entity. Section 40A of the Privacy Act requires the Commissioner first to make a reasonable attempt to conciliate a complaint where the Commissioner believes it could be successfully conciliated.
If a complaint moves past the conciliation phase, the Commissioner (supported by the OAIC) must investigate it, subject to limited exceptions. The Commissioner may also investigate possible breaches of the Privacy Act on their own initiative, through a Commissioner-initiated investigation.
If the Commissioner determines a breach has occurred following an investigation, they may make certain declarations, such as requiring the entity in breach to take steps to ensure the breach is not repeated or continued and to provide compensation to impacted individuals. The Commissioner may also:
- accept court-enforceable undertakings requiring compliance with the Privacy Act;
- seek injunctions to prevent ongoing or potential breaches of the Privacy Act; and
- seek civil penalties for serious or repeated interferences with the privacy of individuals and specified breaches of the credit-reporting provisions of the Privacy Act.
Enforcement proceedings must be brought in the Federal Court of Australia or the Federal Circuit and Family Court of Australia.
Following recent large-scale data breaches, the Privacy Legislation Amendment (Enforcement and Other Measures) Act 2022 introduced significantly increased penalties for serious or repeated privacy breaches and greater powers for the Australian Information Commissioner when making determinations at the conclusion of investigations.
The Commissioner may seek civil penalties for serious or repeated interferences with privacy not exceeding the greater of the following (illustrated in the sketch after this list):
- A$50 million;
- three times the value of the benefit obtained by the body corporate from the conduct that constituted the serious or repeated interference with privacy (if that value can be determined); or
- 30 per cent of the body corporate’s domestic annual turnover in the 12 months before the contravention occurred (if the court cannot determine the value of the benefit obtained).
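To show how this ‘greater of’ cap operates in practice, the following sketch (in Python, using a hypothetical function name and hypothetical figures, not drawn from the legislation) applies the three limbs summarised above. It is an illustration only: whether the benefit obtained can be valued, and the relevant turnover figure, are matters for the court in each case.

```python
# Illustrative sketch only (hypothetical figures): the maximum civil penalty for a
# body corporate is the greater of A$50 million, three times the benefit obtained
# (where that value can be determined) or 30 per cent of domestic annual turnover
# (where the benefit cannot be valued).

def max_civil_penalty(benefit_value=None, domestic_annual_turnover=0.0):
    base_cap = 50_000_000  # A$50 million
    if benefit_value is not None:
        # Benefit can be determined: compare A$50m with three times the benefit
        return max(base_cap, 3 * benefit_value)
    # Benefit cannot be determined: compare A$50m with 30% of domestic annual turnover
    return max(base_cap, 0.30 * domestic_annual_turnover)

# Example: benefit cannot be valued; domestic annual turnover of A$400 million
print(max_civil_penalty(domestic_annual_turnover=400_000_000))  # 120000000.0, ie A$120 million
```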
The Commissioner has similar enforcement powers under the My Health Records Act and Part IVD of the CCA.
Extraterritorial operation of the Privacy Act
The Privacy Act, and codes registered under the Act, have extraterritorial application (section 5B), as the following are regulated:
- acts or practices of federal government agencies, wherever performed; and
- acts or practices of organisations where an ‘Australian link’ exists.
An Australian link will exist where an organisation (or individual) is an Australian citizen or permanent resident, a partnership or trust established in Australia, a body corporate incorporated in Australia or an unincorporated entity with central management and control in Australia. If this requirement is not satisfied, then an act or practice of an organisation done or engaged in outside Australia will have an Australian link if both:
- the organisation carries on business in Australia; and
- the relevant personal information was collected or held by the organisation in Australia, either before or at the time of the act or practice.
The scope of an Australian link has been considered in two recent matters. The first was the Australian Information Commissioner’s proceedings against Facebook Inc and Facebook Ireland for breaches of the Privacy Act in relation to the Cambridge Analytica scandal. Facebook Inc challenged the proceedings on the basis that it did not have the necessary Australian link because it was not carrying on business in Australia at the time of the alleged breaches. That argument was rejected at first instance and, on appeal, by the Full Federal Court,[1] largely on the basis that there was a prima facie case that Facebook Inc was carrying on a business in Australia because it installed cookies on the devices of Australian users and offered its Graph API in Australia. Facebook Inc was subsequently granted special leave to appeal the Full Federal Court’s decision to the High Court of Australia.
After a change to the Federal Court Rules 2011, the Commissioner applied to revoke the grant of special leave, and the High Court did so on the basis that the matter no longer raised an issue of public importance.[2] The proceedings seeking civil penalties against Facebook Ireland and Facebook Inc are now progressing in the Federal Court.
The second matter involved a determination made by the Commissioner following a Commissioner-initiated investigation of Clearview AI Inc (Clearview),[3] the provider of controversial facial recognition software. As in the Facebook proceedings, Clearview disputed jurisdiction on the basis that it was not carrying on business in Australia. However, the Commissioner found in her determination that Clearview was carrying on business in Australia. Clearview unsuccessfully challenged the determination in the Administrative Appeals Tribunal, which confirmed that Clearview’s operations had an Australian link because Clearview was carrying on a business in Australia by extracting value from servers located in Australia. The Tribunal found that each interaction between Clearview’s web crawlers and an Australian server constituted a transaction, and that the repeated transactions constituted the carrying on of a business in Australia.[4]
What is personal information under the Privacy Act?
The Privacy Act regulates the collection, use, transfer, storage and destruction of personal information. ‘Personal information’ is defined in the Privacy Act to mean information or an opinion about an identified individual, or an individual who is reasonably identifiable, whether or not the information is true or recorded in a material form.
Whether an individual is reasonably identifiable must be assessed on a case-by-case basis and may depend on whether the holder of the information also retains other data that could allow the individual to be identified. An example the Commissioner uses to demonstrate this is a car licence plate number. Third parties would generally not be able to use the number to identify the owner of the vehicle, so in their hands it would not be personal information; however, a third party with access to the relevant vehicle registration database could potentially identify the owner, making the licence plate number personal information in the hands of that third party.[5]
Extra protections are provided under the Privacy Act in relation to sensitive information, which is a subset of personal information and includes:
- information or an opinion about a person’s racial or ethnic origin, political opinions, religious beliefs, trade union or other professional or trade association membership, sexual orientation or practices, or criminal record, provided it is also personal information;
- health and genetic information about a person; and
- biometric information used for verification or identification and biometric templates.
Key APPs
The Privacy Act includes 13 APPs. The key APPs are:
- APP 1: this APP imposes an overall obligation on APP entities to manage personal information openly and transparently and includes a requirement for each APP entity to have a freely accessible privacy policy;
- APPs 3 and 6: these two APPs, in combination, impose obligations on APP entities regarding the collection, use and disclosure of personal information;
- APP 8: this APP regulates offshore disclosures; and
- APP 11: this APP imposes obligations on APP entities regarding both security measures to protect personal information and obligations regarding destruction or de-identification, once personal information is no longer required.
APP 1.2 requires reasonable steps to be taken by APP entities to implement practices to comply with the APPs and any applicable APP codes. The Commissioner regards APP 1.2 as a foundational privacy requirement for APP entities, requiring entities to take a privacy by design approach in their operations. It requires an APP entity to have practices, procedures and systems in place to identify and manage privacy risks through all stages of its management of personal information, from collection to final destruction or de-identification. To comply with APP 1.2, APP entities need to consider and, where appropriate, undertake privacy impact assessments for new projects that include the processing of personal information.
It is not a mandatory requirement of APP 1.2(a) to appoint a data protection or privacy officer, but the OAIC guidelines recommend it as a suitable governance mechanism. The Privacy (Australian Government Agencies – Governance) APP Code 2017 (Cth) (Code), applicable to Commonwealth public agencies, requires those agencies to have at least one privacy officer. The Code also requires that such public agencies have a ‘privacy champion’. A privacy champion is a senior official of the public agency who promotes a privacy culture in the relevant agency and provides leadership on strategic privacy issues.
Each APP entity is required to maintain a privacy policy that sets out, among other things, the types of personal information it collects and holds and the purposes for which it collects, holds, uses and discloses that information (APPs 1.3 and 1.4). This privacy policy must be made available free of charge and in a suitable form, which is typically satisfied by making the policy available on the APP entity’s website (APP 1.5).
APP 1 is supported by APP 5, which requires an APP entity, before or at the time it collects personal information (or, if that is not practicable, as soon as practicable afterwards), to take reasonable steps to notify the relevant individual of (or otherwise ensure they are aware of) the collecting entity’s identity and contact details and, among other things, the purposes of the collection and the persons to whom the information would usually be disclosed, including whether cross-border disclosure is likely.
An agency (ie, a regulated Commonwealth public sector entity) may collect personal information that is not sensitive information if the information is reasonably necessary for, or directly related to, its functions or activities (APP 3.1). Private sector APP entities may collect personal information that is not sensitive information if it is reasonably necessary for the entity’s functions or activities (APP 3.2).
In the case of all APP entities, consent must be obtained from the relevant individual to collect sensitive information (APP 3.3) other than in limited cases, such as if required by law.
In addition:
- personal information must generally be collected directly from the individual (APP 3.6). This ensures that an individual has a choice about whether or not to provide that information; and
- collection must be by lawful and fair means (APP 3.5). For example, an APP entity cannot seek to collect personal information by deception (eg, pretending to collect information for a police agency) or in a way that disrespects a person’s culture.
Personal information may only be used or disclosed for the purposes for which it was collected (the ‘primary purpose’) or for secondary purposes agreed by the relevant individual or otherwise listed in APP 6. An APP entity should disclose its primary purpose in its privacy policy or otherwise at the time of collection. The Commissioner’s APP guidelines[6] provide that, when describing to individuals the purposes for which personal information is collected, and therefore the primary purposes for which it may be used or disclosed, APP entities must not frame these purposes too broadly, such as ‘for carrying on [APP entity’s] business’. APPs 6.2 and 6.3 set out permitted ‘secondary purposes’. For example, if an individual, despite not being informed of a purpose, would reasonably expect the information to be used or disclosed for a particular purpose that is related (or for sensitive information, directly related) to the primary purpose, this will be a permitted secondary purpose.
Generally, there is no restriction on an APP entity transferring personal information outside Australia; however, there is increasing community concern about offshore transfers. APP 8.1 provides that, unless an exemption applies, an APP entity that discloses personal information to a person outside Australia who is not bound by the APPs must take reasonable steps to ensure the offshore recipient does not breach the APPs (other than APP 1) in relation to that information. The transferring APP entity will also be liable for breaches of the APPs by that offshore recipient.
The reasonable steps required to ensure that an offshore recipient does not breach the APPs are typically to impose contractual obligations on that entity and take active steps to monitor and enforce compliance with those obligations.
The most relevant exemptions to APP 8.1 are:
- the APP entity reasonably believes the offshore recipient is subject to laws or a binding scheme similar to the APPs and the relevant individuals are entitled to seek recourse under such laws or scheme (APP 8.2(a)); or
- the relevant individuals expressly agree that the APP entity does not need to take steps to ensure compliance with the APPs by the offshore recipient (APP 8.2(b)).
In a practical sense, APP entities have lesser obligations where the offshore recipient is subject to the laws or binding rules of a jurisdiction providing safeguards at least equivalent to the Privacy Act.
Under APP 11.1, each APP entity is required to take reasonable steps to protect the personal information that it holds from misuse, interference and loss, as well as unauthorised access, modification or disclosure. The OAIC’s view is that the reasonable steps required to comply with APP 11.1 will depend on the relevant circumstances. These circumstances include the nature of the APP entity (size, complexity, etc), the amount and sensitivity of the information, the potential adverse consequences for impacted individuals if there is a breach, the practical implications of implementing security measures and whether any proposed measure is privacy-invasive.
Reasonable steps include putting in place suitable information technology and access security arrangements and implementing steps and strategies regarding:
- governance, culture and training;
- binding contractual arrangements with service providers who access personal information;
- processes for dealing with data breaches; and
- policies for dealing with personal information that is no longer required to be held.
APP 11.2 imposes an obligation on APP entities to destroy or de-identify personal information when an APP entity no longer needs it, unless it is required to retain it by law or in other limited cases.
The Information Commissioner has taken regulatory action in numerous cases where APP entities have not complied with APP 11.1. For example, in 2019, the Commissioner accepted an enforceable undertaking from the Commonwealth Bank of Australia in relation to breaches of APP 11.1. Those breaches involved the loss of data tapes holding customer personal information and an absence of appropriate policies and procedures to restrict employees from accessing the personal information of customers when this was not required to perform their roles.
Direct marketing
Where APP 7 applies, it requires that consent be obtained to use or disclose personal information for any form of direct marketing unless an exemption applies (no exemption is available for sensitive information, which always requires consent). Direct marketing is not defined in the Privacy Act but is considered by the Commissioner to include marketing through any channel, including online advertising.[7] The most common exemption applies where personal information (excluding sensitive information) is collected directly from the individual in circumstances where the person would reasonably expect it to be used or disclosed for direct marketing and an easy-to-use opt-out mechanism is provided.
However, APP 7 does not apply to direct marketing that is regulated under the following legislation:
- the Interactive Gambling Act 2001 (Cth) (IGA);
- the Spam Act 2003 (Cth) (the Spam Act); and
- the Do Not Call Register Act 2006 (Cth) (the DNCR Act).
The IGA contains general prohibitions on advertising interactive online gambling to Australians. In addition, it creates an offence where licensed interactive wagering services are marketed to a registered individual (ie, a person who has voluntarily registered on the National Self-Exclusion Register). The Australian Communications and Media Authority (ACMA) operates the National Self-Exclusion Register, known as BetStop. Individuals can self-exclude from all licensed interactive wagering services for periods ranging from three months to an indefinite exclusion.
The Spam Act is the primary legislation regulating electronic direct marketing. It prohibits the sending of unsolicited commercial electronic messages, which include emails, instant messages, SMS and MMS. Messages are regulated if they have a commercial purpose, such as advertising goods or services, promoting a business or advertising business opportunities or investments. The Spam Act applies where there is an ‘Australian link’: that is, the message originates in Australia; the person sending (or authorising the sending of) the message is physically in Australia or, in the case of an organisation, has its central management and control in Australia; or the message is accessed in Australia.
The DNCR Act regulates unsolicited telemarketing calls and marketing faxes. Telemarketing calls are voice calls, including pre-recorded or synthetic voice calls, made to Australian numbers for purposes such as advertising goods or services, promoting a business or advertising business or investment opportunities. These purposes largely reflect the definition of a commercial message under the Spam Act. Similarly, a marketing fax is a fax sent to an Australian number for any of the same purposes.
In general, telemarketing calls and marketing faxes cannot be made or sent to numbers registered on the Do Not Call Register without consent. The Register is maintained by ACMA. An individual may register an Australian number on the Do Not Call Register if it is used for private or domestic purposes, is primarily used for faxes, is used by a government body or is an emergency service number. Registration is indefinite.
Both the DNCR Act and the Spam Act include a number of exemptions. For example, registered political parties and registered charities may make telemarketing calls and send commercial electronic messages, but must comply with other requirements of both Acts.
Automated processing, profiling and data analytics regulation
The Privacy Act does not directly regulate automated processing, profiling or data analytics. In this respect, it differs from the GDPR, which includes additional protections in these areas, such as rights not to be subject to automated processing decisions in some circumstances.
These areas are regulated only indirectly under the Privacy Act. For example, the use of personal information for some profiling purposes, such as to infer sensitive information, including a person’s health information, without their knowledge, is likely to breach APP 3.5. This principle requires the collection of personal information by lawful and fair means. Use of profiling for direct marketing is to a limited extent regulated by APP 7, which is considered in the ‘Direct marketing’ section above.
Regime applicable to data breaches
Part IIIC of the Privacy Act contains the notifiable data breach regime, which applies to eligible data breaches.
A data breach is unauthorised access or disclosure of personal information or loss of personal information where unauthorised access or disclosure is likely to occur; however, a data breach must satisfy additional criteria before it is considered to be an eligible data breach, namely:
- the data breach must be likely to result in serious harm to the relevant individuals; and
- the relevant APP entity has been unable to take steps to prevent that likely risk of serious harm.
Section 26WG of the Privacy Act sets out factors that need to be considered in determining whether serious harm is likely from a data breach, including the sensitivity of the information, the persons who have obtained (or may obtain) the information and the nature of the harm that may result. Financial harm would be serious harm, but this is a broader concept and could include physical, emotional or psychological harm.
If there are reasonable grounds to believe an eligible data breach has occurred, an APP entity must notify the Information Commissioner as soon as practicable (section 26WK). The notification must include the name and contact details of the entity (and of any other entities involved in the breach), a description of the breach, the type of information involved and recommended steps for protection from the consequences of the breach.
Individuals impacted by an eligible data breach must also be notified[8] and provided with the same information the Commissioner receives. If it is practicable to notify each individual whose information has been disclosed or each individual at risk of serious harm, the APP entity must take reasonable steps to do this. If an entity usually communicates with an individual using a particular communication method, that method may be used but this is not obligatory. Depending on the circumstances, email, phone, text, etc, may be appropriate. If it is not practicable to directly notify individuals, the entity must publish the notification statement on the entity’s website (if it has one) and take reasonable steps to ensure impacted individuals are aware of that statement.
The My Health Records Act has a separate data breach notification regime.
Rights of individuals under the Privacy Act
Under the Privacy Act, individuals may require access to, and correction of, their personal information held by an APP entity (APP 12), though, unlike the GDPR, the Privacy Act does not include a right to erasure.
An individual may not take direct action against an APP entity if that entity breaches the Privacy Act in relation to their personal information. Instead, the individual must first complain to the APP entity. In most cases, it is only if their complaint is not resolved satisfactorily by the APP entity that an individual may complain to the Commissioner (noting representative complaints may be made by an individual on their own behalf and on behalf of other similarly impacted individuals). The Commissioner will initially attempt to conciliate a complaint. If that is unsuccessful and the Commissioner investigates the complaint, this will occur with only limited involvement by the complainant or complainants.
Individuals have limited rights to take action to protect their privacy outside of the complaint mechanism in the Privacy Act. There is no general law right to privacy in Australia. There is limited judicial authority that protects privacy, although in 2019, an innovative privacy class action was settled in New South Wales. In that case, the complainants alleged that the unauthorised disclosure of their personal information amounted to, among other things, a breach of contract and misleading and deceptive conduct. Recent cases by the Australian Competition and Consumer Commission (ACCC) under the misleading and deceptive conduct provisions of the Australian Consumer Law, discussed in ‘The future of privacy regulation’ below, suggest that individuals may be able to successfully make privacy claims under those provisions.
Surveillance laws
In addition to legislation protecting personal information directly, both federal and state and territory legislation in Australia regulates the surveillance of individuals.
At a federal level, the Telecommunications (Interception and Access) Act 1979 (Cth) prohibits the interception of communications, including phone calls, emails and text messages, unless a specific exemption applies. Interception refers to listening to or recording a communication as it passes over a telecommunications system, without the knowledge of the person making the communication.
At a state level, in New South Wales, the Workplace Surveillance Act 2005 (NSW) (the WS Act) regulates camera, computer and tracking surveillance of employees (but not listening device surveillance). The term ‘employee’ has an expanded meaning, covering persons employed by a particular employer or any of its related corporations, and may also extend to volunteers and individuals engaged through labour hire companies. The WS Act applies when an employee is at the workplace of the employer (or its related entities) or when the employee is performing work, even if they are not present at such a workplace. In addition, the Surveillance Devices Act 2007 (NSW) regulates surveillance generally (ie, not limited to the surveillance of employees), including listening device surveillance.
Each state and territory has legislation that regulates surveillance activities; however, only New South Wales, the Australian Capital Territory and (to a limited extent) Victoria have surveillance legislation that specifically regulates surveillance by employers of employees.
New developments in privacy law
Outcomes of the Privacy Act review
In February 2023, the federal government released the report of the Attorney-General’s Department’s comprehensive review of the Privacy Act.[9] The report contains 116 proposals for reform. The proposals are aimed at strengthening the protection of personal information and the control that individuals have over it, as well as enhancing the powers of the Information Commissioner to enforce privacy obligations and to identify systemic privacy issues and address privacy breaches.
The key proposals arising from the Privacy Act Review include:
- removing the current exemption for small businesses from the application of the Privacy Act;
- retaining the existing exemption relating to employee records, but placing new obligations on private sector employers regarding the collection, use and protection of employee personal information;
- introducing a right to seek erasure of personal information and a right to de-index internet search results containing certain types of personal information, subject to limitations such as a countervailing public interest;
- extending existing obligations about the ‘offshoring’ of personal information to apply to de-identified datasets and introducing a new criminal offence of ‘malicious re-identification’ of de-identified information;
- introducing a statutory tort for serious invasions of privacy, as well as a statutory direct cause of action for individuals in the Federal Court and Federal Circuit and Family Court of Australia;
- introducing the concepts of ‘controller’ and ‘processor’, so that processors bear reduced obligations;
- creating new protections for those subject to automated decisions that affect their rights;
- requiring APP entities to set their own maximum and minimum retention periods for personal information; and
- tightening notification obligations under the notifiable data breach scheme.
The report proposes the introduction of new low and mid-tier civil penalty provisions for specific breaches of the Privacy Act and interferences with privacy, which may be enforced through infringement notices.
Further proposals include:
- requiring that entities act fairly and reasonably when collecting, using and disclosing personal information;
- amending the definition of consent to clarify that consent must be voluntary, informed, current, specific and unambiguous;
- introducing a requirement to conduct privacy impact assessments for any high privacy risk activity;
- prohibiting the use of children’s personal information to target advertising and content to them; and
- improving protections for children and vulnerable persons.
At the time of writing, legislation implementing the proposals is under development.
Consumer protection laws to protect privacy
Another notable trend in Australia is the increasing use of consumer protection laws to protect privacy. Australia’s federal competition and consumer protection regulator, the ACCC, has commenced enforcement action under the misleading and deceptive conduct prohibitions of the Australian Consumer Law in a number of cases that might be thought more naturally dealt with under the Privacy Act, including the following:
- In early 2021, the Federal Court found that Google had misled consumers about personal location data collected through Android mobile devices in a world-first enforcement action taken by the ACCC.[10] In August 2022, the Federal Court ordered a A$60 million fine in relation to the conduct.[11]
- In mid-2020, the ACCC commenced proceedings against Google relating to steps Google took in 2016 to combine data collected through its advertising technology services (previously known as DoubleClick) with personal information from consumers’ Google accounts. In December 2022, the Federal Court dismissed the ACCC’s case, finding that Google had not misled consumers because the steps were taken with the informed consent of account holders.[12]
- In December 2020, the ACCC commenced proceedings against Facebook (now Meta) for alleged misleading and deceptive conduct in relation to the advertising of Facebook’s Onavo Protect mobile app to Australian consumers. The ACCC claimed that, contrary to the representations that Facebook made to consumers, Facebook collected and used the personal information of consumers who used that app for its own purposes. In July 2023, the Federal Court ordered two subsidiaries of Meta (Facebook Israel and Onavo Inc) to each pay A$10 million for engaging in conduct liable to mislead in breach of the Australian Consumer Law.[13]
These cases are of particular interest because they not only demonstrate the overlap between privacy and consumer protection regulation but also suggest that individuals, including through class actions, may be able to take direct action to protect their personal information under consumer protection laws, even though direct action is not currently available under the Privacy Act.
Given the increased focus of Australians on the protection of their privacy, it is likely that Australia will see even greater levels of privacy regulation and enforcement action in future.
Notes
[1] Facebook Inc v Australian Information Commissioner [2022] FCAFC 9.
[2] Facebook Inc v Australian Information Commissioner [2023] HCA Trans 22 (7 March 2023).
[3] Commissioner initiated investigation into Clearview AI, Inc. (Privacy) [2021] AICmr 54 (14 October 2021).
[4] Clearview AI Inc and Australian Information Commissioner [2023] AATA 1069.
[5] This example is used in paragraph B.92 of the Commissioner’s Australian Privacy Principles Guidelines (the APP Guidelines).
[6] APP Guidelines, paragraphs B.98 to B.103.
[7] APP Guidelines, paragraphs 7.9 to 7.12.
[8] Privacy Act, section 26WL.
[9] Attorney-General’s Department, ‘Privacy Act Review Report’, 16 February 2023.
[10] Australian Competition and Consumer Commission v Google LLC (No. 2) [2021] FCA 367.
[11] Australian Competition and Consumer Commission v Google LLC (No. 4) [2022] FCA 942.
[12] Australian Competition and Consumer Commission v Google LLC (No. 2) [2022] FCA 1476.
[13] Australian Competition and Consumer Commission, press release, ‘$20m penalty for Meta companies for conduct liable to mislead consumers about use of their data’, 26 July 2023.