Australia: Privacy

Australia’s privacy laws

In Australia, privacy is largely regulated at a federal (Commonwealth) level. The key legislation is the Privacy Act 1988 (Cth) (the Privacy Act), which applies to most Commonwealth public sector entities as well as private sector entities (including not-for-profits) with annual turnover of more than A$3 million or meeting other specified criteria, for example, businesses providing health services or that are credit reporting bodies. Regulated entities are referred to as APP entities. The Privacy Act contains 13 Australian Privacy Principles (APPs), a mandatory notifiable data breach scheme, a regime for credit reporting and requirements applying to the COVIDSafe app, the Australian government’s COVID-19 contact tracing app.

In addition to the Privacy Act, a number of other federal statutes regulate privacy in specific contexts:

  • The Data Availability and Transparency Act 2022 (DATA), which establishes a regime for the sharing of data by Commonwealth public sector entities, contains privacy protections that are specific to data sharing under that scheme. The National Data Commissioner, established under DATA, is responsible for all enforcement under that Act.
  • There is an additional privacy regime applicable to data that is shared under the Consumer Data Right (CDR) regime, contained in Part IVD of the Competition and Consumer Act 2010 (Cth) (CCA). The CDR gives individuals the right to share their data with different service providers in designated sectors, including banking.
  • The My Health Records Act 2012 (Cth) (the My Health Records Act) creates a privacy regime for the Australian government’s digital health records scheme, My Health Record.
  • The Telecommunications Act 1997 (Cth) (the Telecommunications Act) includes additional protections for certain personal information related to telecommunications services.

Other legislation applies in limited cases, such as the Data-matching Program (Assistance and Tax) Act 1990 (Cth) (the Data-matching Act), which applies to certain data matching by government agencies.

All states and territories (other than Western Australia and South Australia) have privacy legislation that applies to the handling of personal information by the relevant state or territory public sector and, in certain cases, private sector health service providers.

Australia’s regime is not modelled on the EU’s General Data Protection Regulation (GDPR) and differs from it in several key respects. For example, the Privacy Act does not distinguish between data controllers and data processors but instead regulates all entities that collect and hold personal information in the same manner. The Privacy Act also provides more limited rights for individuals – among other limitations, it does not include a right to erasure.

International privacy commitments

Australia is:

  • a signatory to the Asia-Pacific Economic Cooperation (APEC) Cross-Border Privacy Rules, which is a government-backed data privacy certification scheme that companies may join to demonstrate compliance with internationally recognised data privacy protections;
  • a participant in the Global Privacy Assembly’s Global Cross Border Enforcement Cooperation Arrangement (GCBECA), with other participants including Canada, Germany and the UK. This provides a framework for privacy regulators to work together on cross-border enforcement of privacy laws;
  • a participant in the APEC Cross-Border Privacy Enforcement Arrangement, which, like the GCBECA, provides a framework for the cross-border enforcement of privacy laws; and
  • party to an Agreement with the EU on the Processing and Transfer of Passenger Name Record (PNR) Data by Air Carriers to the Australian Customs and Border Protection Service. This agreement authorises the transfer of PNR data to the Australian Department of Home Affairs from airlines that process PNR data in the EU and allows the Department to provide that PNR data to other Australian and foreign government agencies provided safeguards in the agreement are complied with.

Australia has also entered into non-binding memoranda of understanding with other jurisdictions in relation to data governance and cross-border information flow.

Regulator and enforcement powers

The Australian Information Commissioner and Privacy Commissioner (the Commissioner), appointed under the Australian Information Commissioner Act 2010 (Cth) (the AIC Act), is responsible for the enforcement of the Privacy Act, the My Health Records Act and the CDR privacy regime under the CCA. The Commissioner has regulatory responsibilities under the Crimes Act 1914 (Cth), the Data-matching Act, the National Health Act 1953 (Cth) and the Telecommunications Act. The Commissioner is supported by the Office of the Australian Information Commissioner (OAIC), which is also established under the AIC Act.

The Privacy Act regime encourages individuals who believe an APP entity has interfered with their privacy to complain directly to that entity. In most cases, the Commissioner will not begin investigating a complaint (which an individual may make under section 36 of the Privacy Act) unless it has first been made to the relevant APP entity. Where the Commissioner considers a complaint could be successfully conciliated, the Privacy Act requires the Commissioner first to make a reasonable attempt to conciliate it (section 40A).

If a complaint moves past the conciliation phase, the Commissioner (supported by the OAIC) must investigate it, subject to limited exceptions. The Commissioner may also investigate possible breaches of the Privacy Act on their own initiative, through a Commissioner-initiated investigation.

If the Commissioner determines, following an investigation, that a breach has occurred, they may make certain declarations, including declarations that the entity in breach take steps to ensure the breach is not repeated or continued and that it pay compensation to affected individuals. The Commissioner may also:

  • accept court-enforceable undertakings requiring compliance with the Privacy Act;
  • seek injunctions to prevent ongoing or potential breaches of the Privacy Act; and
  • seek civil penalties for serious or repeated interferences with the privacy of individuals and specified breaches of the credit-reporting provisions of the Privacy Act.

Enforcement proceedings must be taken in Australia’s Federal Court or Federal Circuit Court.

The civil penalties that the Commissioner may seek under the Privacy Act are quite low, with the maximum penalty being A$2.1 million. Government announcements have stated since 2019 that the penalties will increase to match those in the Australian Consumer Law, that is, to be the greater of A$10 million and three times the value of any benefit obtained from the breach or, if this cannot be determined, 10 per cent of the breaching entity’s annual Australian turnover. However, no legislation has yet been passed to give effect to such an increase.
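The proposed uplift can be expressed as simple arithmetic. The following Python sketch is purely illustrative (the function name and inputs are assumptions, and the formula remains unlegislated as at the time of writing):

```python
from typing import Optional

# Illustrative only: the penalty formula announced in 2019 but not yet
# legislated - the greater of A$10 million and three times the benefit
# obtained from the breach or, where that benefit cannot be determined,
# 10 per cent of annual Australian turnover.

def proposed_max_penalty(benefit: Optional[int], annual_turnover: int) -> int:
    """Return the proposed maximum civil penalty in whole A$."""
    floor = 10_000_000  # proposed A$10 million baseline
    if benefit is not None:
        return max(floor, 3 * benefit)           # benefit can be determined
    return max(floor, annual_turnover // 10)     # 10% of Australian turnover

print(proposed_max_penalty(5_000_000, 0))        # 15000000 (3 x A$5m benefit)
print(proposed_max_penalty(None, 200_000_000))   # 20000000 (10% of A$200m turnover)
```

On these assumed figures, a breach yielding an identifiable A$5 million benefit would attract a cap of A$15 million, while an entity with A$200 million Australian turnover and no determinable benefit would face a cap of A$20 million.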

The Commissioner has similar enforcement powers under the My Health Records Act and Part IVD of the CCA.

Extra-territorial operation of the Privacy Act

The Privacy Act, and codes registered under it, have extraterritorial application (section 5B), as the following are regulated:

  • acts or practices of federal government agencies, wherever performed; and
  • acts or practices of organisations, where an ‘Australian link’ exists.

An Australian link will exist where an organisation (or individual) is an Australian citizen or permanent resident, a partnership or trust established in Australia, a body corporate incorporated in Australia or an unincorporated entity with central management and control in Australia. If this requirement is not satisfied, then an act or practice of an organisation done or engaged in outside Australia will have an Australian link if both:

  • the organisation carries on business in Australia; and
  • the relevant personal information was collected or held by the organisation in Australia, either before or at the time of the act or practice.

The scope of an Australian link has been considered in two recent matters. First, the Commissioner has initiated Federal Court proceedings against Facebook Inc and Facebook Ireland for breach of the Privacy Act in relation to the Cambridge Analytica scandal. Facebook Inc challenged these proceedings on the basis that it did not have the necessary Australian link because it was not carrying on business in Australia at the time of the alleged breach. This argument was dismissed by the Full Court of the Federal Court,[1] largely on the basis that there was a prima facie case that Facebook Inc was carrying on a business in Australia because it installed cookies on the devices of Australian users and offered its Graph API in Australia. This finding was only preliminary, and a different conclusion may be reached when the proceedings are heard in full. The High Court will hear an appeal from the Federal Court decision.

The second matter involved a determination made by the Commissioner following a Commissioner-initiated investigation of Clearview AI Inc (Clearview),[2] the provider of controversial facial recognition software. Clearview, like Facebook Inc in the Cambridge Analytica case, disputed jurisdiction on the basis that it was not carrying on business here. However, in her determination, the Commissioner found that Clearview was carrying on business in Australia because Clearview provided trials and demonstrations to several Australian police agencies (and undertook supporting activities, such as sending marketing emails to those agencies and collecting images from those agencies for inclusion in its database) and scraped facial images from the internet as an integral part of its business, without regard to source. Clearview is (as at August 2022) challenging the Commissioner’s determination in Australia’s Administrative Appeals Tribunal.

What is personal information under the Privacy Act?

The Privacy Act regulates the collection, use, transfer, storage and destruction of personal information. ‘Personal information’ is defined in the Privacy Act to mean information or an opinion about an identified individual, or an individual who is reasonably identifiable, whether or not the information is true or recorded in a material form.

Determining whether an individual is reasonably identifiable needs to be considered on a case-by-case basis, and this may depend on whether the holder of the information holds other information that may allow the holder to identify the relevant individual. An example the Commissioner uses to demonstrate this is a car licence plate number. Third parties would typically not be able to use such a number to identify the owner of the vehicle, and therefore generally that number would not be personal information. But a third party with access to the relevant vehicle registration database could do so, and therefore the licence plate number would be personal information in the hands of that specific third party.[3]

Extra protections are provided under the Privacy Act in relation to sensitive information, which is a subset of personal information and includes:

  • information or an opinion about a person’s racial or ethnic origin, political opinions, religious beliefs, trade union or other professional or trade association membership, sexual orientation or practices, or criminal record, provided this is also personal information;
  • health and genetic information about a person; and
  • biometric information used for verification or identification and biometric templates.

The Australian government is (as at August 2022) part way through a comprehensive review of the Privacy Act. As part of that review, consideration is being given to amending the definition of personal information to more clearly include technical information such as online identifiers and IP addresses, given the increased focus on personal information collection and use practices in the digital economy.

Key Australian Privacy Principles

The Privacy Act includes 13 APPs. Key APPs are:

  • APP 1: this imposes an overall obligation on APP entities to manage personal information in an open and transparent way, and includes a requirement for each APP entity to have a freely accessible privacy policy;
  • APPs 3 and 6: these two APPs, in combination, impose obligations on APP entities regarding the collection, use and disclosure of personal information;
  • APP 8: APP 8 regulates offshore disclosures; and
  • APP 11: this APP imposes obligations on APP entities regarding both security measures to protect personal information and obligations regarding destruction or de-identification, once personal information is no longer required.

APP 1.2 requires reasonable steps to be taken by APP entities to implement practices to comply with the APPs and any applicable APP codes. APP 1.2 is seen by the Commissioner as a foundational privacy requirement for APP entities, essentially requiring entities to take a privacy by design approach in their operations. It requires an APP entity to have in place practices, procedures and systems to identify and manage privacy risks through all stages of its management of personal information, from collection to final destruction or de-identification. To comply with APP 1.2, APP entities need to consider and, where appropriate, undertake privacy impact assessments for new projects that include the processing of personal information.

It is not a mandatory requirement of APP 1.2(a) to appoint a data protection or privacy officer, but the OAIC Guidelines suggest this is an appropriate governance mechanism. The Privacy (Australian Government Agencies – Governance) APP Code 2017 (Cth) (Code), applicable to Commonwealth public agencies, requires those agencies to have at least one privacy officer. The Code also requires that such public agencies have a ‘privacy champion’. A privacy champion is a senior official of the public agency who promotes a privacy culture in the relevant agency and provides leadership on strategic privacy issues.

Each APP entity is required to maintain a privacy policy that sets out, among other things, the types of personal information it collects and holds and the purposes for which it collects, holds, uses and discloses that information (APPs 1.3 and 1.4). That privacy policy must be made available free of charge and in an appropriate form, which is typically satisfied by making the policy available on the APP entity’s website (APP 1.5).

APP 1 is supported by APP 5, which requires an APP entity, either before or at the time personal information is collected (or as soon as practicable thereafter), to take reasonable steps to notify the relevant individual of (or to otherwise ensure they are aware of) the details of the collecting APP entity and, among other things, the purposes of the collection and the persons to whom the information would usually be disclosed, including whether cross-border disclosure is likely.

An agency (that is, a regulated Commonwealth public sector entity) may collect personal information that is not sensitive information if the information is reasonably necessary for, or directly related to, its functions or activities (APP 3.1). Personal information that is not sensitive information may be collected by private sector APP entities if it is reasonably necessary for the entity’s functions or activities (APP 3.2).

In the case of all APP entities, consent must be obtained from the relevant individual to collect sensitive information (APP 3.3) other than in limited cases, such as if required by law.

In addition:

  • personal information is generally required to be collected directly from the individual (APP 3.6). This ensures that an individual has a choice as to whether or not to provide that information; and
  • collection must be by lawful and fair means (APP 3.5). For example, an APP entity cannot seek to collect personal information by deception (such as pretending collection is for a police agency) or in a way that disrespects a person’s culture.

Personal information may only be used or disclosed for the purposes for which it was collected (the ‘primary purpose’) or for secondary purposes agreed by the relevant individual or otherwise listed in APP 6. An APP entity will disclose its primary purpose in its privacy policy or otherwise at the time of collection. The Commissioner’s APP Guidelines[4] provide that, in describing to individuals the purposes for which personal information is collected, and therefore the primary purposes for which it may be used or disclosed, APP entities must not frame these purposes too broadly, such as ‘for carrying on [APP entity’s] business’. APPs 6.2 and 6.3 set out permitted ‘secondary purposes’. For example, if, notwithstanding that the individual was not informed of a purpose, that individual would reasonably expect the information to be used or disclosed for a particular purpose that is related (or, for sensitive information, directly related) to the primary purpose, this will be a permitted secondary purpose.

Generally, there is no restriction on an APP entity transferring personal information outside Australia. However, there is increasing community concern about offshore transfers, particularly of personal information collected by the government. This is reflected in the mid-2020 Privacy Act amendments regulating the collection, use and disclosure of information collected via the government’s COVIDSafe app. That information is deemed to be personal information and any disclosure of that information outside Australia is an offence.

APP 8.1 provides that, unless an exemption applies, if an APP entity discloses personal information to a person outside Australia who is not bound by the APPs, that APP entity must take reasonable steps to ensure the offshore recipient does not breach the APPs (other than APP 1) in relation to that information. The transferring APP entity will also be liable for any breach by that offshore recipient.

The reasonable steps required to ensure that an offshore recipient does not breach the APPs are typically to impose contractual obligations on that entity and take active steps to monitor and enforce compliance with those obligations.

The most relevant exemptions to APP 8.1 are:

  • the APP entity reasonably believes the offshore recipient is subject to laws or a binding scheme similar to the APPs and the relevant individuals are entitled to seek recourse under those laws or that scheme (APP 8.2(a)); or
  • the relevant individuals expressly agree that the APP entity does not need to take steps to ensure compliance with the APPs by the offshore recipient (APP 8.2(b)).

In a practical sense, this means APP entities have lesser obligations where the offshore recipient is subject to the laws or binding rules of a jurisdiction providing safeguards at least equivalent to the Privacy Act.

Under APP 11.1, each APP entity is required to take reasonable steps to protect the personal information that it holds from misuse, interference and loss, as well as unauthorised access, modification or disclosure. The OAIC’s view is that the reasonable steps required to comply with APP 11.1 will depend on the relevant circumstances. These circumstances include the nature of the APP entity (size, complexity, etc), the amount and sensitivity of the information, the potential adverse consequences for impacted individuals if there is a breach, the practical implications of implementing security measures and whether any proposed measure is privacy-invasive.

Reasonable steps include not only putting in place suitable information technology and access security arrangements, but also implementing steps and strategies regarding:

  • governance, culture and training;
  • binding contractual arrangements with service providers who access personal information;
  • processes for dealing with data breaches; and
  • policies for dealing with personal information that is no longer required to be held.

APP 11.2 imposes an obligation on APP entities to destroy or de-identify personal information when it is no longer needed, unless the entity is required by law to retain it or another limited exception applies.

The Information Commissioner has taken regulatory action in numerous cases where APP entities have not complied with APP 11.1. For example, in 2019, she accepted an enforceable undertaking from the Commonwealth Bank of Australia in relation to breaches of APP 11.1. Those breaches involved the loss of data tapes holding customer personal information and an absence of appropriate policies and procedures to restrict employees from accessing personal information of customers when this was not required to perform their roles.

Direct marketing

Where APP 7 applies, it requires that consent is obtained to use or disclose personal information for any form of direct marketing, unless (other than for sensitive information) an exemption applies. Direct marketing is not defined in the Privacy Act but is considered by the Commissioner to include marketing by any channel, including online advertising.[5] The most common exemption is where personal information (excluding sensitive information) is directly collected in circumstances where the person would expect it to be used or disclosed for direct marketing and an easy-to-use opt-out mechanism is provided.

However, APP 7 does not apply to direct marketing that is regulated under any of the following:

  • the Interactive Gambling Act 2001 (Cth) (IGA);
  • the Spam Act 2003 (Cth) (the Spam Act); and
  • the Do Not Call Register Act 2006 (Cth) (the DNCR Act).

The IGA contains general prohibitions on advertising interactive online gambling to Australians. In addition, the IGA creates an offence where licensed interactive wagering services are marketed to any registered individual (being a person who has voluntarily registered themselves on a National Self-exclusion Register – though that register had not yet been established as at August 2022).

The Spam Act is the primary legislation that regulates electronic direct marketing. It prohibits the sending of unsolicited commercial electronic messages, such as emails, instant messaging, SMS and MMS. Messages are regulated if they are commercial in nature, such as advertising goods or services, promoting a business or advertising business opportunities or investments. The Spam Act applies where there is an ‘Australian link’, that is, the messages must originate in Australia, the person sending (or authorising the sending of) the messages must be physically in Australia (if that person is an individual) or have its central management or control in Australia (in the case of an organisation) or the messages must be accessible in Australia.

The DNCR Act regulates unsolicited telemarketing calls and marketing faxes. Telemarketing calls are voice calls (including pre-recorded or synthetic voice calls) to Australian numbers for purposes such as advertising goods or services, promoting a business or advertising business or investment opportunities (these purposes largely reflect the definition of a commercial message under the Spam Act). A marketing fax is a fax sent to an Australian number for any of the same purposes. Telemarketing calls and marketing faxes generally cannot be sent to numbers registered on the Do Not Call Register without consent. The Do Not Call Register is maintained by the Australian Communications and Media Authority. Any person may enter an Australian number on the register if it is used for private or domestic purposes, it is primarily used for faxes, it is used by a government body or it is an emergency service number. Registration is indefinite.

Both the DNCR Act and the Spam Act include a number of exemptions. For example, registered political parties and registered charities may make telemarketing calls and send commercial electronic messages, but must comply with other requirements of both Acts.

Regulation of automated processing, profiling and data analytics

The Privacy Act does not directly regulate automated processing, profiling or data analytics. In this respect, it differs from the GDPR, which includes additional protections in these areas, such as rights not to be subject to automated processing decisions in some circumstances.

These areas are regulated only indirectly under the Privacy Act. For example, use of personal information for some profiling purposes such as to infer sensitive information, including a person’s health information, without their knowledge is likely to breach APP 3.5, which requires collection to be by lawful and fair means. Use of profiling for direct marketing is to a limited extent regulated by APP 7, which is considered in the ‘Direct marketing’ section above.

The current Australian Government Privacy Act review is considering whether express restrictions should be included in the Privacy Act dealing with these uses of personal information. For example, the Discussion Paper for the review released by the Attorney-General’s Department in late 2021[6] asked stakeholders to comment on, among other issues:

  • whether additional limitations should be imposed on practices considered to be high-risk, such as the collection, use or disclosure of particular types of personal information on a large scale or the use of personal information for the purposes of automated decision-making with legal or significant effects. One option for high-risk practices would be the imposition of additional organisational accountability measures; and
  • whether an absolute prohibition should be imposed on profiling or automated processing of personal information for certain purposes under a ‘no-go zones’ regime similar to Canadian regulation as in place at the time the Discussion Paper was issued.

Regime applicable to data breaches

Part IIIC of the Privacy Act contains the notifiable data breach regime, which applies to eligible data breaches.

A data breach is unauthorised access to, or disclosure of, personal information, or loss of personal information in circumstances where unauthorised access or disclosure is likely to occur. However, a data breach must satisfy additional criteria before it is considered to be an eligible data breach, namely:

  • the data breach must be likely to result in serious harm to the relevant individuals; and
  • the relevant APP entity has been unable to take steps to prevent that likely risk of serious harm.

Section 26WG of the Privacy Act sets out factors that need to be considered in determining whether serious harm is likely to result from a data breach, including the sensitivity of the information, the persons who have obtained (or may obtain) the information and the nature of the harm that may result. Serious harm is a broad concept: financial harm clearly qualifies, but it could also include physical, emotional or psychological harm.
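For compliance teams that triage incidents programmatically, the statutory test can be sketched as a simple decision function. This is an illustrative simplification only – the field and function names are assumptions, and the real assessment of ‘likely serious harm’ turns on a weighing of the section 26WG factors, not a boolean flag:

```python
from dataclasses import dataclass

@dataclass
class Incident:
    # Unauthorised access to or disclosure of personal information, or loss
    # where such access or disclosure is likely to occur.
    unauthorised_access_or_disclosure: bool
    # Likely to result in serious harm, assessed against the s 26WG factors
    # (sensitivity, who obtained the information, nature of possible harm).
    likely_serious_harm: bool
    # Remedial action taken that prevents the likely risk of serious harm.
    harm_prevented_by_remedial_action: bool

def is_eligible_data_breach(incident: Incident) -> bool:
    """True where the incident would trigger notification under ss 26WK/26WL."""
    return (incident.unauthorised_access_or_disclosure
            and incident.likely_serious_harm
            and not incident.harm_prevented_by_remedial_action)

# A lost laptop remotely wiped before any access is not an eligible breach.
print(is_eligible_data_breach(Incident(True, True, True)))   # False
print(is_eligible_data_breach(Incident(True, True, False)))  # True
```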

If there are reasonable grounds to believe an eligible data breach has occurred, an APP entity must notify the Information Commissioner as soon as practicable (section 26WK). The notification must include the name and contact details of the entity (and of any other entities involved in the breach), a description of the breach, the type of information involved and recommended steps for protection from the consequences of the breach.

Individuals impacted by an eligible data breach must also be notified (section 26WL of the Privacy Act) and provided with the same information the Commissioner receives. If it is practicable to notify each individual whose information has been disclosed or each individual at risk of serious harm, the APP entity must take reasonable steps to do this. If an entity usually communicates with an individual using a particular communication method, that method may be used but this is not obligatory. Email, phone or text message may be appropriate depending on the circumstances. If it is not practicable to directly notify individuals, the entity must publish the notification statement on the entity’s website (if it has one) and take reasonable steps to ensure impacted individuals are aware of that statement.

The My Health Records Act has a separate data breach notification regime.

Rights of individuals under the Privacy Act

Under the Privacy Act, individuals may request access to, and correction of, their personal information held by an APP entity (APPs 12 and 13), though, unlike the GDPR, the Privacy Act does not include a right to erasure.

An individual may not take direct action against an APP entity in the event that entity breaches the Privacy Act in relation to their personal information. Instead, the individual must first complain to the APP entity. In most cases, it is only if their complaint is not resolved satisfactorily by the APP entity that an individual may complain to the Commissioner (noting representative complaints may be made by an individual on their own behalf and on behalf of other similarly impacted individuals). The Commissioner will initially attempt to conciliate a complaint. If that is not successful and the Commissioner investigates the complaint, this will occur with only limited involvement by the complainant or complainants.

Individuals have limited rights to take action to protect their privacy outside of the complaint mechanism in the Privacy Act. There is no general law right to privacy in Australia. There is limited judicial authority that protects privacy, though in 2019 an innovative privacy class action was settled in New South Wales. In that case, the complainants alleged that the unauthorised disclosure of their personal information amounted to, among other things, a breach of contract and misleading and deceptive conduct. Recent cases by the Australian Competition and Consumer Commission (ACCC) under the misleading and deceptive conduct provisions of the Australian Consumer Law, discussed in ‘The future of privacy regulation’ below, suggest that privacy claims may be able to be successfully made by individuals under those provisions.

The current Privacy Act review is canvassing the question of whether a direct right of action should be given to individuals under the Privacy Act to seek compensation for interference with their privacy. If such a right were introduced, it would also allow class actions. The Privacy Act review is also, more controversially, considering the introduction of a statutory tort for serious invasions of privacy that would entitle individuals to take action outside the Privacy Act.

Surveillance laws

In addition to legislation protecting personal information directly, both federal and state and territory legislation in Australia regulates surveillance of individuals.

At a federal level, the Telecommunications (Interception and Access) Act 1979 (Cth) prohibits the interception of communications, including phone calls, emails and text messages, unless a specific exemption applies. Interception of a communication refers to listening to, or recording, a communication passing over a telecommunications system as it is passing over that system without the knowledge of the person making the communication.

At a state level, in New South Wales, the Workplace Surveillance Act 2005 (NSW) (the WS Act) regulates camera, computer and tracking surveillance of employees (but not listening device surveillance). ‘Employee’ has an expanded meaning, including a person employed by a particular employer or any related corporations, and may also extend to volunteers and persons engaged through labour hire companies. The WS Act applies when an employee is at the workplace of the employer (or its related entities) or when the employee is actually performing work, even if not at such a workplace. In addition, the Surveillance Devices Act 2007 (NSW) regulates surveillance generally (that is, not limited to the surveillance of employees), including listening device surveillance.

Each other state and territory has legislation that regulates surveillance activities. However, only New South Wales, the Australian Capital Territory and (to a limited extent) Victoria have surveillance legislation that specifically regulates surveillance by employers of employees.

The future of privacy regulation

As at August 2022, the Australian government is part way through a comprehensive review of the Privacy Act. Following the recent Australian election, which led to a change of government, the new Attorney-General committed the government to pursuing this reform, and it is likely that draft legislation will be released for consultation before the end of 2022. The scope of the reforms is not yet known; however, given the breadth of topics covered in the consultation process, it is likely the Privacy Act will be subject to a substantial overhaul. This will not result in Australia moving to a GDPR model, as neither the current nor the previous Australian government has shown support for that approach.

High-profile court cases brought by the Commissioner are few. The Commissioner is currently taking action against Facebook for civil penalties arising from the Cambridge Analytica scandal, but these are the first proceedings of this type to be commenced by the Commissioner. This is because the Commissioner (and her predecessors) have typically used determination powers to resolve privacy breaches. Examples of where the Commissioner has used those powers include:

  • a determination in early 2021 involving the Department of Home Affairs and a data breach that involved the public disclosure of personal information of over 1,000 asylum seekers. In that matter, the Department was required to compensate impacted individuals;
  • a determination made in mid-2021 regarding Uber’s 2016 data breach, which impacted riders and drivers globally but was not disclosed by Uber for 12 months after it occurred. Uber was found to have breached the APPs and required to take remedial action; and
  • a determination in early 2022 in relation to Clearview’s facial recognition technology. Clearview was found to have breached the APPs and required to not only cease to scrape images of individuals in Australia from the internet but also to destroy all such images (and the vectors created from them) that had been collected by Clearview.

Another interesting trend in Australia is the increasing use of consumer protection laws to protect privacy. Australia’s federal competition and consumer protection regulator, the ACCC, has commenced enforcement action under the misleading and deceptive conduct prohibitions of the Australian Consumer Law in a number of cases that could arguably have been dealt with more appropriately under the Privacy Act, including:

  • in early 2021, the Federal Court found that Google had misled consumers about personal location data collected through Android mobile devices, in a world-first enforcement action taken by the ACCC;[7]
  • in mid-2020, the ACCC commenced proceedings against Google relating to steps Google took in 2016 to combine data collected through its advertising technology services (previously known as DoubleClick) with personal information from consumers’ Google accounts. That case has been heard by the Federal Court, though a decision has not been handed down; and
  • in December 2020, the ACCC commenced proceedings against Facebook (now Meta) for alleged misleading and deceptive conduct in relation to the advertising of Facebook’s Onavo Protect mobile app to Australian consumers. The ACCC has claimed in that case (which has not yet been heard) that, contrary to the representations that Facebook made to consumers, Facebook collected and used for its own purposes the personal information of consumers who used that app.

These cases are of particular interest not only because they demonstrate the overlap of privacy and consumer protection regulation, but also because they show that individuals (including through class actions) may be able to take direct steps to protect their personal information through the use of consumer protection laws, even though direct action is not currently possible under the Privacy Act.

Given the increased focus of Australians on the protection of their privacy, it is likely that Australia will see even greater levels of privacy regulation (and enforcement action) in future.


Notes

[1] Facebook Inc v Australian Information Commissioner [2022] FCAFC 9.

[2] Commissioner initiated investigation into Clearview AI, Inc. (Privacy) [2021] AICmr 54 (14 October 2021).

[3] This example is used in the Commissioner’s APP Guidelines (paragraph B.92), which are available here: https://www.oaic.gov.au/__data/assets/pdf_file/0009/1125/app-guidelines-july-2019.pdf

[7] Australian Competition and Consumer Commission v Google LLC (No. 2) [2021] FCA 367.