The Paper Trail: Data Protection Impact Assessments and Documentation
This is an Insight article, written by a selected partner as part of GDR's co-published content.
During the past few years, several data protection laws have been enacted throughout the world. The European Union’s General Data Protection Regulation (GDPR) has been viewed as one of the most comprehensive data protection laws in the world and is deemed by many to be the gold standard for laws regulating the processing of personal data. It is no surprise, therefore, that the drafting of data protection laws in many countries has been inspired by the GDPR.
One similarity between the GDPR and the data protection laws in many countries is the idea of accountability, which reflects the obligation of the data controller to be responsible for, and to be able to demonstrate, compliance with the law. In other words, simply complying with the law is not enough: data controllers must be able to effectively show that they are complying with the law.
Documentation is fundamental to that end. In some situations, it is itself one of the main obligations of data controllers: keeping records of processing activities, structuring privacy notices that observe the principle of transparency by providing data subjects with proper information about the processing activities, drafting incident response plans to handle data breaches, and the like.
This chapter focuses on one of the most relevant documents for this paper trail: data protection impact assessments (DPIAs). A DPIA is a process that shows that the data controller, having noticed that a project involving the processing of personal data may result in risks to the fundamental rights and freedoms of data subjects, has conducted a prior analysis of the intended processing activities, identifying the risks arising from the processing and mapping measures that could be implemented to reduce or eliminate those risks.
We begin with a brief overview of how different countries have used DPIAs (or privacy impact assessments (PIAs)) over the years to understand the evolution of this process and the distinct approaches by different laws.
As Brazil is one of the countries that has followed the GDPR stance of making accountability a key element of compliance with its data protection law, in the final section we look at how Brazil’s experience can provide a model for accountability and what it may hold for the future.
The European Union

The European Union is widely regarded as the birthplace of modern privacy as a recognised social value, elevated to the status of a fundamental human right, and of structured practices for the protection of personal data. Nonetheless, even at the European level, the history of adopting DPIAs is fairly recent.
Regulation (EU) 2016/679 (the GDPR), which was approved on 14 April 2016 and became fully effective on 25 May 2018, was the first law in the European Union to impose a mandatory requirement on data controllers to conduct DPIAs in certain cases, specifically where the processing is likely to result in a high risk to the rights and freedoms of natural persons. However, the fact that a statutory requirement to conduct a DPIA appeared only recently does not mean that PIAs had not received attention from EU Member States before 2016.
The United Kingdom (then a Member State) was the first to emphasise the importance of conducting PIAs, when its Information Commissioner’s Office (ICO) published a manual on PIAs in December 2007, later revised in June 2009.
The manual published by the ICO described a PIA as a process that helps organisations visualise the risks arising from the collection, use or disclosure of information about individuals, anticipate problems arising from these practices and identify possible solutions. Responsibility for conducting a PIA would lie at the senior executive level of the organisation, especially with someone linked to the audit or risk analysis functions. Among the reasons listed by the ICO for carrying out a PIA were preventing inappropriate solutions and minimising the costs of a project with a high probability of privacy risks, which could cause a loss of trust and damage to the organisation’s reputation with its stakeholders.
The ICO listed 11 screening questions to aid in determining whether a full-scale PIA, with an exhaustive analysis of the privacy risks posed by the project, should be undertaken. Depending on the responses to these questions, conducting a PIA could be highly recommended as an appropriate measure to prevent greater risks to data subjects.
In November 2010, the European Commission published a communication entitled ‘A comprehensive approach on personal data protection in the European Union’, in which it outlined the challenges faced 15 years after the publication of Directive 95/46/EC (the Data Protection Directive) and noted the profound changes around the world resulting from globalisation and the accelerated development of new technologies, especially those that collect data in ways that are not easily perceptible to individuals.
One of the ways identified by the European Commission to ensure that data controllers adopted more adequate policies to respect the privacy of individuals and more effective data protection mechanisms was precisely the analysis of the possible inclusion of an obligation to carry out DPIAs in certain cases, such as when sensitive personal data is processed. This mandatory requirement, however, would not actually be implemented until much later, with the entry into force of the GDPR.
In December 2010, Ireland was the second EU Member State to publish a guide on carrying out PIAs, specifically for the processing of sensitive data about aspects of the health of patients in hospitals across the country. Among other things, this guide analysed factors that must be taken into account to identify real and potential risks to the privacy of data subjects, indicating the questions that must be asked by health service providers and proposing a step-by-step process for performing a PIA.
The process proposed by the Health Information and Quality Authority of Ireland was divided into four stages: (1) answering an 11-question questionnaire – a positive answer to even one of the questions would make it necessary to conduct a PIA; (2) identifying potential privacy risks, detailing the project scope, information flows and adopted security measures; (3) analysing the identified risks and defining ways to eliminate or reduce them; and (4) preparing the report itself, documenting the entire process carried out so far.
One of the relevant points made by the Health Information and Quality Authority’s guide was the need for the PIA to be regularly updated and to track the development of the respective product or service, so as to ensure that all privacy risks discovered along the way were addressed by the report. It was not deemed sufficient, therefore, to prepare a PIA and keep it merely for the record – the report would have to be revisited frequently.
In February 2011, the Article 29 Working Party (WP29) endorsed the industry proposal, developed at the European Commission’s recommendation, for a PIA framework for products and services using radio frequency identification (RFID). This was the first example of using PIAs to address concerns relating to a specific industry sector.
In February 2014, the ICO published a new guide on PIAs entitled ‘Conducting privacy impact assessments code of practice’. This code was intended to improve the guidelines formulated in the manual made available in 2007 and revised in 2009, especially to make it clearer how to better integrate PIAs with existing projects and other risk management tools, as well as to make the PIAs more effective and practical tools.
The code identifies some projects that may warrant a PIA, such as a new surveillance system or the application of new technologies to existing systems. It explains that it is up to each organisation to define who is best positioned internally to coordinate the process, and emphasises that the data protection officer, where such a position exists in the organisation, is naturally seen as a professional who will have significant influence on the work, even though he or she may not be responsible for carrying out every step of the process.
In April 2017, WP29 issued guidelines on DPIAs and on when a risk should be interpreted as high. In the document, WP29 indicates criteria that must be taken into account in the analysis, such as whether the processing involves sensitive data, whether personal data will be transferred outside the European Union and whether innovative uses will be adopted, among others. Where two or more of these criteria are present, WP29’s view was that the processing would be highly likely to involve a high risk. The guidelines also advise that a DPIA should be re-evaluated at least every three years, or more frequently, depending on the nature of the processing.
On 25 May 2018, when the GDPR became fully effective, conducting a DPIA became a legal requirement under Article 35(1). It is interesting to note that the text of the GDPR makes the obligation to carry out a DPIA conditional on the level of risk posed to the rights and freedoms of natural persons.
Although the use of new technologies is explicitly mentioned as an aspect to be observed when assessing the obligation to carry out a DPIA, the criterion that effectively determines whether a prior DPIA must be conducted is the existence of a high risk to the rights and freedoms of individuals. It is not, therefore, any type of risk that triggers the requirement for a DPIA, only those risks that are considered more serious and that may cause greater damage to the privacy of natural persons.
The great difficulty, however, lies in measuring this risk and identifying when it would actually be deemed high, as opposed to less relevant risks that would not entail the obligation to carry out a DPIA. In this sense, the GDPR itself includes, in Article 35(3), an illustrative list of situations in which a DPIA would clearly be necessary.
Among the hypotheses listed there are (1) systematic and extensive evaluation of personal aspects relating to natural persons based on automated processing, including profiling, on which decisions are based that produce legal effects concerning those individuals or similarly significantly affect them; (2) large-scale processing of sensitive data or of data relating to criminal convictions and offences; and (3) large-scale systematic monitoring of publicly accessible areas.
It is worth remembering that these hypotheses are merely illustrative, not exhaustive. For this very reason, the GDPR provides that the supervisory authorities of EU Member States shall establish and make public lists of specific processing activities that are subject to the obligation to carry out a DPIA, at their discretion, subject to the consistency mechanism provided for in the Regulation, so that different authorities do not adopt divergent positions on the same topic that might affect the free flow of personal data within the European Union.
The opinions issued by the European Data Protection Board in the past few years on the lists made available by supervisory authorities of EU Member States bring some interesting points that deserve note, including the following:
- the processing of biometric data does not necessarily represent a high risk; however, when biometric data is processed for the purpose of uniquely identifying a natural person and at least one other criterion is present, a DPIA is necessary;
- the use of a new or innovative technology does not, by itself, represent a high risk, so the requirement for a DPIA in this case would need to be assessed in conjunction with some other criterion;
- the processing of data not relating to health, collected or processed with the aid of a body implant, does not require the carrying out of a DPIA in all cases, but the processing of health data by such an implant does; and
- the processing of location data does not necessarily represent a high risk and may proceed without a DPIA, except where other criteria are present that make a DPIA necessary on a joint analysis of all the factors involved.
Another relevant aspect about DPIAs under the GDPR regime is the data controller’s duty to carry out a prior consultation with the competent supervisory authority when an assessment indicates that the processing would result in a high risk to data subjects in the absence of measures taken to mitigate those risks.
The United States

PIAs have been known and used in the United States for many years. In 2002, Congress passed the E-Government Act, a federal law designed to improve the administration and promotion of electronic government services and to establish a framework of measures to improve citizens’ access to those services and to government information.
Section 208 of the Act obliges government agencies to conduct a PIA before developing or procuring information technology that collects, maintains or disseminates information in an identifiable form, or before initiating a new collection of information that will be collected, maintained or disseminated using information technology and that includes information in an identifiable form permitting the physical or online contacting of a specific individual, where the collection involves 10 or more people, excluding employees of the federal government.
The PIA must be reviewed and approved by the chief information officer (or equivalent) of the respective government agency and, after its approval, must be made public through the agency’s website or publication in the official gazette, except when there is a need to protect classified, confidential or private information. A copy of the report must be provided to the agency director, who may issue specifications on the minimum content of a PIA within the respective agency.
In September 2003, the Office of Management and Budget, linked to the Office of the President of the United States of America, issued a memorandum addressed to the heads of government agencies and departments of the Executive Branch with guidance on how to implement the provisions of the E-Government Act regarding the performance of PIAs.
As indicated in that document, PIAs must be carried out and updated whenever necessary, especially when a change to a system creates privacy risks, such as the conversion of paper-based systems to electronic ones or new sharing of information in identifiable form among government agencies.
Another point made clear by the memorandum was that agencies should identify in their PIAs what choices were made in relation to information technology systems and information collection as a result of carrying out the PIA, and that the assessment should be performed at the beginning of development and updated before the system is effectively implemented, to address aspects not identified at the product design stage.
It is interesting to note that the obligation to carry out PIAs in the United States, at the federal level, is restricted to departments of the Executive Branch, government agencies and any third parties that contract with them, provided that they use information technology or operate websites for the purpose of interacting with the public. In other words, there is no federal mandate requiring PIAs from private companies without direct contractual relations with the public administration, reflecting a view often voiced in US privacy and data protection debates: that such requirements should not impede or create obstacles to business development.
Canada

The Canadian provinces of Ontario, British Columbia and Alberta were the first to develop specific regulations on PIAs, even before the topic was discussed at the federal level.
In Ontario, conducting a PIA became a mandatory and prior requirement for the approval of any government project involving information and information technologies as of 1998, with the subsequent availability of guidelines in December 1999 in a guide issued by a government agency known at the time as the Management Board Secretariat.
In British Columbia, PIAs became mandatory for government agencies implementing any new system, project or programme as of 2002, owing to changes made to the Freedom of Information and Protection of Privacy Act, even though the provisions of that law treat PIAs not as a comprehensive study of privacy risks but rather as a checklist to verify that certain legal requirements are being met.
In Alberta, the preparation of PIAs began to be required with the passage of the Health Information Act in 1999, although its provisions apply only to bodies in the health sector, so no other sector, public or private, is obliged to comply with the rules. In early 2009, the Office of the Information and Privacy Commissioner (OIPC) revised the manuals it had previously published on conducting PIAs, making it clear that reports would need to be submitted to the OIPC before implementation of the proposed project and could be rejected by the OIPC or require adjustment in accordance with its guidelines.
At the federal level, all government institutions are subject to the obligation to carry out a PIA to ensure that any projects or initiatives to be implemented, and that involve the collection, use or provision of personal information, comply with the provisions set out in the Privacy Act, Library and Archives of Canada Act, and government privacy and data protection policies. Any substantial changes to existing programmes or projects that could pose a risk to privacy should also be subject to a PIA.
The final report of the PIA must be submitted to the Treasury Board of Canada Secretariat (TBS) and the Office of the Privacy Commissioner of Canada. In April 2010, the TBS promulgated a new directive on PIAs, which linked the performance of PIAs to the release of funds for programme approval. This means that when a government agency does not complete a PIA, in cases where it is obliged to do so, it may not receive the necessary resources to implement the respective project.
As is the case in the United States, there is no specific legislation that obliges private companies to carry out PIAs, although they may be viewed favourably by regulators in the event of a data breach.
Suggested model for DPIAs in Brazil
Brazil’s General Data Protection Law (LGPD) is the first Brazilian law to address DPIAs. Prior to its enactment, impact assessments regarding privacy or data protection were not something that was considered by local regulators, thus the current lack of guidance on the performance of DPIAs in the country.
Even though the LGPD came into force in September 2020, it is still not clear when a DPIA is actually required under the law. The relevant provisions are open to doubt and, unlike the GDPR, there is no provision explicitly stating that a DPIA should be carried out where a high risk to the rights and freedoms of data subjects is expected.
According to Article 38 of the LGPD, the National Data Protection Authority (ANPD) may request data controllers to carry out a DPIA at any time and shall issue further regulations on carrying out DPIAs, but there is no prior obligation set forth by this provision to carry out a DPIA before any request from the ANPD.
Besides that provision, Article 10, Paragraph 3 of the LGPD also mentions that the ANPD may request a DPIA when the processing is based on the legitimate interests of the data controller. Although some commentators believe that this provision sets out a requirement to carry out a DPIA in every instance where the lawful ground for the processing is the legitimate interest of the data controller, that interpretation would create an unnecessary burden not intended by the legislator, especially considering that not every processing activity based on the controller’s legitimate interest carries a high risk to data subjects.
In the absence of guidance from the ANPD, we believe that the decision on when to carry out a DPIA should be based on an evaluation of the level of risk to the rights and freedoms of data subjects resulting from the processing activities, in a similar way to the approach taken by the GDPR. The recommended approach is an assessment method supported by thresholds defined by the data controller, based on responses to a previously prepared standard questionnaire containing a checklist of crucial factors that must be analysed and that might indicate risks involved in a given personal data processing activity. The questions could include the following:
- Does the project involve the processing of sensitive personal data?
- Does the project involve the processing of personal data of vulnerable individuals?
- Does the project involve large-scale processing?
- Does the project involve systematic monitoring of data subjects or public areas?
- Does the project involve the adoption of decisions based on automated data processing?
- Does the project involve new technologies or new applications of current technologies?
- Does the project involve profiling, scoring or another form of specific classification attributed to each data subject and decisions based on this classification?
- Does the project involve any type of restriction on data subjects in exercising their rights?
- Does the project involve combining, comparing or matching data from multiple sources?
- Does the project involve the processing of geolocation data from data subjects?
- Does the project involve the processing of personal data of children and adolescents?
- Does the project involve sharing personal data with third parties?
- Does the project involve international transfers of personal data?
- Does the project involve contacting data subjects in ways that could be considered intrusive?
- Does the project involve the processing of financial data?
- Does the project involve the processing of data that, although considered anonymised, may, in combination with other data, from the same source or from multiple sources, allow data subjects to be identified?
- Does the project involve the indirect collection of personal data, when it is not possible or feasible to guarantee the right to information to the data subject?
- Does the project involve the migration of personal data from one system to another?
Although an isolated affirmative answer to some of the above questions does not, by itself, mean that there is a high risk to the rights and freedoms of data subjects, the questions serve as a good basis for evaluating the level of risk involved in the project: the more affirmative answers given, the greater the risk in the intended operation. When a high risk is observed, it is recommended that a DPIA be carried out, regardless of any request from the ANPD.
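The threshold-based screening described above can be sketched in code. This is a minimal illustration only: the subset of questions, the `REVIEW_THRESHOLD` and `DPIA_THRESHOLD` values and the coarse recommendations are hypothetical assumptions made for this sketch, not ANPD guidance, and each controller would need to calibrate its own criteria and weightings.

```python
# Illustrative sketch of a threshold-based DPIA screening, as described above.
# The questions, thresholds and recommendations are hypothetical assumptions,
# not regulatory guidance: controllers must define their own criteria.

SCREENING_QUESTIONS = [
    "Does the project involve the processing of sensitive personal data?",
    "Does the project involve large-scale processing?",
    "Does the project involve systematic monitoring of data subjects?",
    "Does the project involve decisions based on automated processing?",
    "Does the project involve new technologies?",
    "Does the project involve international transfers of personal data?",
]

# Hypothetical thresholds: one affirmative answer flags the project for closer
# review; three or more suggest a high risk and therefore warrant a DPIA.
REVIEW_THRESHOLD = 1
DPIA_THRESHOLD = 3

def screen_project(answers: dict[str, bool]) -> str:
    """Return a coarse recommendation based on the number of affirmative answers."""
    affirmatives = sum(1 for q in SCREENING_QUESTIONS if answers.get(q, False))
    if affirmatives >= DPIA_THRESHOLD:
        return "carry out a DPIA"
    if affirmatives >= REVIEW_THRESHOLD:
        return "review with the privacy team"
    return "no DPIA indicated at this stage"

# Example screening of a project with three affirmative answers.
answers = {
    "Does the project involve the processing of sensitive personal data?": True,
    "Does the project involve large-scale processing?": True,
    "Does the project involve systematic monitoring of data subjects?": False,
    "Does the project involve decisions based on automated processing?": True,
}
print(screen_project(answers))  # -> carry out a DPIA
```

A simple count of affirmative answers is only one possible design; a controller could equally weight certain questions (sensitive data, children's data) more heavily before comparing against its thresholds.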
Answering such a questionnaire might give the false impression that carrying out a DPIA is merely a tick-box exercise, but this is far from what is truly expected of such an assessment. The DPIA should be seen as a process rather than just a report: it comprises several steps that lead to the drafting of a final report and that are fundamental to the exercise. Those steps might include the following:
- describing the proposed project, including its nature, scope, context, purposes and legal bases for data processing;
- describing the types of data collected, the volume, the collection methodology, with whom the data will be shared, who the controllers and processors will be, how the data will be stored, for how long the data will be used and retained, the security measures already in place and who can access the data;
- explaining the choices made in the project so that it complies with all the fundamental principles set out in Article 6 of the LGPD;
- analysing data flows and identifying possible privacy risks involved in the project that may violate the fundamental rights and freedoms of data subjects, classifying the probability and degree of each risk;
- identifying and analysing the measures and safeguards that could be implemented to eliminate, or at least reduce, the risks involved;
- consulting the stakeholders involved, taking into account any suggestions received and the concerns they raise;
- formulating the necessary recommendations, establishing an action plan for their implementation and integrating the solutions into the project before it is made available to the market;
- preparing the report itself and, depending on the case, evaluating its public disclosure for the knowledge of the stakeholders involved;
- implementing the recommendations set out in the report; and
- reviewing and updating the DPIA throughout the life of the project, ensuring that the assumptions and descriptions in the report remain accurate and that any newly identified risks are addressed by new protective measures.
These steps are merely illustrative and serve as a baseline to be considered; others can be added so that the structure of the DPIA process accords with an organisation’s own guidelines.
What can never be omitted is a description of the types of data that will be processed, the methods by which they will be collected, the guarantees of information security and the formulation of measures, safeguards and mechanisms that will be implemented to mitigate the highlighted risks, pursuant to the express determination of Article 38, sole paragraph, of the LGPD.
One of the most important points about the steps outlined above is that the DPIA cannot be an end in itself. It is not enough to draw up a report and imagine that the final document alone, serving as a record of the process carried out, will prevent all the risks involved in a given project. On the contrary, as it is a process, the DPIA must be regularly updated and revised, following the project throughout the time it is implemented and used by the organisation.
After all, a small change in some aspect of the project, even after it has been implemented and made available to the market, can significantly alter the risks to the privacy of data subjects, making it essential to adopt new, previously unanticipated mitigation measures.
This suggested model for a DPIA in Brazil should be reviewed once the ANPD has issued further regulation on this subject. It is expected that the ANPD will provide specific instructions about when and how to conduct DPIAs.
 Felipe Palhares is a partner at BMA – Barbosa, Müssnich, Aragão Advogados.
 Regulation (EU) 2016/679.
 One of those countries is Brazil. In August 2018, legislators enacted Law No. 13.709/2018 – commonly referred to as the Brazilian General Data Protection Law (Lei Geral de Proteção de Dados Pessoais), which is the first law in Brazil drafted specifically to regulate the processing of personal data. It came into force on 18 September 2020.
 Wright, David; Finn, Rachel; Rodrigues, Rowena, ‘A comparative analysis of privacy impact assessment in six countries’, Journal of Contemporary European Research, Vol. 9, Issue 1 (2013), p. 170.
 Wadhwa, Kush, ‘Privacy impact assessment reports: a report card’, Info, Vol. 14, Issue 3 (2012), p. 40.
 European Commission, Communication from the Commission to the European Parliament, the Council, the Economic and Social Committee and the Committee of the Regions, ‘A comprehensive approach on personal data protection in the European Union’, available at https://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=COM:2010:0609:FIN:EN:PDF (last accessed 31 Jan. 2022).
 Wright, David, ‘The state of the art in privacy impact assessment’, Computer Law & Security Review, Vol. 38 (2012), p. 55.
 Health Information and Quality Authority, ‘Guidance on Privacy Impact Assessment in Health and Social Care’ (December 2010), p. 14, available at www.hiqa.ie/sites/default/files/2017-03/HI_Privacy_Impact_Assessment.pdf (last accessed 31 Jan. 2022).
 Article 29 Data Protection Working Party, ‘Opinion 9/2011 on the revised Industry Proposal for a Privacy and Data Protection Impact Assessment Framework for RFID Applications’ (February 2011), available at https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2011/wp180_en.pdf (last accessed 31 Jan. 2022).
 Costa, Luiz, ‘Privacy and the precautionary principle’, Computer Law & Security Review, Vol. 28, 2012, p. 19.
 Information Commissioner’s Office, ‘Conducting privacy impact assessments code of practice: Data Protection Act’, available at https://www.pdpjournals.com/docs/88317.pdf (last accessed 31 Jan. 2022).
 Article 29 Data Protection Working Party, ‘Guidelines on Data Protection Impact Assessment (DPIA) and determining whether processing is “likely to result in a high risk” for the purposes of Regulation 2016/679’ (2017), available at http://ec.europa.eu/newsroom/document.cfm?doc_id=47711 (last accessed 31 Jan. 2022).
 European Data Protection Board (EDPB), ‘Opinion 6/2019 on the draft list of the competent supervisory authority of Spain regarding the processing operations subject to the requirement of a data protection impact assessment (Article 35.4 GDPR)’, available at https://edpb.europa.eu/sites/edpb/files/files/file1/201906_edpb_art.64_es_sas_dpia_list_en_0.pdf (last accessed 31 Jan. 2022).
 EDPB, ‘Opinion 21/2018 on the draft list of the competent supervisory authority of Slovakia regarding the processing operations subject to the requirement of a data protection impact assessment (Article 35.4 GDPR)’, available at https://edpb.europa.eu/sites/edpb/files/files/file1/2018-09-25-opinion_2018_art._64_sk_sas_dpia_list_en.pdf (last accessed 31 Jan. 2022).
 EDPB, ‘Opinion 18/2018 on the draft list of the competent supervisory authority of Portugal regarding the processing operations subject to the requirement of a data protection impact assessment (Article 35.4 GDPR)’, available at https://edpb.europa.eu/sites/edpb/files/files/file1/2018-09-25-opinion_2018_art._64_pt_sas_dpia_list_en.pdf (last accessed 31 Jan. 2022).
 EDPB, ‘Opinion 22/2018 on the draft list of the competent supervisory authority of the United Kingdom regarding the processing operations subject to the requirement of a data protection impact assessment (Article 35.4 GDPR)’, available at https://edpb.europa.eu/sites/edpb/files/files/file1/2018-09-25-opinion_2018_art._64_uk_sas_dpia_list_en.pdf (last accessed 31 Jan. 2022).
 Clarke, Roger, ‘An evaluation of privacy impact assessment guidance documents’, International Data Privacy Law, Vol. 1, Issue 2 (2011), p. 117.
 Seto, Yoichi, ‘Application of privacy impact assessment in the smart city’, Electronics and Communication in Japan, Vol. 98, Issue 2 (2015), p. 11.
 Office of Management and Budget, ‘M-03-22, OMB Guidance for Implementing the Privacy Provisions of the E-Government Act of 2002’ (26 September 2003), available at www.whitehouse.gov/wp-content/uploads/2017/11/203-M-03-22-OMB-Guidance-for-Implementing-the-Privacy-Provisions-of-the-E-Government-Act-of-2002-1.pdf (last accessed 31 Jan. 2022).
 Wright, David; Finn, Rachel; Rodrigues, Rowena, op. cit., at p. 171.
 Bryant, Jennifer, ‘Washington Privacy Act fails for second time’, International Association of Privacy Professionals (13 March 2020), available at https://iapp.org/news/a/washington-privacy-act-fails-for-second-time/ (last accessed 31 Jan. 2022).
 Clarke, Roger, op. cit., at p. 127.
 Wright, David; Finn, Rachel; Rodrigues, Rowena, op. cit., at p. 167.
 See footnote 3, above.