Contact tracing principles

Taylor Wessing London partner Christopher Jeffery sets out the data protection principles that covid-19 contact tracing apps will need to follow.

As lockdowns ease across the world, public authorities and private companies are looking to roll out contact tracing apps that notify people when they have been in proximity to someone with covid-19 or its symptoms. Some of these technologies are well-publicised, especially at the public authority level – and in the private sector, the market has been estimated at US$4 billion. Contact tracing itself is a long-established public health measure.

Consulting companies like PwC and a range of tech companies such as Microshare, Locix and NetCompany are building or offering contact-tracing technology, often as a suite of "back to work" services. Some, for instance, track location in real time and show a heatmap of where people are in workplaces, shopping malls, airports and so on to help employers and operators manage congestion, and sometimes allow employees to select less congested times to visit canteens and other facilities or to exit the building.

The contact tracing apps typically work like this (a simplified sketch follows the list):

  • Mobile phone apps or wearables (bracelets, keyrings, badges), sometimes with beacons positioned around the site that communicate with the apps or wearables. The apps may communicate with each other using Bluetooth low-energy "handshakes".
  • Employees’ location is typically tracked automatically within the workplace only, and very precisely – often to within a few centimetres.
  • Unique IDs are assigned to each instance of the app or wearable device (representing an employee).
  • Either the devices or a central server logs the unique ID of every employee an individual comes into contact with.
  • If a colleague reports covid-19 symptoms or a diagnosis, everyone they have been in contact with is notified so they can self-isolate. That notification is either automated through the app or done discreetly by the HR team.
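
As a minimal sketch of that flow, assuming the central-server variant described above – every class, function and parameter name here is hypothetical and purely illustrative, not any vendor's actual implementation:

```python
from collections import defaultdict
from datetime import datetime, timedelta


class ContactLog:
    """Toy model of the contact log described above; all names are illustrative."""

    def __init__(self, retention_days: int = 21):
        # Pseudonymous device ID -> list of (other device ID, timestamp) contacts.
        # retention_days is a placeholder; the real figure should follow public
        # health guidance on infection risk periods.
        self.contacts = defaultdict(list)
        self.retention = timedelta(days=retention_days)

    def record_handshake(self, id_a: str, id_b: str, when: datetime) -> None:
        """Store one Bluetooth low-energy 'handshake' between two devices."""
        self.contacts[id_a].append((id_b, when))
        self.contacts[id_b].append((id_a, when))

    def prune(self, now: datetime) -> None:
        """Data minimisation: drop contacts older than the retention period."""
        for device_id in list(self.contacts):
            self.contacts[device_id] = [
                (other, when)
                for other, when in self.contacts[device_id]
                if now - when <= self.retention
            ]

    def ids_to_notify(self, reported_id: str, now: datetime) -> set:
        """Pseudonymous IDs of everyone the reporting employee has been
        in contact with during the retention window."""
        self.prune(now)
        return {other for other, _ in self.contacts[reported_id]}
```

In the device-only variant, the same log would sit on each handset and the lookup in ids_to_notify (again, a hypothetical name) would happen locally rather than on a server.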

Often, existing tracking and deep-location technology – used for asset tracking in IoT, step-by-step navigation guides in shopping centres and museums, and location-targeted advertising – is being rapidly repurposed for contact tracing.

For governments and employers, several different factors combine to make this an urgent and delicate challenge:

  • The desire to protect people from the pandemic;
  • The need to make these technologies available quickly, as many countries start to emerge from lockdown;
  • The combination of health and location data, which makes for a spicy cocktail of privacy concerns and media coverage of the risks of data misuse – as well as long-established cultural sensitivities in some countries to workplace monitoring of any kind;
  • The need for buy-in – the tech is largely app-based, so people have to download it; unless enough do, organisations will not achieve the desired protections;
  • Employment law is also key: it can drive the need for consent in countries like the UK even where privacy requirements do not, and for consultation with or approval from works councils in countries like France and Germany.

What of the privacy challenges?  

Despite pragmatic and helpful noises from regulators like the UK Information Commissioner’s Office recognising that organisations may see the need to process health data for contact tracing and other covid-related risk management measures, European regulators are very clear that the GDPR still applies in the pandemic. When using new kinds of data in new ways like this, organisations need to apply the governance principle at the heart of the GDPR. Guidance from the various regulators does not always align, so a patchwork of approaches across Europe (let alone beyond) needs to be managed.

The main privacy steps we see as relevant are:

  • Data minimisation: Collect only what you need. This is front and centre for technologies the EDPB describes as "a grave intrusion into … privacy". Privacy teams must test thoroughly why data is needed, whether it can be anonymised or at least pseudonymised in the pre-notification phase (see the sketch after this list), and push for a short retention period reflecting public health guidance on infection risk periods.
  • Purpose limitation: The system, its access controls and its retention periods should all be designed to prevent use for other purposes. Being transparent about this will also help build trust. Ensuring that the location data of employee movements cannot be used to measure productivity, or to support decision-making around furlough or redundancy, is key – and hard-wiring that into system design is much better than policies about how HR and other teams are allowed to use the data.
  • Lawful basis needs thought and awareness of local regulator sensitivities. Health data requires a GDPR article 9 lawful basis too – and for employers this will be either explicit consent or the carrying out of the obligations of the controller in the field of employment and social security and social protection law.
  • ePrivacy: Consider whether the collection of location data requires consent here (if it does, you should look at consent as the likely lawful basis under the GDPR too), or whether it can be argued that the collection is strictly necessary for the provision of a service the user requests.
  • Security: Investigate whether the nature of this data merits additional measures, and audit tech providers carefully about app and wearable security and document their responses. This is standard vendor management, but there is perhaps a more acute need here given the data collected.
  • Transparency, with more thoroughness and thought than a privacy policy update. Employment lawyers recommend that all aspects of back-to-work safety be the subject of employee consultation; the uses of data should be included here, and employee feedback considered and engaged with (even though any radical re-design of how the tech works may be challenging or impossible).
  • Work your privacy/governance programme: larger companies will usually have a governance programme with a process for new technologies and processing activities. This is exactly the kind of unusual data collection that governance is for, so any temptation to take the project out of that programme for timing or other reasons should be resisted – and if that means creating a specific fast-track process within the programme on the fly, so be it.
  • Data protection impact assessments and data protection officers: Governance programmes will require these, and even if you do not have a written governance programme or a DPO, you need to do a data protection impact assessment (DPIA) – the EDPB, ICO and German regulators' guidance say so. Companies that have not done DPIAs before sometimes assume they are painful, but they need not be. It is one area of GDPR governance we think a lot of companies have under-utilised, sometimes performing mental acrobatics to find reasons why a DPIA is not required. Doing one here will establish an internal template and process which, perhaps with some refining, can be used for other processing activities that need it. The bottom line is that where there are grey areas in the GDPR treatment of the processing, a DPIA shows you have done your homework and will be helpful in managing any employee's or regulator's complaint or investigation.
  • Timing! The privacy team needs to be at the table early on this one, even if that does not always (or even generally) happen. Internal legal and compliance teams have not always been given the chance to assess what data is collected in these apps, and how it is used, shared and retained – let alone to carry out the DPIA. In discussions with companies looking to sell and buy private sector solutions for contact tracing, one occasional misconception is that the role of the privacy professional here is to "paper" what the tech and business teams think is right in terms of data collection and use. That is unfortunate at the best of times, making privacy by design and default, data minimisation and other privacy steps challenging because they need to be retrofitted – but in this instance it creates real regulatory risk.
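
One way to picture the pseudonymisation point in the data minimisation bullet above – a sketch only, under the assumption that a keyed pseudonym is derived from a stable staff identifier and the key is held separately (for example by HR) and used only at the notification stage; the function and parameter names are invented for illustration:

```python
import hashlib
import hmac


def pseudonymous_id(employee_id: str, rotation_key: bytes) -> str:
    """Derive a keyed pseudonym from a stable employee identifier.

    The contact log stores only this value, so it never holds names or
    staff numbers; only whoever holds rotation_key (for example HR) can
    recreate the mapping at the notification stage."""
    digest = hmac.new(rotation_key, employee_id.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]


# Example: the log would see short tokens like this, not a name or staff number.
token = pseudonymous_id("jane.doe@example.com", rotation_key=b"rotating-secret")
```

Rotating the key periodically would further limit how long any one pseudonym stays linkable to an individual, which sits naturally alongside the short retention periods discussed above.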

While easing lockdowns and preventing a second wave of covid-19 infections are the key initial drivers, doubts about the real prospects of a vaccine, and the prospects (and fears) of a rapid move to herd immunity, mean these apps could be with us for years. The privacy issues should be approached quickly and with the sense of urgency required, but also on the assumption that we will live with them – and with any privacy shortcomings – for some time to come, meaning quick privacy fixes which rely on some "slack" from the regulators are unsuitable.

Nothing in the list above will come as a surprise – we are just applying normal GDPR principles. The combination of high-risk processing requiring a measured, considered application of those principles, the need to move quickly as lockdowns start to ease in many countries, and the time these solutions will take to be implemented, tested and rolled out creates a specific privacy challenge that HR and IT engineering teams need to deal with together.

