Lokke Moerel
  • Senior of Counsel
  • Morrison & Foerster
Belgium

Traditional practice areas like litigation often look at the world in the rear-view mirror. You litigate events that took place in the past. What I love about my work is that it is forward-looking and about innovation. I spend most of my time working with large tech companies on their global roll-outs of new technologies based on privacy and ethics by design. Privacy by design and ethics by design are not about choosing between clear-cut, pre-existing options A or B, but about developing option C to mitigate the negative impact of a new technology on individuals and society. Because new digital services present new ethical design choices and dilemmas, existing rules cannot always address this new reality.

At the same time, this new reality is not a moral-free zone either. My role focuses on advising companies on how to try to shape new technologies like AI “for the good”, and how to shape the new normative framework. For example, advanced forms of AI are still a black box – we do not know how algorithms arrive at their outputs. EU privacy rules, however, require that algorithms do not create discriminatory outcomes and that automated decisions are explainable. This means we need to innovate to prevent discriminatory outcomes while ensuring transparency and explainability.

In terms of challenges in the field, many EU organisations are concerned about how to comply with the Schrems II decision and how to decide on additional mitigating measures that will allow them to continue data transfers to the US.

The bigger political picture is that these companies are caught in the middle between the European Commission’s push for EU digital sovereignty (including a push for storage of data in the EU) and the inherently global business models of large cloud providers. Individual companies do not have many meaningful ways to influence the political agenda. 

To adapt to these challenges – the ‘serenity prayer’ is always a good start! Accept what you cannot change now – individual companies will not be able to force fundamental changes in the current ecosystem – and concentrate on what you can influence. My prediction is that any inter-company transfers that are inherently required to run your business will be able to continue. Focus on implementing mitigating measures there. The issue of transfers that are not inherently required for the services provided by the large cloud providers will likely be resolved at the political level.

Being a professor, in addition to my role as private practice lawyer, enables me to pick my research topics and hopefully set trends rather than just follow them. I am a member of the Dutch Cyber Security Council and my current fascination is with the impact of quantum computing and AI on our cyber resilience. If there is not enough innovation, there will be new dependencies. For example, without proper encryption, we will not be able to protect the valuable and sensitive information of our governments, companies, and citizens. Current encryption will not hold against the computing power of future quantum computers. We need to innovate now to protect our critical information in the future. This is not only relevant for future information but also for current information. A case in point: hostile states currently intercept and preserve encrypted communications in anticipation that these may be decrypted at a later stage and analysed by deploying AI. 

Another concern is how to protect our ‘collective’ data. We currently see massive mining of Western social media data by hostile states. This is concerning because analysing the data of a large enough portion of a population will be predictive for the entire population, and the GDPR provides no protection in this instance. For example, if enough EU citizens were to provide consent for their DNA to be analysed by a foreign company, this could potentially impact all EU citizens.

I think our field is much more gender-balanced than others. The reason is that privacy started out as a ‘soft issue’ (human rights), did not carry high rates and was ‘left’ to the women in the same way family law often is. To put it diplomatically, I used to see many men focus on practice areas where the money was made (eg patent infringement, M&A). When data became the fuel of the new economy, I started seeing more men enter the field.

I would say the biggest gender challenge in our field is getting bias out of data used for training algorithms!

With data becoming central to the new economy’s business models, I have seen many women enjoy great careers. It’s wonderful to see and sets an example for other practice areas!

It is difficult to imagine a more inclusive group of practitioners. It is proven that where you have gender diversity, other forms of diversity quickly follow. Our field is evolving so quickly and is so multidisciplinary that nobody has a monopoly on wisdom. You simply have to collaborate. So join this pioneering and inclusive club!
