I came to the field while studying computer science. The curriculum covered a lot of mathematics and technical subjects, but hardly anything concerning societal issues, ethics or fundamental rights. So a student group organised additional lectures covering non-technical topics such as data protection. We established contact with the Data Protection Commissioner at that time (about 30 years ago). After passing my diploma exams, I started working at the Data Protection Commissioner’s office – and I have been the Data Protection Commissioner since 2015.
Data differs from other practice areas in that conventional law often relies on tangible goods owned by a person – these goods can be stolen or given away; they cannot be in different locations at the same time. Digital data is intangible. It can be copied so that the copy is indistinguishable from the original. It can be given to other parties without being lost. It can be re-used for other purposes. But ultimately data represents knowledge or assumptions, and it may be the basis for all kinds of decisions.
Data protection is not so much about protecting data (in the sense of security measures or backup systems) as about upholding the rights and freedoms of natural persons. Data can be (mis)used to manipulate individuals, groups of people or society as a whole.
The biggest challenge in regulating the use of data is establishing a shared vision of fairness in the use of data across the globe and creating a joint understanding of the opportunities and risks – without losing time.
To adapt to these challenges, for the big picture, we need interdisciplinary debates, impact assessments, and ways to close the gap between research and practice. For a broad understanding in society, we need a hands-on approach, for example by communicating practical examples so that we all can learn from mistakes or from good solutions and best practices.
This means lifelong learning, bridging communities (for example in our daily work among lawyers, computer scientists and other experts in Germany, Europe and abroad), contributing to research (for example with our small research department), and educating adults and kids.
I’m closely following basically all trends in IT because they may change the world by posing or reducing risks to the rights and freedoms of persons or our society. These trends include artificial intelligence, the Internet of Things, standardisation of policy languages or icons, data trustees, new developments in cryptography, anonymisation, and other privacy-enhancing technologies.
In my opinion, we need a standardised regulatory impact assessment, specifically concerning fundamental rights in laws containing surveillance measures. In our “Forum Privatheit” we have proposed a ‘Surveillance Calculus’ to analyse the impact of existing law and to advise law-makers proposing additional legislation.
The key to an equitable work environment is management awareness. In my office we have an equal opportunities officer who is involved in all hiring processes. We communicate that we strive for an equitable work environment to encourage everybody to apply. There is no pay gap. We also support family-friendliness to help staff better reconcile work, family and private life.
A piece of advice I would give aspiring data regulators and professionals is to be prepared for lifelong learning and to stay curious and interested in all kinds of topics, because data may affect all areas of business and life. Be open, try to understand other perspectives, exchange information and look for synergies. It will never become boring.
Last year, the regional Higher Administrative Court ruled that the Schleswig-Holstein data protection authority’s 2011 order to deactivate a Facebook page was lawful. GDR asked: What is the significance of this latest ruling from the higher court, including with regard to the GDPR’s one-stop-shop mechanism?
Before the latest ruling, the courts had decided that if a service provider (here: Facebook) has its own purposes for processing the collected personal data, it is not a controller–processor relationship; instead, all parties are joint controllers (Facebook as well as the Facebook fan page administrator).
The latest ruling confirmed that the data processing by these joint controllers was not lawful, so the data protection authority was able to issue an order to the fan page administrator in its jurisdiction.
This does not contradict the one-stop-shop mechanism. If a Facebook user files a complaint about a presumed data protection infringement by the social network, responsibility for the case stays with the competent data protection authority. But in a joint controllership scenario, a data protection authority competent for one of the joint controllers will have to handle the cases relevant to that joint controller (which would not encompass the full scope of processing).
In an ideal world, the service provider could prove its compliance with data protection law, and the joint controllers would have a valid joint agreement according to Article 26 GDPR. But in today’s not-so-perfect world it boils down to some wisdom one of the judges of the Federal Administrative Court has expressed: if all houses are burning, the firefighting has to start somewhere.