As far as I am concerned, privacy does not stand in the way of good data exchange in health care. It can even lead to great innovations. "Privacy by design" is the starting point in the GDPR (Article 25). But how does this work in practice? "Privacy by design" does not mean building something and then justifying the choices you have made in a Data Protection Impact Assessment (or DPIA, another requirement of the GDPR, Article 35). It works the other way around: by carrying out such a DPIA first and only then starting development, you arrive at the best solutions. This is how we work when it comes to digital support in fighting the pandemic.
"Privacy by design" sounds vague, doesn't it? It does, and so it needs a concrete example. At the end of last year, scientists and others proposed allowing more contacts in situations where the epidemiological situation would not otherwise allow it (for example, at events). That could be done, they argued, if you lowered the risk by requiring a negative test before such an event. This gave rise to the idea of entry tests. But how do you prove a negative test?
You could, of course, prove it with a test result that carries all your personal data. But that is your own medical data, so it is not the most desirable situation. When we were asked to think about this (and about the digital technology to make it possible), we put privacy first. For example:

- there had to be a legal basis for processing the required data (the law was recently passed by the Senate);
- people should not be trackable across visits to different events;
- it should not be visible at the gate why your risk is reduced (a test or a vaccination);
- there should be no central database of this data; and
- a proof does not need to contain all your data to make it sufficiently likely that it really belongs to you.
That thinking about data protection led to CoronaCheck, an app in which you securely retrieve the data from your negative test (and soon your vaccination) from the test provider, store it only on your own phone, and turn it into a QR code that can be scanned. The QR code changes constantly (so there is no point in comparing QR codes if you visit multiple events), it does not indicate whether it is based on a negative test or a vaccination, and it includes only your initials and the day and month of your birth. Because even initials can sometimes be very rare, we have gone further and ensured (based on data from the Meertens Institute) that even less information is shown for rare initials.
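The data-minimization idea above can be sketched in a few lines. This is an illustrative sketch only, not CoronaCheck's actual format: the function name, the frequency table, and the rarity threshold are all hypothetical assumptions for the example (a real system would use population statistics such as those of the Meertens Institute).

```python
# Hypothetical threshold: suppress an initial if it identifies too
# narrow a slice of the population. The cutoff and the frequency table
# below are made-up assumptions for illustration.
RARE_INITIAL_THRESHOLD = 0.01

# Hypothetical share of the population per first letter.
INITIAL_FREQUENCY = {"J": 0.09, "M": 0.08, "A": 0.07, "X": 0.002, "Q": 0.001}


def minimal_disclosure(first_initial: str, last_initial: str,
                       birth_day: int, birth_month: int) -> dict:
    """Build the minimal identity payload shown at the gate.

    Only initials and day/month of birth are included; an initial is
    suppressed entirely when it is rare enough to identify someone.
    """
    def visible(initial: str) -> str:
        freq = INITIAL_FREQUENCY.get(initial.upper(), 0.0)
        return initial.upper() if freq >= RARE_INITIAL_THRESHOLD else ""

    return {
        "firstInitial": visible(first_initial),
        "lastInitial": visible(last_initial),
        "birthDay": birth_day,      # day of month only
        "birthMonth": birth_month,  # no birth year, no full date
    }
```

With these assumed frequencies, `minimal_disclosure("X", "J", 14, 3)` would hide the rare first initial "X" while still showing the common "J": less identifying data, but still enough to make it plausible the proof belongs to you.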
The QR code that you show at the door is therefore not traceable across locations, does not reveal whether you have had a test or a vaccination, and does not say exactly who you are. How it works is laid down in the law and in upcoming legal regulations, and the DPIA that explains it all will soon go to the House of Representatives, so that it is clear to everyone how it works, what data is processed and by whom. The source code of the app is on GitHub, so anyone can check that it really works that way.
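The "constantly changing, unlinkable" property can also be sketched. A caveat: the real app relies on cryptographic attribute-based credentials whose proofs are freshly randomized on every display; the simple keyed-hash construction below is not that protocol, only an illustration of why a code that rotates per time window cannot be compared across scans. The key handling and the rotation interval are assumptions.

```python
import hashlib
import hmac
import os
import time

ROTATION_SECONDS = 90  # hypothetical refresh interval for the displayed code


def rotating_code(device_secret: bytes, payload: bytes, now=None) -> str:
    """Derive a short-lived code that changes every rotation window.

    The same payload yields an unrelated-looking code in each window,
    so scanners at different events cannot link two showings.
    """
    timestamp = time.time() if now is None else now
    window = int(timestamp // ROTATION_SECONDS)
    mac = hmac.new(device_secret,
                   payload + window.to_bytes(8, "big"),
                   hashlib.sha256)
    return mac.hexdigest()


secret = os.urandom(32)        # stays on the holder's phone only
proof = b"minimal-payload"

# Two scans in different time windows produce different codes:
code_a = rotating_code(secret, proof, now=0)
code_b = rotating_code(secret, proof, now=ROTATION_SECONDS)
assert code_a != code_b
```

Note that an HMAC verifier would need the secret; the point here is only the rotation. The production approach (randomized zero-knowledge proofs over credentials) achieves unlinkability without sharing any secret with the scanner.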
The development of the law, the design of the working method, and the design and building of the technology went hand in hand in recent months: policy and implementation as worlds that influence each other to achieve privacy by design. Is it only about privacy? No, certainly not. Privacy, security and accessibility are the starting point, not only for the app but for the whole chain.
Privacy, security and accessibility by design: it seems complicated and limiting, but it is not. It leads to innovative solutions and even makes the work more fun. So much fun that the developers who have been working on this together for a while are now coming up with their own suggestions for embedding privacy just that little bit better.