Anonymous Data
The General Data Protection Regulation (GDPR) applies only to information concerning an identified or identifiable natural person. If data is anonymised, it is no longer considered to be personal and is thus outside the scope of GDPR application. In other words, if data in a Sub Project (SP) is anonymous, the GDPR does not apply and the data can be processed for research purposes without the restrictions of data protection law.
However, given the difficulty in creating truly anonymous data, the bar for anonymisation has been set extremely high in the GDPR. To determine whether a person is identifiable, we must consider “all the means reasonably likely to be used, such as singling out, either by the controller or by another person to identify the natural person directly or indirectly.” To make this determination, we must consider all “objective factors, such as the costs of and the amount of time required for identification, taking into consideration the available technology at the time of the processing and technological developments.” In making this determination, SPs must consider the robustness of the anonymisation techniques they apply.
Anonymisation Techniques
The main anonymisation techniques applied in data protection law are randomisation and generalisation. Regardless of the technique applied (e.g. noise addition, permutation, differential privacy, aggregation, k-anonymity, l-diversity, t-closeness), three main questions should be considered (a simple check for the first is sketched after the list):
- Is it still possible to single out an individual?
- Is it still possible to link records relating to an individual?
- Can information be inferred concerning an individual?
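To make the first question concrete, the minimal Python sketch below checks whether any record in a generalised release is still unique on its quasi-identifiers (a basic k-anonymity test). The records, column choices and values are hypothetical and purely illustrative; they are not drawn from the WP29 opinion or any HBP dataset.

```python
# Minimal sketch (illustrative only): checking whether a generalised release
# still allows singling out, via a simple k-anonymity test over hypothetical
# quasi-identifier columns.
from collections import Counter

# Hypothetical records after generalisation: (age band, postcode prefix, diagnosis)
records = [
    ("30-39", "SW1", "condition A"),
    ("30-39", "SW1", "condition B"),
    ("40-49", "NW1", "condition A"),
    ("40-49", "NW1", "condition C"),
]

def k_anonymity(rows, quasi_identifier_indices):
    """Return the size of the smallest group sharing the same quasi-identifiers.

    A value of 1 means at least one individual can still be singled out.
    """
    groups = Counter(tuple(row[i] for i in quasi_identifier_indices) for row in rows)
    return min(groups.values())

k = k_anonymity(records, quasi_identifier_indices=(0, 1))
print(f"k-anonymity of the release: {k}")  # k = 2 here, so no record is unique on these columns
```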
In its opinion on anonymisation techniques, the Working Party 29 (a board of data protection authorities in the EU, also called the Article 29 Working Party) focuses particularly on the appropriate engineering and application of these techniques. On that foundation, the Working Party 29 provides that “the optimal solution should be decided on a case-by-case basis, possibly by using a combination of different techniques…”. The table below shows the strengths and weaknesses of some of the most common anonymisation techniques:
| Technique | Is singling out still a risk? | Is linkability still a risk? | Is inference still a risk? |
|---|---|---|---|
| Pseudonymisation | Yes | Yes | Yes |
| Noise addition | Yes | May not | May not |
| Substitution | Yes | Yes | May not |
| Aggregation or K-anonymity | No | Yes | Yes |
| L-diversity | No | Yes | May not |
| Differential privacy | May not | May not | May not |
| Hashing/Tokenisation | Yes | Yes | May not |

Strengths and weaknesses of different anonymisation techniques (WP216)
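As the table suggests, differential privacy is the only technique listed that may mitigate all three risks. The sketch below shows its basic building block, adding Laplace noise to an aggregate count; the epsilon value, sensitivity and count are assumptions chosen for illustration, and a real deployment would need a proper sensitivity and privacy-budget analysis.

```python
# Minimal sketch (illustrative only): releasing a count with Laplace noise,
# the basic mechanism behind differential privacy. Parameters are hypothetical.
import random

def noisy_count(true_count, epsilon=1.0, sensitivity=1):
    """Add Laplace noise scaled to sensitivity/epsilon to a count query."""
    scale = sensitivity / epsilon
    # The difference of two i.i.d. exponential draws is Laplace-distributed.
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return true_count + noise

print(noisy_count(true_count=42, epsilon=0.5))
```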
Although there is no prescriptive standard for anonymisation in the EU, the Working Party 29 states that anonymisation requires “irreversibility preventing identification of the data subject”, taking into account all the means “reasonably likely to be used” for identification. In short, effectively relying on this exception in practice is extremely difficult. Although this ‘zero risk’ approach has been criticised, it is the current position taken by regulators.
Once data has been determined to be anonymous, that determination must be re-evaluated and updated in light of technological changes such as big data. In other words, anonymisation is not a one-off exercise; it requires regular re-evaluation and risk assessment by the data controllers employing the techniques.
HBP Policy
If HBP human data requires re-identification at some point, the data is not anonymised for the purposes of the GDPR and the GDPR remains applicable. Personal data that have been pseudonymised (e.g. encrypted) carry certain advantages under the GDPR, for example in relation to data breach reporting. Whenever possible, personal data in the HBP should be pseudonymised. However, the GDPR still applies to pseudonymised data because the data can be attributed to a natural person through the use of additional information, such as a decryption key.
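As a rough illustration of what pseudonymisation can look like in practice, the sketch below replaces a direct identifier with a keyed hash (HMAC). The key plays the role of the “additional information” that would allow re-identification, so it must be stored separately, and the output remains personal data under the GDPR. The key value and identifier shown are hypothetical.

```python
# Minimal sketch (illustrative only) of pseudonymisation by keyed hashing:
# identifiers are replaced with HMAC values; the secret key must be kept
# separately, and the result is still personal data under the GDPR.
import hashlib
import hmac

SECRET_KEY = b"store-this-key-separately"  # hypothetical key; real key management is required

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a stable, key-dependent pseudonym."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

print(pseudonymise("participant-0042"))
```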
Helpful Links
Working Party 29 Opinion on Anonymisation Techniques (in particular Annex I on page 26)
ENISA: Privacy by design in big data (especially pages 28-38)
Data Protection in the HBP
Find out who the Human Brain Project's data protection officer is, make use of our resources for data protection in the Human Brain Project, or explore EU resources on the topic.
Register an Ethical Concern
Anyone can submit a request to address ethical, regulatory and social issues in Human Brain Project research. The POint of REgistration (PORE) is the HBP’s mechanism for registering and identifying these issues and keeping track of how they are dealt with.