Transparency at the Point of Collection – The CJEU on Relational Pseudonymisation and Personal Data in EDPS v. SRB, by Anubhuti Raje

On 4 September 2025, the Court of Justice of the European Union issued its judgment in EDPS v. SRB (C-413/23 P), overturning a General Court decision and setting a new benchmark in the interpretation of EU data protection law. At the heart of the dispute lay a seemingly technical question: can pseudonymisation shield controllers from transparency duties? The Court’s answer was a resounding no.

In no uncertain terms, the CJEU held that personal opinions can qualify as personal data, that pseudonymisation is to be assessed in relation to the recipient’s realistic re-identification means, and that the obligation to inform data subjects arises at the moment of collection, not later. These holdings are far from narrow. They reshape the compliance obligations of EU institutions and private organisations alike, reaffirm the centrality of individual rights under the GDPR, and recalibrate the balance between data protection and the EU’s broader digital policy agenda.

The Case and Its Context

The case arose in 2017, in the aftermath of the resolution of Banco Popular. As part of a compensation procedure, the Single Resolution Board (“SRB”) launched a consultation in which 23,822 comments were submitted by affected shareholders and creditors. In an attempt to protect identities, the SRB stripped names and replaced them with 33-digit codes while retaining the linkage table internally. From this pool, 1,104 pseudonymised comments were transmitted to Deloitte, which had been contracted to conduct a valuation.
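The pseudonymisation scheme described above can be sketched in a few lines of Python. This is an illustrative sketch only: the names, the use of `secrets.token_hex`, and the data shapes are assumptions for demonstration, not the SRB’s actual implementation.

```python
import secrets

def pseudonymise(comments):
    """Replace each author's name with a random 33-character code,
    keeping a linkage table that maps codes back to names.
    Illustrative sketch only; not the SRB's actual scheme."""
    linkage = {}          # code -> original name (retained by the controller)
    pseudonymised = []
    for name, text in comments:
        code = secrets.token_hex(17)[:33]  # 33-character random code
        linkage[code] = name
        pseudonymised.append((code, text))
    return pseudonymised, linkage

# The external recipient sees only the coded comments:
comments = [("Alice", "I object to the valuation."),
            ("Bob", "The process lacked transparency.")]
shared, linkage = pseudonymise(comments)

# Without the linkage table, the codes cannot be resolved to names;
# with it, the controller can re-identify every author.
assert all(name not in dict(shared) for name, _ in comments)
assert {linkage[code] for code, _ in shared} == {"Alice", "Bob"}
```

The point of the sketch is that the data are only as anonymous as the separation between the shared comments and the retained linkage table, which is precisely the relational question the Court addresses.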

Yet the individuals who participated in the consultation were never informed that their submissions might be passed on to an external contractor. Complaints followed, and the European Data Protection Supervisor (“EDPS”) found that the SRB had violated its duty to inform under Regulation 2018/1725. The General Court partly annulled that decision, holding that Deloitte could not realistically re-identify the individuals. On appeal, the CJEU disagreed – while leaving the factual question of Deloitte’s re-identification capacity to the General Court, it laid down clear doctrinal principles on the legal character of pseudonymised free text and the timing of transparency obligations.

The Judgment and Doctrinal Holdings

The CJEU’s reasoning crystallises in three decisive holdings that together reshape the contours of EU data protection law.

Firstly, the Court confirmed that personal opinions fall within the definition of personal data. Paragraph 58 of the judgment elaborates:

“That assessment by the General Court misconstrues the particular nature of personal opinions or views which, as an expression of a person’s thinking, are necessarily closely linked to that person.”

This principle applies to free-text comments, employee feedback, or whistleblower reports, all of which remain connected to their authors. Even without explicit identifiers, such statements qualify as personal data where the author can be identified. This interpretation also expands on earlier jurisprudence, most notably Nowak, where exam scripts were brought within the scope of the GDPR.

Secondly, the Court clarified that pseudonymisation is not an absolute shield but a contextual concept. Whether data can be considered personal depends on “all the means reasonably likely to be used” by a particular recipient, taking into account cost, time, technological capabilities, and the availability of auxiliary datasets. In effect, the same dataset may be anonymous in one set of hands, yet personal in another. This relational approach builds on Breyer, but with far greater clarity and operational consequences.
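The relational test described above can be expressed schematically: whether a dataset counts as personal depends on the position of the particular recipient, not on the data alone. The predicate below is a conceptual illustration under stated assumptions; the inputs and their weighting are simplifications, not legal criteria.

```python
def is_personal_for(holds_linkage_table: bool,
                    has_auxiliary_datasets: bool,
                    reidentification_effort_reasonable: bool) -> bool:
    """Conceptual sketch of the relational test (Breyer, EDPS v. SRB):
    data are personal for a given recipient only if that recipient has
    means reasonably likely to be used to re-identify the individuals.
    Illustrative simplification, not a statement of the legal standard."""
    return holds_linkage_table or (has_auxiliary_datasets
                                   and reidentification_effort_reasonable)

# The same dataset: personal in the controller's hands (it holds the
# linkage table), potentially non-personal for a contractor with no
# realistic re-identification means.
assert is_personal_for(True, False, False) is True
assert is_personal_for(False, False, False) is False
```

The design point is that the function takes the recipient’s circumstances as inputs: the classification is a property of the data-recipient pair, not of the dataset in isolation.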

Thirdly, the Court stressed that transparency duties must be discharged at the point of collection. Controllers are required to inform data subjects about foreseeable recipients when gathering the data. It is not sufficient to rely on subsequent pseudonymisation or downstream safeguards. The obligation is triggered from the outset, and controllers cannot retroactively cure an omission by arguing that data were later transformed or anonymised.

Essentially, these holdings close off easy compliance shortcuts. Together they confirm that pseudonymisation is a tool of risk reduction, not a pathway out of the GDPR.

Legal and Compliance Significance

The implications of these holdings are far-reaching, both doctrinally and operationally. From a doctrinal perspective, the Court reinforces the principle, codified in Recital 26 and Article 4(1) of the GDPR, that ‘personal data’ encompasses any information relating to an identifiable individual, extending protection beyond static identifiers to the full spectrum of personal expressions, behaviours, and digital traces. By treating subjective statements as personal data, the Court ensures that individuals’ voices, whether in workplace surveys, customer complaints, or online consultations, remain under the law’s protection.

From a compliance perspective, the ruling recalibrates existing practices. Privacy notices must now disclose categories of recipients at the collection stage, even when pseudonymisation is planned. Data Protection Impact Assessments must explicitly analyse the re-identification risk from the recipient’s perspective. Vendor contracts require strengthening: prohibitions on re-identification, warranties on the absence of auxiliary datasets, and audit rights are no longer optional.

Equally significant is the governance dimension. Pseudonymisation is no longer a purely technical issue but a board-level risk. It should be treated as an ongoing strategic process integrated into enterprise compliance frameworks, rather than a one-off technical measure. This entails embedding documentation, accountability structures, and oversight mechanisms within organisational risk registers. The Court’s judgment signals to regulators that enforcement against ‘pseudonymisation washing’, that is, treating personal data as non-personal merely because names have been removed or technical codes applied without assessing re-identification risks, is both legitimate and expected.

Policy and Governance Consequences

The judgment arrives at a critical moment for the EU’s digital agenda. The Data Governance Act, the Data Act, and the Artificial Intelligence Act all hinge on the ability to distinguish reliably between personal and non-personal data. By insisting that pseudonymisation must be assessed contextually, the CJEU narrows the category of data that can circulate freely as “non-personal.” This makes secondary use regimes more complex, especially in domains such as AI training, health research, and financial services.

Yet the Court’s approach is not accidental. By grounding its reasoning in the Charter of Fundamental Rights and echoing Strasbourg case law treating informational privacy as intrinsic to human dignity, most notably Satakunnan Markkinapörssi Oy and Satamedia Oy v. Finland and Bărbulescu v. Romania, the Court reaffirms that fundamental rights are non-negotiable, even when weighed against efficiency or innovation. This has two direct consequences. On the regulatory side, the EDPS and national Data Protection Authorities are empowered to scrutinise pseudonymisation practices more aggressively. On the litigation front, NGOs and privacy advocates now have a stronger platform to challenge weak compliance in strategic cases.

The result is a recalibration of governance. Controllers must no longer see pseudonymisation as a technical fix but as a continuous compliance programme, integrating legal, contractual, and organisational safeguards. In short, pseudonymisation has shifted from being a protective label to becoming a test of institutional seriousness about rights.

Conclusion

EDPS v. SRB is more than a technical clarification; it is a doctrinal landmark. By recognising opinions as personal data, adopting a relational test for pseudonymisation, and tying transparency obligations to the moment of collection, the Court has made clear that shortcuts will not suffice.

For organisations, the lesson is unmistakable. Pseudonymisation mitigates risk but does not remove obligations. Unless controllers can demonstrate, with robust documentation, technical measures, and contractual guarantees, that re-identification is not reasonably likely, pseudonymised data must be treated as personal.

This is not simply a compliance burden. It is a reaffirmation of the European digital legal order – rights do not vanish when data are coded or stripped of names. They travel with the data, pseudonyms included.

Posted by Anubhuti Raje (Final year law student at Gujarat National Law University, Gandhinagar)

Anubhuti Raje is a final year law student at Gujarat National Law University, specializing in Dispute Resolution and White-Collar Crimes. She has interned at leading law firms and chambers, and her research focuses on arbitration, criminal justice, and corporate governance, with several publications.