ECJ Ruling on Automated Decision-Making and Data Subject Access
Legal Development | 24 March 2025
UK & Europe | Regulatory & Investigations - Regulatory Risk
On 27 February 2025, the European Court of Justice (ECJ) delivered an important judgment (C-203/22 - Dun & Bradstreet Austria GmbH) concerning the right of access to information under Article 15 of the General Data Protection Regulation (GDPR). The main issue addressed by the ECJ was the scope of the right of access to information in the context of automated individual decision-making, as set out in Article 22 GDPR, especially when trade secrets are involved. This decision has substantial practical implications, as companies, including those within the insurance sector, increasingly integrate artificial intelligence (AI) into their decision-making processes.
Facts of the case
An Austrian mobile telecommunications operator refused to enter into a contract with a customer because of insufficient credit standing. In its assessment, the operator relied on an automated credit evaluation provided by a third-party credit information agency. The customer requested access to the information held by the credit information agency; the agency provided some information, but not to the customer's satisfaction. The customer subsequently initiated legal proceedings to demand further information on the logic behind the automated decision-making process. The dispute centred on whether the credit agency was obliged to provide additional details under Article 15(1)(h) GDPR, specifically regarding the logic of the automated decision-making. The credit agency argued that revealing this information would disclose protected trade secrets. As a result of the dispute, the credit information agency was ordered to provide the customer with “meaningful information about the logic involved” as required by Article 15(1)(h) GDPR.
A separate Austrian court responsible for enforcing the judgment (the Enforcement Court) then had to determine which specific information the term “meaningful information about the logic involved” encompasses. Since enforcement of the judgment depended on an interpretation of the GDPR, the Enforcement Court referred, in essence, the following questions to the ECJ:
- Does "meaningful information about the logic involved" require the controller to provide a comprehensive explanation of the procedures and principles used to arrive at a specific decision?
- In cases where the controller argues that the requested information involves third-party data protected by the GDPR or trade secrets (as per Directive 2016/943), is the controller obliged to submit the potentially protected information to supervisory authorities or courts for review?
First question: Meaningful information about the logic involved
In response to the first question, the ECJ noted that the wording of the term "meaningful information about the logic involved" varies across the different language versions of the GDPR. It nevertheless concluded that the phrase encompasses all relevant details about the procedures and principles employed in automated decision-making.
In alignment with Articles 13(2)(f) and 14(2)(g) GDPR, which set out transparency obligations, the information provided must be clear, concise, and easily understandable, so that data subjects can comprehend how their personal data is processed. The ECJ reiterated that the right of access enshrined in Article 15 GDPR is intended to enable data subjects to verify the accuracy of their data and the lawfulness of its processing (see also Discovery through the backdoor? – ECJ on GDPR data subject access request for purposes not related to data protection), a key aspect of safeguarding individuals' rights under Article 22(3) GDPR, which regulates automated decision-making and profiling.
While the ECJ clarified that "meaningful information" does not require the disclosure of complex algorithms, it does require a sufficiently detailed explanation of the decision-making procedures and principles. This explanation must enable individuals to understand the personal data factors that influenced the decision and how variations in these data might change the outcome. This principle applies equally in the insurance sector, where automated processes are increasingly utilised in underwriting, claims assessments, and risk management.
Second question: Trade secrets and the right of access to information
On the second question, the ECJ sought to balance the right of access under GDPR with other fundamental rights, including the protection of trade secrets and third-party data. The ECJ emphasised that data protection, while a fundamental right, must be carefully weighed against intellectual property protections, as outlined in Recital 63 GDPR.
In instances where the provision of personal data to a data subject could infringe upon third-party rights, such as trade secrets, the ECJ affirmed that a proportionality test must be applied. The controller must assess whether the disclosure can be made without violating the rights of others. If a conflict arises, the matter must be referred to the competent supervisory authority or court, which will determine the appropriate balance between the data subject’s access rights and the protection of third-party interests.
The ECJ also ruled that a Member State cannot impose a blanket prohibition on disclosing business or trade secrets, as this would undermine the GDPR's requirement for a proportional balance of competing rights. In contested access cases, controllers must provide relevant information to supervisory authorities or courts to facilitate an informed decision in line with the principle of proportionality.
Ultimately, the ECJ ruled that controllers must carry out case-by-case assessments and cannot refuse to provide access to personal data solely on the grounds of protecting third-party rights or trade secrets.
Practical implications of the decision: Increased focus on transparency and changes to processes
At first glance, this judgment, like the ECJ ruling on SCHUFA (C-634/21 - SCHUFA Holding AG) in December 2023, appears to relate only to credit information agencies. Upon closer examination, however, its implications go much further.
Following the ECJ ruling on SCHUFA, the Hamburg Commissioner for Data Protection and Freedom of Information (HmbBfDI) published an opinion on the SCHUFA ruling, stating that the principles of that judgment are not limited to the assessment of credit standing but also apply to AI systems. Applying the ECJ's standards, the HmbBfDI stated that AI systems used to create the basis for a preliminary decision can also be classified as automated decision-making within the meaning of Article 22 GDPR if they play a decisive role in the decision-making process. This is particularly the case if the output is based on barely comprehensible criteria developed independently by the AI (see also Hamburg Commissioner for Data Protection and Freedom of Information - Opinion on ECJ ruling C-634/21 [German]). The other German data protection supervisory authorities support this opinion and have since increased their scrutiny of AI applications that rely on algorithms for decision-making.
In light of this, companies can also expect the German data protection supervisory authorities to apply the principles of the recent judgment to AI applications.
As a result, companies that use AI for decision-making must implement transparency and human oversight measures. Regulatory bodies are expected to enforce compliance through increased monitoring and regulatory measures.
The increasing reliance on AI in decision-making processes across industries underlines the importance of this judgment. Alongside compliance with the GDPR, companies must also consider the provisions of the EU AI Act. Both regulatory frameworks require increased transparency.
The recent ruling of the ECJ reaffirms that organisations must provide data subjects with meaningful insight into the logic behind automated decisions. In practice, this may compel insurers and other affected companies to review or adapt their AI systems and decision-making processes, update their internal documentation and implement appropriate safeguards to protect third-party data, intellectual property, and trade secrets. Insurers, for instance, increasingly rely on AI for claims management, underwriting, risk assessment, and pricing. Given the regulatory requirements, they will need to ensure that they have clear processes for handling access requests, particularly in complex cases where the balance between transparency and the protection of proprietary information must be carefully considered.
Furthermore, data protection authorities will play an increasingly pivotal role in mediating disputes related to trade secrets and the right of access. Regulatory guidance and case law will likely shape how these competing interests are balanced in practice.