The CJEU confirmed that national courts have extensive powers to review the decisions of data protection authorities, thereby strengthening the rights of data subjects. In a nutshell, the CJEU ruled that the automated calculation of credit scores falls within Article 22 of the GDPR and is, in principle, prohibited where the score plays a determining role in the credit decision.
In Case C-634/21, the CJEU examined SCHUFA's practices with regard to compliance with the General Data Protection Regulation ("GDPR"), focusing specifically on automated decision-making. The case arose from a dispute in which an individual ("OQ") was denied credit on the basis of a score determined by SCHUFA. OQ subsequently requested SCHUFA to provide information on the personal data registered about her in order to correct any inaccuracies. In response, SCHUFA disclosed OQ's score and gave a general overview of its scoring methods. However, citing trade secrecy, SCHUFA refused to reveal the specific elements taken into account in calculating the score and their respective weighting.
While SCHUFA argued that it merely provided information to its contractual partners, who made the actual credit decisions, the key question was whether the agency's automated creation of credit scores constituted an 'automated decision' under Article 22 of the GDPR.
The CJEU held that when a credit information agency's probability value significantly influences a bank's credit decision, the scoring itself qualifies as a decision producing legal effects under Article 22 of the GDPR.
According to the CJEU's press release (No 186/23) of 7 December 2023 relating to Case C-634/21 and the joined cases C-26/22 and C-64/22:
“The Court considers that it is contrary to the GDPR for private agencies to keep such data for longer than the public insolvency register. The discharge from remaining debts is intended to allow the data subject to re-enter economic life and is therefore of existential importance to that person. That information is still used as a negative factor when assessing the solvency of the data subject. In this case, the German legislature has provided for data to be stored for six months. It therefore considers that, at the end of the six months, the rights and interests of the data subject take precedence over those of the public to have access to that information.”
The CJEU thus ruled that private agencies act contrary to the GDPR where they retain data relating to the discharge from remaining debts for longer than the public insolvency register, i.e. beyond the six-month period provided for under German law. The CJEU emphasized that the purpose of such a discharge is to enable the data subject to re-enter economic life. Once the six-month period elapses, the rights and interests of the data subject take precedence over the public's interest in having access to that information. The extended retention of such data is therefore unlawful, and the data subject has the right to request its deletion, which the private agency is obliged to carry out promptly.
One could argue that this landmark decision has implications for businesses across the EU, underlining the importance of aligning credit-decision processes with GDPR requirements in order to protect individuals' right to privacy. The ruling serves as a reminder for companies to evaluate and adjust their practices when using third-party credit scoring services, so as to ensure compliance with EU privacy regulations.
The CJEU's interpretation in this regard underscores the crucial balance between leveraging AI innovatively and safeguarding individuals' rights within the GDPR framework. The ruling acts both as a guide for compliance and as a driving force for ethical AI practices, and it marks the onset of an era in which technology and privacy must coexist harmoniously. The CJEU's judgment itself stands as evidence of a continuously evolving legal landscape.
For more information or assistance on AI and GDPR please contact Dr Ian Gauci and Dr Terence Cassar.