On December 7, 2023, the ECJ issued its ruling in Case C‑634/21 on SCHUFA’s scoring. In the opinion of the ECJ, SCHUFA’s scoring is subject to Art. 22 GDPR on automated decisions, insofar as the credit score significantly determines whether the financial institution establishes, implements or terminates a contractual relationship with the data subject. Unsurprisingly, the ECJ thus follows the Opinion of the Advocate General (on which we have reported).
Initial situation
SCHUFA, the most important credit agency in Germany, collects creditworthiness data on companies and private individuals and transmits it, for a fee, to financial institutions, among others, so that they can assess the creditworthiness of persons interested in a loan. To simplify this assessment, it uses mathematical-statistical procedures to automatically calculate probability values for creditworthiness (the credit score).
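To make the mechanism concrete: a probability value of this kind is typically produced by a statistical model such as logistic regression, which maps weighted input features to a probability and rescales it to a score. The following minimal Python sketch is purely illustrative; the feature names, weights, and score scale are hypothetical assumptions and do not reflect SCHUFA’s actual (undisclosed) method.

```python
import math

# Hypothetical feature weights for illustration only -- SCHUFA's real
# model inputs and weighting are exactly what was not disclosed to OQ.
WEIGHTS = {
    "years_of_credit_history": 0.4,
    "on_time_payment_ratio": 2.5,
    "open_credit_lines": -0.3,
}
BIAS = -1.0

def probability_value(features: dict) -> float:
    """Map weighted features to a probability in (0, 1) via the logistic function."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

def credit_score(features: dict) -> int:
    """Rescale the probability to a 0-1000 score band (a common convention)."""
    return round(probability_value(features) * 1000)

applicant = {
    "years_of_credit_history": 5.0,
    "on_time_payment_ratio": 0.95,
    "open_credit_lines": 2.0,
}
score = credit_score(applicant)
```

The legal point does not depend on the model family: whatever the statistics, the output is a single probability value that third parties then act on.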
A person interested in a loan, “OQ”, was denied a loan agreement based on her credit score, whereupon she submitted a request for erasure and access to SCHUFA. SCHUFA informed OQ of her score value and described its calculation method in general terms, but refused to disclose which data it had used to calculate the credit score and how it had weighted them. In particular, it argued that it does not make automated decisions within the meaning of Art. 22 GDPR, which is why there is no right to information about the logic involved (cf. Art. 15 para. 1 lit. h GDPR).
In the proceedings brought by OQ, the Wiesbaden Administrative Court referred the following question, among others, to the ECJ for a preliminary ruling: Does the automated creation of a credit score, on which third parties base a decision, for example on the establishment of a contractual relationship, already constitute an automated decision?
Credit score as an automated decision
According to the ECJ, Art. 22 para. 1 GDPR has three cumulative requirements, which must first be interpreted according to the wording: what is required is (i) a decision which (ii) is based solely on automated processing and (iii) produces legal effects concerning the data subject or similarly significantly affects the data subject. The term “decision” is not defined in the GDPR. However, recital 71 GDPR, according to which a decision “may include a measure”, confirms the “broad meaning of the term”. As examples, the recital mentions the automatic rejection of an online credit application and online recruitment procedures without any human intervention.
Since the term “decision” within the meaning of Art. 22 (1) GDPR can thus encompass […] several actions that can affect the data subject in many ways, this term is broad enough to cover the result of the calculation of a person’s ability to meet future payment obligations in the form of a probability value.
The second condition is fulfilled in the present case, since an activity such as that of SCHUFA corresponds to the definition of profiling in Art. 4 no. 4 GDPR. In addition, the question referred expressly refers to the automated creation of a probability value. The third condition is also met: the third party, such as a bank, is largely guided in its actions (according to the question referred) by the probability value calculated by SCHUFA. An inadequate credit score leads in almost all cases to the rejection of the loan applied for, so that the credit score at least significantly affects the person concerned. Against this background, the ECJ concludes that the determination of a credit score by a credit agency is to be classified as a decision if the credit score plays a decisive role in the granting of a loan.
In order to further support this interpretation, the ECJ refers to the context in which Art. 22 (1) GDPR is embedded, as well as to the purpose and objectives of the GDPR. The purpose of Art. 22 is to protect individuals from the specific risks to their rights and freedoms associated with the automated processing of personal data – including profiling – and the associated assessment of personal aspects. These particular risks are likely to impair the legitimate interests and rights of the data subject, in particular through possible discriminatory effects. The interpretation set out above and the broad meaning of the term “decision” reinforce the effective protection at which the provision is aimed.
If, on the other hand, a “narrow interpretation” were preferred, according to which credit scoring would be regarded as a merely preparatory act, there would be a risk of circumvention and a gap in legal protection. In this case, the determination of a probability value would not be subject to the special requirements of Art. 22 para. 2 and 4 GDPR, although the procedure is based on automated processing and has a significant impact, as the actions of the credit institution are significantly guided by the probability value transmitted.
In addition, the data subject […] would not be able to exercise his or her right of access to the specific information referred to in Article 15(1)(h) GDPR with the credit reference agency that determines the probability value concerning him or her if there is no automated decision-making by that company. Secondly, the third party […] would not be able to provide this specific information because it does not generally have it.
Notes
The ECJ’s reasoning is thin and unconvincing in view of the scope of the decision for credit agencies and other parties that provide third parties with a basis for decision-making. The ECJ does not explain why, for example, a broadly understood concept of a decision should mean that it can encompass several acts. Apparently, not only the credit score but also (at the same time) the subsequent refusal of credit is supposed to constitute a decision. However, after a decision worthy of the name, there is no longer room for a further decision on the same subject with the required scope.
Locating the decision with the credit agency is particularly problematic because it is not within the control of the credit agency whether the requirements of Art. 22 GDPR are met. The financial institution determines whether a human reviews additional criteria, i.e. whether the processing is exclusively automated, and whether – depending on the link to the score value – the credit decision is positive or negative and therefore has a relevant impact. Placing the decision with the credit agency leads to the questionable result that, depending on the behavior of third parties, the agency is subject to obligations backed by sanctions, in particular information and disclosure obligations. In addition, the credit agency often does not know whether the financial institution relies significantly on the credit score and whether the credit scoring is therefore already subject to Art. 22 GDPR.
Moreover, the alleged gap in legal protection does not exist. If the financial institution makes an automated decision (based on the credit score), it bears the associated obligations, in particular the obligation to provide information about the logic involved. The financial institution can and must therefore obtain the information required to fulfill the right of access from the credit agency. If the financial institution does not provide the information, it risks a fine. This should be incentive enough for financial institutions to have access to this information contractually guaranteed by the credit agency.
Less relevant to the question referred, but nevertheless noteworthy, are the following points:
- The ECJ also considers exclusive automation, condition (ii), to be met because SCHUFA carries out profiling. This is surprising since, at least according to the prevailing doctrine, profiling does not have to be exclusively automated. Why any profiling should fulfill this requirement, i.e. why the reference to profiling in Art. 22 para. 1 GDPR should be more than just an example of automated processing, would have required at least a separate justification.
- Furthermore, the ECJ derives from recital 71 GDPR that exclusively automated processing – including profiling – requires the assessment of personal aspects relating to the data subject. The ECJ thus appears to adopt the position, controversial in the literature, that Art. 22 para. 1 GDPR presupposes the assessment of personal aspects. This position should be rejected, not least because it lacks a basis in the wording. Admittedly, most automated decisions are likely to involve the assessment of personal aspects anyway.
Legal nature of Art. 22 (1) GDPR
Without necessity and without any real justification, the ECJ also addresses the controversial question of whether Art. 22 para. 1 GDPR establishes a general prohibition or provides for a right to object:
In this respect, it should be noted that […] Art. 22 para. 1 GDPR gives the data subject the “right” not to be subject to a decision based solely on automated processing, including profiling. This provision establishes a fundamental prohibition, the violation of which need not be asserted individually by such a person. As follows from Article 22(2) GDPR in conjunction with recital 71 of that regulation, the adoption of a decision based solely on automated processing is only permissible in the cases referred to in Article 22(2) […].
Although this view corresponds to the prevailing doctrine, it has recently been questioned in the literature on various occasions, and rightly so. Admittedly, recital 71 GDPR states that an automated decision “however [should be allowed]” if one of the cases in Art. 22 (2) GDPR applies; this wording only makes sense if such decisions are generally prohibited. However, this understanding of recital 71 GDPR has no basis in the wording of Art. 22 para. 1 GDPR, and a legally non-binding recital cannot override the enacting part of the GDPR. Beyond the wording, the system also speaks against a prohibition: the information obligations that refer to Art. 22 para. 1 GDPR would not make sense in the case of a general prohibition, as they would oblige the controller to provide information about a prohibited activity. In addition, the legislator’s link to the GDPR’s predecessor, the Data Protection Directive, which provided for a right of data subjects, speaks in favor of continuity of this legal nature. As usual, the ECJ ignores these and other arguments in favor of a right to object.
Concluding remarks
The ECJ’s decision is likely to be significant beyond credit scoring, in particular for the use of automated preparation and recommendation systems (including those based on artificial intelligence). As the Hamburg Data Protection Commissioner states, according to the standards of the ECJ, computer-generated suggestions, such as a pre-sorting of applications, can already be classified as a decision if they play a significant role in the decision-making process.
The decision is also not insignificant for Switzerland. With the FADP, which came into force on September 1, 2023, Swiss law now subjects automated decisions by private controllers to special rules for the first time. The ECJ’s ruling is not automatically decisive for the interpretation of the FADP, in particular Art. 21 FADP, the obligation to provide information in the case of automated individual decisions. However, the FADP and the GDPR formulate the term “automated decision” only slightly differently, and the legislator (as the materials show) oriented itself conceptually on the GDPR. The case law therefore cannot be ignored for Swiss law either.
However, for the reasons mentioned, the poor reasoning is not convincing for Swiss law either, and it is therefore to be hoped that Swiss court and authority practice will deviate from the ECJ ruling. After all, the Federal Council, in its report on the legal framework for the practices of credit agencies, took a different and, in this respect, accurate position on this issue:
The calculation of creditworthiness by the credit agencies does not constitute an automated individual decision within the meaning of the nDSG, but rather a decision-making aid, provided that the actual decision (e.g. refusing a purchase on account) is made by the customers of the credit agencies.
Finally, it should be noted that on the same day the ECJ also issued its decision in Cases C‑26/22 and C‑64/22, likewise concerning SCHUFA. Those proceedings relate in particular to the admissibility of a credit agency’s retention of data from public registers.