On March 16, 2023, Advocate General (GA) Priit Pikamäe of the ECJ delivered his Opinion in Case C‑634/21. The proceedings concern the scope of application of Art. 22 GDPR on automated decisions and the regulatory leeway of EU member states with respect to scoring. In addition, the GA expressed his views on two other much-discussed issues regarding automated decisions.
Initial situation
SCHUFA, the most important credit agency in Germany, collects creditworthiness-related data on companies and private individuals and transmits it in return for a fee to financial institutions, among others, for the purpose of assessing the creditworthiness of persons interested in a loan. To simplify this assessment, it automatically calculates probability values for creditworthiness (credit scores) on the basis of mathematical and statistical methods.
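To make the scoring mechanics tangible, the following minimal sketch shows how such a probability value could be computed with a logistic model. The factors, weights, and formula are invented for illustration; SCHUFA's actual method is not public.

```python
import math

# Invented factor weights of a hypothetical logistic scoring model;
# SCHUFA's real factors and weights are trade secrets.
WEIGHTS = {
    "years_at_address": 0.4,
    "open_credit_lines": -0.3,
    "past_defaults": -1.5,
    "income_to_debt_ratio": 0.8,
}
BIAS = 0.2

def credit_score(applicant: dict) -> float:
    """Map an applicant's attributes to a repayment probability in [0, 1]."""
    z = BIAS + sum(WEIGHTS[k] * applicant[k] for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))  # logistic function

print(round(credit_score({
    "years_at_address": 3,
    "open_credit_lines": 2,
    "past_defaults": 0,
    "income_to_debt_ratio": 1.2,
}), 3))  # a single value transmitted to financial institutions for a fee
```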
A person interested in credit, “OQ”, was denied a credit agreement on the basis of the credit score, whereupon she filed a request for erasure and information with SCHUFA. SCHUFA informed OQ of her score and, in general terms, of its calculation method, but refused to disclose which data it had used to calculate the credit score and how it had weighted that data. In doing so, it invoked trade secrets and argued that it did not make any automated decisions within the meaning of Art. 22 GDPR, but merely provided financial institutions with information for their decision-making. Therefore, there was no claim against it for information about the logic involved (cf. Art. 15(1)(h) GDPR).
In response to an action brought by OQ, the Wiesbaden Administrative Court referred two questions to the ECJ for a preliminary ruling:
- Does the automated creation of a credit score, which third parties use as the basis for a decision, for example, on the establishment of a contractual relationship, already constitute an automated decision?
- May the legislator of an EU member state impose requirements on scoring that go beyond the GDPR (in this case with Section 31 of the German BDSG, which regulates the use of a probability score)?
Credit score as an automated decision
To clarify the first question, the GA divided Art. 22(1) GDPR into three requirements (in a somewhat odd way):
- First, there must be automated processing of personal data, of which profiling is a subcategory. Unsurprisingly (and rightly), the GA qualified SCHUFA’s scoring as profiling.
- Next, a legal consequence or significant impairment of the data subject is required, whereby Art. 22 GDPR covers “only serious effects”. In OQ’s situation, as the GA concludes somewhat hastily, there is a significant impact, especially since Recital 71 GDPR mentions the automatic rejection of an online credit application as a typical example of an automated decision.
- The third and central condition in the present case requires, first, a decision and, second, that this decision is based solely on automated processing. Exclusive automation is present in the case of SCHUFA’s credit scoring; the question, however, is where the decision is to be located.
According to the GA, a decision implies an opinion on a specific matter and – unlike a recommendation – must be binding. The term is to be understood broadly because there is no legal definition. A qualification as a decision would then require a case-by-case examination of, among other things, the severity of the effects on the person concerned.
Which action is the relevant decision in constellations such as the present one – granting or refusing a loan by the financial institution or scoring by SCHUFA – depends on the individual case. The decisive factor is whether the decision of the financial institution is in fact predetermined by the scoring, i.e. whether it attaches the greatest importance to the credit score in its decision-making.
This depends on the internal rules and practices of the financial institution in question, which as a rule are likely to leave it no leeway in applying the score to a credit application.
This, however, is a question of fact better assessed by the national court. In view of the facts presented by the referring court, according to which the financial institution does not have to base its decision solely on the credit score but usually does so to a decisive extent, the GA considered the credit score itself to be the “decision”.
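The distinction can be made concrete with a small, purely hypothetical sketch of two internal procedures; the threshold and the override mechanism are invented for illustration:

```python
# Invented internal cutoff of a financial institution.
SCORE_THRESHOLD = 0.6

def decide_without_leeway(score: float) -> str:
    # The internal rules tie the outcome mechanically to the score; on
    # the GA's reading, the relevant "decision" then lies in the scoring.
    return "grant" if score >= SCORE_THRESHOLD else "refuse"

def decide_with_human_review(score: float, override: str | None) -> str:
    # A credit officer examines additional criteria and may override the
    # score, so the decision is no longer based solely on automated
    # processing within the meaning of Art. 22(1) GDPR.
    if override is not None:
        return override
    return decide_without_leeway(score)

print(decide_without_leeway(0.55))             # refuse: predetermined by score
print(decide_with_human_review(0.55, "grant")) # grant: human judgment decisive
```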
Any other interpretation would lead to a gap in legal protection:
The credit agency, from which the information required by the data subject could be obtained, is not obliged to provide information under Art. 15(1)(h) GDPR because it ostensibly does not engage in its own “automated decision-making” within the meaning of that provision; and the financial institution, which bases its decision-making on the automatically generated score and is obliged to provide information under Art. 15(1)(h) GDPR, cannot provide the required information because it does not have it.
Consequently, the financial institution could neither review the credit scoring if the decision is challenged (cf. Art. 22(3) GDPR) nor ensure fair, transparent, and non-discriminatory processing through appropriate mathematical or statistical procedures (cf. Recital 71 GDPR). Likewise, only the credit agency could comply with the other data subject rights, e.g. the rights to rectification and erasure.
Notes
The GA’s explanations of the term “decision” are already unconvincing. First, inferring a broad understanding of the term solely from the absence of a legal definition is flawed. Next, the GA conflates the requirement of a decision with the significant impairment he has already affirmed when he demands a case-by-case examination of the severity of the impact in order to establish a decision. Finally, it remains unclear to what extent the credit score has the binding force previously demanded, rather than being a mere recommendation.
Locating the “decision” at the credit agency also raises questions. As the reference to the internal rules and practices of the financial institution shows, it is not within the credit agency’s power whether a decision subject to Art. 22 GDPR exists. The financial institution determines whether a human being checks additional criteria, i.e. whether the exclusive automation is broken, and whether – depending on how it links its decision to the score – the credit decision is positive or negative, and thus whether there is a relevant impact. Placing the decision with the credit agency leads to the problematic result that obligations backed by sanctions, in particular duties of information and disclosure, are imposed on the credit agency depending on the behavior of third parties. To make matters worse, the credit agency will often not even be aware of the financial institutions’ internal requirements.
Furthermore, there is no gap in legal protection. If the financial institution makes an automated decision (based on the credit score), it bears the associated obligations, in particular the obligation to provide information about the logic involved. Art. 15 GDPR does not provide for an exception on grounds of impossibility. The financial institution can and must therefore obtain the information required to fulfill the right to information from the credit agency. If it fails to provide this information, it risks a significant fine. That alone should be incentive enough for financial institutions to have the credit agency contractually guarantee access to this information. There is even less of a gap in legal protection with regard to the rights to rectification and erasure addressed by the GA, which the data subject can assert against the credit agency anyway.
On the contrary, it is the case law proposed by the GA that opens a legal protection gap. If the data subject has the rights to review and challenge (cf. Art. 22(3) GDPR) vis-à-vis the credit agency and not vis-à-vis the financial institution, he or she can at most obtain a change in the credit score, but loses an opportunity to influence the credit decision, which is probably more relevant for him or her.
Nevertheless, the GA’s implicitly expressed view that Art. 22(1) GDPR does not already cover profiling as such (where it causes a legal consequence or a significant impairment), as is sometimes argued in the doctrine, deserves endorsement. Otherwise, he could have spared himself the trouble of locating the decision once he had qualified credit scoring as profiling and affirmed a significant impairment.
Conformity of Section 31 BDSG with European Law
In the context of the second question referred, the GA examined relatively comprehensively whether an opening clause exists, i.e. a legal basis for the enactment of a national provision such as Section 31 BDSG.
Article 22(2)(b) of the GDPR does allow an exception to the restrictions on automated decisions “based on legal provisions of the Union or the Member States”. However, the provision cannot serve as a legal basis, since Section 31 BDSG also covers “non-automated decisions” without differentiation and regulates the “use”, not the “creation” of a probability value.
An opening clause could result from Art. 6(2) or (3) of the GDPR, upon consideration of which the GA concludes,
that Member States may adopt more specific provisions if the processing is “necessary for compliance with a legal obligation to which the controller is subject” or “necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller”. These conditions have the effect of narrowly limiting the regulatory power of the Member States and thus preclude arbitrary recourse to the opening clauses provided for in the GDPR, which could frustrate the objective of harmonizing the law in the area of personal data protection.
There is no obligation under national law to establish a score value. It is true that a credit agency also acts in the public interest, for example by helping to protect consumers from overindebtedness, to ensure the stability of the financial system and to improve access to credit. However, this is not (as would be required) a matter of public welfare, such as public health or social security, i.e. classic tasks of the state. These opening clauses are therefore not relevant either.
Article 6(1)(f) of the GDPR, the legal basis for data processing based on a legitimate interest, does not contain an opening clause and does not allow the Member States to specify the legitimate interest, as is the case under Section 31 of the BDSG.
In the opinion of the GA, it must therefore be assumed that national provisions such as Section 31 BDSG are not compatible with the GDPR. This is a justifiable position, which is also found in German doctrine. Despite the possible inapplicability of Section 31 BDSG, it can be assumed that its underlying value judgments will continue to play an important role in German practice in the balancing of interests pursuant to Art. 6(1)(f) GDPR.
Obiter dicta
Legal nature of Art. 22 (1) GDPR
In addition, the GA comments in passing, and without necessity, on the question disputed in the doctrine of whether Art. 22(1) GDPR establishes a general prohibition or a right of objection:
Notwithstanding the terminology used, the application of Article 22(1) GDPR does not require that the data subject actively invokes the right. Indeed, an interpretation in light of the 71st recital of this Regulation and taking into account the scheme of this provision, in particular its paragraph 2, which lists the cases in which automated processing is exceptionally allowed, leads to the conclusion that this provision establishes a general prohibition of the decisions of the type described above.
Although this view corresponds to the (still) predominant doctrine, it has recently been rightly questioned in the literature. The reason is not only the wording, which also in other language versions speaks for a right of data subjects rather than a prohibition. The information obligations that refer to Art. 22(1) GDPR would make no sense in the case of a general prohibition: they would oblige the controller to provide information about a prohibited activity. In addition, the legislator’s reliance on the GDPR’s predecessor instrument, which provided for a right of data subjects, also argues for continuity of this legal nature. Finally, further arguments in favor of a right of objection can be derived from the purpose of Art. 22(1) GDPR, which the GA omitted here.
Scope of the right to information concerning automated decisions
Also without necessity, the GA commented on the scope of the right to information in Art. 15(1)(h) GDPR, in particular on the importance of meaningful information about the logic involved:
In my opinion, this provision should be interpreted as also covering, in principle, the calculation method used by a credit agency to determine a score value, provided that there are no conflicting interests worthy of protection.
In this respect, reference should be made to the 63rd recital of the GDPR, which states, inter alia, that “[the right of access] [should] not prejudice the rights and freedoms of other persons, such as trade secrets or intellectual property rights and in particular copyright in software”.
The GA thus opposes the European Data Protection Board, which in its guidelines on the right to information wants to allow the reservation in favor of trade secrets and intellectual property rights to apply only with regard to the right to a copy (cf. Art. 15(4) GDPR). However, since this reservation makes sense precisely in the case of information about the logic involved, the GA is to be agreed with on this point.
The reservation requires a balancing, whereby a minimum of information must be provided to the data subject. The protection of business secrecy or intellectual property is in principle a justified reason for a credit agency to refuse to disclose the algorithm used to calculate the score value of the data subject.
The credit agency therefore owes:
sufficiently detailed explanations on the method for calculating the score value and on the reasons […] that led to a certain result. In general, the controller should provide the data subject with general information, in particular on factors taken into account in the decision-making process and their weighting at an aggregated level, which is also useful for challenging “decisions” within the meaning of Article 22(1) GDPR on the part of the data subject.
What this means remains unclear. If the credit agency merely owes general information on the factors taken into account, it probably does not have to name all of them. The “weighting at an aggregated level” also leaves room for interpretation: the credit agency will have to make statements about the weighting, but it need not quantify it precisely. Since the GA wants to protect the algorithm or the score formula, it should be sufficient to indicate which factors or factor categories are given more or less weight.
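What such a disclosure “at an aggregated level” could look like is sketched below, again with invented factors and weights: the individual weights and the formula remain undisclosed, while each factor category’s relative importance is reported.

```python
# Invented weights of a hypothetical scoring model (trade secret in reality).
WEIGHTS = {
    "years_at_address": 0.4,
    "open_credit_lines": -0.3,
    "past_defaults": -1.5,
    "income_to_debt_ratio": 0.8,
}

# Hypothetical grouping of individual factors into broader categories.
FACTOR_CATEGORIES = {
    "payment history": ["past_defaults"],
    "credit utilization": ["open_credit_lines", "income_to_debt_ratio"],
    "stability": ["years_at_address"],
}

def aggregated_weights(weights: dict) -> dict:
    """Disclose each category's share of the total absolute weight instead
    of the individual weights or the scoring formula itself."""
    total = sum(abs(w) for w in weights.values())
    return {
        category: round(sum(abs(weights[f]) for f in factors) / total, 2)
        for category, factors in FACTOR_CATEGORIES.items()
    }

print(aggregated_weights(WEIGHTS))
# {'payment history': 0.5, 'credit utilization': 0.37, 'stability': 0.13}
```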
Concluding remarks
Opinions of a GA are not binding. It therefore remains to be hoped that the ECJ will consider the aspects disregarded by the GA, in particular in the first question referred, and decide differently. As a rule, however, the ECJ follows the GA’s opinion, and it would not be surprising if it did so in the present case: a corresponding decision would fit seamlessly into the ECJ’s previous, very data-protection-friendly case law.
The outcome of the proceedings will also be significant for Swiss law, because the nDSG, which enters into force on September 1, 2023, subjects automated decisions by private controllers to special rules for the first time. The GA’s considerations and the pending decision of the ECJ cannot simply be transferred wholesale to Art. 21 nDSG. However, since the nDSG and the GDPR define the term “automated decision” in only slightly different terms and the legislator (as the materials show) based its terminology on the GDPR, the European case law should also influence the understanding in Swiss law.
Finally, it should be noted that on the same day the GA also delivered his Opinions in Cases C‑26/22 and C‑64/22, which likewise concern SCHUFA. The questions to be answered in those preliminary ruling proceedings relate in particular to the permissibility of the retention by a credit agency of data from public registers.