datenrecht.ch

ECJ C‑634/21: Credit score (of SCHUFA) as an automated decision

On December 7, 2023, the ECJ issued its decision in Case C‑634/21 on SCHUFA’s scoring. In the opinion of the ECJ, SCHUFA’s scoring is subject to Art. 22 GDPR on automated decisions insofar as the credit score significantly determines whether a financial institution establishes, implements or terminates a contractual relationship with the data subject. Unsurprisingly, the ECJ thus follows the Opinion of the Advocate General (we have reported on this).

Initial situation

SCHUFA, the most important credit agency in Germany, collects creditworthiness data on companies and private individuals and transmits it, for a fee, to financial institutions, among others, so that they can assess the creditworthiness of persons interested in a loan. To simplify this assessment, it uses mathematical-statistical procedures to automatically calculate probability values for creditworthiness (the credit score).
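The court does not describe SCHUFA’s model, but the kind of “mathematical-statistical procedure” at issue can be pictured as follows. This is a purely hypothetical sketch, assuming a simple logistic model; the feature names and weights are invented for illustration and have nothing to do with SCHUFA’s actual method:

```python
import math

def credit_score(features: dict[str, float],
                 weights: dict[str, float],
                 bias: float) -> float:
    """Toy logistic model: maps weighted personal attributes to a
    probability value for meeting future payment obligations.
    All inputs here are illustrative, not SCHUFA's real parameters."""
    z = bias + sum(weights[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))  # probability value in (0, 1)

score = credit_score(
    {"years_at_address": 4.0, "open_credit_lines": 2.0, "past_defaults": 1.0},
    {"years_at_address": 0.3, "open_credit_lines": -0.2, "past_defaults": -1.5},
    bias=0.5,
)
```

The point of the sketch is only that the output is a single probability value derived automatically from personal aspects, which is exactly what the referred question describes.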

A person interested in a loan, OQ, was denied a loan agreement based on her credit score, whereupon she submitted a request for deletion and information to SCHUFA. SCHUFA informed OQ of her score value and its calculation method in general terms, but refused to disclose which data it had used to calculate the credit score and how it had weighted that data. In particular, it argued that it does not make automated decisions within the meaning of Art. 22 GDPR, which is why there is no right to information about the logic involved (cf. Art. 15 para. 1 lit. h GDPR).

In a case brought by OQ, the Wiesbaden Administrative Court referred the following question, among others, to the ECJ for a preliminary ruling: Does the automated creation of a credit score, on which third parties base a decision, for example on the establishment of a contractual relationship, already constitute an automated decision?

Credit score as an automated decision

According to the ECJ, Art. 22 para. 1 GDPR has three cumulative requirements, which must first be interpreted according to their wording: what is required is (i) a decision which (ii) is based solely on automated processing and (iii) produces legal effects concerning the data subject or similarly significantly affects the data subject. The term “decision” is not defined in the GDPR. EC 71 GDPR, according to which a decision may also include a “measure”, confirms the broad meaning of the term. As examples, the recital mentions the automatic rejection of an online credit application and online recruitment procedures without any human intervention.

Since the term “decision” within the meaning of Art. 22 (1) GDPR can thus encompass […] several actions that can affect the data subject in many ways, this term is broad enough to cover the result of the calculation of a person’s ability to meet future payment obligations in the form of a probability value.

The second condition is fulfilled in the present case, since an activity such as that of SCHUFA corresponds to the definition of profiling in Art. 4 no. 4 GDPR. In addition, the question referred expressly refers to the automated creation of a probability value. The third condition is also met: the third party, such as a bank, is (according to the question referred) largely guided in its actions by the probability value calculated by SCHUFA. An inadequate credit score leads in almost all cases to the rejection of the loan applied for, so that the credit score at least significantly affects the person concerned. Against this background, the ECJ concludes that the determination of a credit score by a credit agency is to be classified as a decision if the credit score plays a decisive role in the granting of a loan.
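The “decisive role” scenario the ECJ has in mind can be made concrete: if the lender applies nothing but a mechanical threshold to the transmitted score, the probability value alone determines the outcome. A minimal sketch, assuming a hypothetical lender rule (the threshold and labels are invented, not taken from the judgment):

```python
def loan_decision(score: float, threshold: float = 0.6) -> str:
    """Hypothetical, fully automated lender rule: no human review and no
    further criteria. Under such a rule the transmitted probability value
    alone decides the application, which is the constellation in which
    the ECJ treats the score itself as the relevant 'decision'."""
    return "approve" if score >= threshold else "reject"
```

Conversely, the more the lender adds human review or independent criteria on top of the score, the weaker the argument that the score itself is the decision, which is precisely the dividing line discussed in the notes below.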

To further support this interpretation, the ECJ refers to the context in which Art. 22 (1) GDPR is embedded, as well as to the purpose and objectives of the GDPR. The purpose of Art. 22 is to protect individuals from the specific risks to their rights and freedoms associated with the automated processing of personal data – including profiling – and the associated assessment of personal aspects. These particular risks are likely to impair the interests and rights of the person concerned, in particular through possible discriminatory effects. The interpretation set out above and the broad meaning of the term “decision” reinforce the effective protection at which the provision is aimed.

If, on the other hand, a narrow interpretation were preferred, according to which credit scoring is regarded as a merely preparatory act, there would be a risk of circumvention and a gap in legal protection. In that case, the determination of a probability value would not be subject to the special requirements of Art. 22 para. 2 and 4 GDPR, although the procedure is based on automated processing and has a significant impact, as the actions of the credit institution are significantly guided by the probability value transmitted.

In addition, the data subject […] would not be able to exercise his or her right of access to the specific information referred to in Article 15(1)(h) GDPR vis-à-vis the credit reference agency that determines the probability value concerning him or her if there is no automated decision-making by that company. Secondly, the third party […] would not be able to provide this specific information because it generally does not have it.

Notes

The ECJ’s reasoning is thin and unconvincing in view of the decision’s scope for credit agencies and other parties that provide third parties with a basis for decision-making. The ECJ does not explain why, for example, a possibly broadly understood concept of a decision should mean that it can encompass several acts. Apparently, not only the credit score but also (at the same time) the subsequent refusal of credit is supposed to constitute a decision. However, after a decision worthy of the name, there is no longer room for a further decision on the same subject with the required scope.

Placing the decision with the credit agency is problematic in particular because it is not within the credit agency’s control whether the requirements of Art. 22 GDPR are met. The financial institution determines whether a human being reviews additional criteria, i.e. whether the processing is exclusively automated, and whether – depending on the link to the score value – the credit decision is positive or negative and therefore has a relevant impact. Placing the decision with the credit agency leads to the questionable result that, depending on the behavior of third parties, the agency is subject to obligations backed by sanctions, in particular to provide information and disclosure. In addition, the credit agency often does not know whether the financial institution relies significantly on the credit score and whether its credit scoring is therefore already subject to Art. 22 GDPR.

Moreover, the legal protection gap mentioned does not exist. If the financial institution makes an automated decision (based on the credit score), it bears the associated obligations, in particular the obligation to provide information about the logic involved. The financial institution can and must therefore obtain the information required to fulfill the right to information from the credit agency. If the financial institution does not provide information, it risks a fine. This should be incentive enough for financial institutions to have access to this information contractually guaranteed by the credit agency.

Less relevant to the question referred here, but nevertheless remarkable, are the following points:

  • The ECJ also considers exclusive automation, condition (ii), to be given because SCHUFA carries out profiling. This is surprising since, at least according to the prevailing doctrine, profiling does not have to be exclusively automated. Why any profiling should fulfill this requirement, i.e. why the reference to profiling in Art. 22 para. 1 GDPR should be more than just an example of automated processing, would have required at least a separate justification.
  • Furthermore, the ECJ derives from EC 71 GDPR that exclusively automated processing – including profiling – requires the assessment of personal aspects relating to the data subject. The ECJ thus appears to adopt the position, controversial in the literature, that Art. 22 para. 1 GDPR presupposes the assessment of personal aspects. This position should be rejected, partly because it lacks a basis in the wording. Of course, most automated decisions are likely to involve the assessment of personal aspects anyway.

Legal nature of Art. 22 (1) GDPR

Without necessity and without any real justification, the CJEU also addresses the controversial question of whether Art. 22 para. 1 GDPR provides for a general prohibition or a right of objection:

In this respect, it should be noted that […] Art. 22 para. 1 GDPR gives the data subject the “right” not to be subject to a decision based solely on automated processing, including profiling. This provision establishes a fundamental prohibition, the violation of which need not be asserted individually by such a person. As follows from Article 22(2) GDPR in conjunction with recital 71 of that regulation, the adoption of a decision based solely on automated processing is only permissible in the cases referred to in Article 22(2) […].

Although this view corresponds to the prevailing doctrine, it has recently been questioned in the literature on various occasions, and rightly so. It is true that EC 71 GDPR states that an automated decision “should be allowed” if one of the cases in Art. 22 (2) GDPR applies, and this wording only makes sense if such decisions are generally prohibited. However, this understanding of EC 71 GDPR has no basis in the wording of Art. 22 para. 1 GDPR; a legally non-binding recital cannot override the enacting part of the GDPR. In addition to the wording, the system also speaks against a prohibition: the information obligations that refer to Art. 22 para. 1 GDPR would not make sense in the case of a general prohibition, as they would oblige the controller to provide information about a prohibited activity. Moreover, the legislator’s link to the GDPR’s predecessor legislation, which provided for a right of data subjects, also speaks in favor of continuity of this legal nature. As usual, the ECJ ignores these and other arguments in favor of a right to object.

Concluding remarks

The ECJ’s decision should have significance beyond credit scoring, in particular for the use of automated preparation and recommendation systems (including those based on artificial intelligence). As the Hamburg Data Protection Officer states, according to the standards of the ECJ, computer-generated suggestions, such as a pre-sorting of applications, can already be classified as a decision if they play a significant role in the decision-making process.

The decision is not insignificant for Switzerland either. With the FADP, which came into force on September 1, 2023, Swiss law now also subjects automated decisions by private controllers to special rules for the first time. The ECJ’s ruling is not automatically decisive for the interpretation of the FADP, in particular Art. 21 FADP, the obligation to provide information in the case of automated decisions. However, the FADP and the GDPR formulate the term “automated decision” only slightly differently, and the legislator (as the materials show) oriented itself conceptually on the GDPR. The case law therefore cannot be ignored for Swiss law either.

However, for the reasons mentioned, the poorly reasoned decision is not convincing for Swiss law either, and it is therefore to be hoped that Swiss courts and authorities will deviate from the ECJ’s ruling. After all, the Federal Council, in its report on the legal framework for the practices of credit reference agencies, took a different and, in this respect, accurate position on this issue:

The calculation of creditworthiness by credit agencies does not constitute an automated individual decision within the meaning of the nFADP, but rather a decision-making aid, provided that the actual decision (e.g. refusing a purchase on account) is made by the customers of the credit agencies.

Finally, it should be noted that on the same day the ECJ also issued its decision in Cases C‑26/22 and C‑64/22, which likewise concern SCHUFA. These proceedings relate in particular to the admissibility of the retention of data from public registers by a credit agency.
