datenrecht.ch

ECJ C‑634/21 – Opinion of the GA: credit score (of SCHUFA) as an automated decision

On March 16, 2023, the Advocate General (GA) of the ECJ, Priit Pikamäe, delivered his Opinion in Case C‑634/21. The proceedings concern the scope of application of Art. 22 GDPR on automated decisions and the regulatory leeway of EU member states regarding scoring. However, the GA also expressed his views on two other much-discussed issues regarding automated decisions.

Initial situation

SCHUFA, the most important credit agency in Germany, collects creditworthiness-related data on companies and private individuals and transmits it, in return for a fee, to financial institutions, among others, for the purpose of assessing the creditworthiness of prospective borrowers. To simplify this assessment, it automatically calculates probability values for creditworthiness (credit scores) on the basis of mathematical and statistical methods.
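Such a probability value can be pictured as the output of a weighted statistical model. The following sketch is purely illustrative — SCHUFA's actual factors, weights, and formula are undisclosed, so every name and number here is an assumption:

```python
import math

# Hypothetical illustration of a probability-based credit score. The
# factor names and weights are invented for this sketch and do NOT
# reflect SCHUFA's actual (undisclosed) model.
WEIGHTS = {"payment_history": 2.0, "open_credit_lines": -0.4, "address_changes": -0.3}
BIAS = 1.5

def credit_score(features: dict) -> float:
    """Map weighted features to a repayment probability via a logistic function."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

score = credit_score({"payment_history": 1.0, "open_credit_lines": 3.0, "address_changes": 1.0})
print(round(score, 3))  # → 0.881
```

The point of the sketch is only that the score is a probability derived mechanically from weighted input factors — exactly the kind of "mathematical and statistical method" whose disclosure is at issue in the proceedings.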

A person interested in credit, “OQ”, was denied a credit agreement on the basis of her credit score, whereupon she filed a request for erasure and access with SCHUFA. SCHUFA informed OQ of her score and, in general terms, of its calculation method, but refused to disclose which data it had used to calculate the credit score and how it had weighted them. It invoked trade secrets and argued that it did not make any automated decisions within the meaning of Art. 22 GDPR, but merely provided financial institutions with information for their decision-making. There was therefore, in its view, no claim against it for information about the logic involved (cf. Art. 15(1)(h) GDPR).

In response to an action brought by OQ, the Wiesbaden Administrative Court referred two questions to the ECJ for a preliminary ruling:

  • Does the automated creation of a credit score, which third parties use as the basis for a decision, for example on the establishment of a contractual relationship, already constitute an automated decision?
  • May the legislator of an EU member state impose requirements on scoring that go beyond the GDPR (in this case with Section 31 of the German BDSG, which regulates the use of a probability score)?

Credit score as an automated decision

To clarify the first question, the GA divided Art. 22(1) GDPR into three requirements (in a somewhat odd way):

  • First, there must be automated processing of personal data, of which profiling is a subcategory. Unsurprisingly (and rightly), the GA qualified SCHUFA’s scoring as profiling.
  • Next, a legal consequence or significant impairment of the data subject is required, whereby Art. 22 GDPR covers “only serious effects”. In OQ’s situation, as the GA concludes somewhat hastily, there is a significant impact, especially since Recital 71 GDPR mentions the automatic rejection of an online credit application as a typical example of an automated decision.
  • The third and central condition in the present case requires, first, a decision and, second, that this decision is based solely on automated processing. The exclusive automation is given in the case of SCHUFA’s credit scoring, but the question arises as to where the decision is to be located.

According to the GA, a decision implies an opinion on a specific matter and – unlike a recommendation – must be binding. The term is to be understood broadly because there is no legal definition. A qualification as a decision would then require a case-by-case examination of, among other things, the severity of the effects on the person concerned.

Which action is the relevant decision in constellations such as the present one – the granting or refusal of a loan by the financial institution, or the scoring by SCHUFA – depends on the individual case. The decisive factor is whether the financial institution’s decision is in fact predetermined by the scoring, i.e. whether it attaches the greatest importance to the credit score in its decision-making.

This depends on the internal rules and practices of the financial institution in question, which as a rule leave it no leeway in applying the score to a credit application.
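An internal rule that leaves no leeway might be a fixed score cutoff, as in this minimal sketch (the threshold value and function are invented for illustration and do not reflect any actual institution's practice):

```python
# Hypothetical internal rule that leaves no leeway: the loan decision
# follows mechanically from the credit score, so the score effectively
# predetermines the outcome. The cutoff value is invented.
CUTOFF = 0.6

def loan_decision(credit_score: float) -> str:
    """Purely score-driven rule: no human reviews additional criteria."""
    return "approved" if credit_score >= CUTOFF else "rejected"

print(loan_decision(0.88))  # → approved
print(loan_decision(0.42))  # → rejected
```

If, by contrast, a human reviewer may override the score on the basis of further criteria, the processing is no longer exclusively automated within the meaning of Art. 22(1) GDPR.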

This was a question of fact better assessed by the national court. However, in view of the facts presented by the referring court, according to which the financial institution does not have to make its decision dependent solely on the credit score, but usually does so to a decisive extent, the GA considered the credit score to be a “decision”.

Any other interpretation would lead to a gap in legal protection:

The credit agency from which the information required by the data subject could be obtained is not obliged to provide information under Art. 15(1)(h) GDPR because it ostensibly does not engage in its own “automated decision-making” within the meaning of that provision, while the financial institution that bases its decision-making on the automatically generated score value and is obliged to provide information under Art. 15(1)(h) GDPR cannot provide the required information because it does not have it.

Consequently, the financial institution cannot review the credit scoring in the event of a challenge to the decision (cf. Art. 22(3) GDPR) or ensure fair, transparent and non-discriminatory processing through appropriate mathematical or statistical procedures (cf. Recital 71 GDPR). Also, only the credit agency could comply with the other data subject rights, e.g. the right to rectification or deletion.

Notes

The explanations on the term “decision” are already unconvincing. First, the GA’s inference from the mere absence of a legal definition to a broad understanding of the term is flawed. Next, the GA conflates the requirement of a decision with the already affirmed significant impairment when it demands a case-by-case examination of the severity of the impact for the decision. And finally, it remains unclear to what extent the credit score should have the binding force previously demanded and not merely be a recommendation.

The location of the “decision” at the credit agency also raises questions. As the reference to the internal rules and practices of the financial institution shows, it is not within the credit agency’s power whether a decision is subject to Art. 22 GDPR. The financial institution determines whether a human being checks additional criteria, i.e. whether there is no exclusive automation, and whether – depending on the link to the score value – the credit decision is positive or negative and thus whether there is a relevant impact. Placing the decision with the credit agency leads to the problematic result that, depending on the behavior of third parties, obligations backed by fines are imposed on the credit agency, especially with regard to information and disclosure. To make matters worse, the credit agency will often not be aware of these internal requirements of the financial institutions.

Furthermore, there is no gap in legal protection. If the financial institution makes an automated decision (based on the credit score), it bears the associated obligations, in particular to provide information about the logic involved. Art. 15 GDPR does not provide for an exception due to impossibility. The financial institution can and must therefore obtain the information required to fulfill the right to information from the credit agency. If the financial institution does not provide information, it accordingly risks a significant fine. This will also be incentive enough for financial institutions to have access to this information contractually guaranteed by the credit agency. Moreover, there is even less of a legal protection gap with regard to the rights to rectification and deletion addressed by the GA, which the data subject can assert against the credit agency anyway.

On the contrary, it is the case law proposed by the GA that opens a gap in legal protection. If the data subject has the rights to review and challenge (cf. Art. 22(3) GDPR) vis-à-vis the credit agency and not vis-à-vis the financial institution, he or she can at most obtain a change in the credit score, but loses the opportunity to influence the credit decision, which is probably more relevant for him or her.

The GA’s implicitly expressed view that Art. 22(1) GDPR does not already cover profiling as such (where it causes a legal consequence or a significant impairment), as is sometimes argued in the doctrine, must nevertheless be endorsed. Otherwise, the GA could have saved himself the trouble of locating the decision after he had qualified credit scoring as profiling and affirmed a significant impairment.

Conformity of Section 31 BDSG with European law

In the context of the second question referred, the GA examined relatively comprehensively whether an opening clause exists, i.e. a legal basis for the enactment of a national provision such as Section 31 BDSG.

Article 22(2)(b) of the GDPR does allow an exception to the restrictions on automated decisions “based on legal provisions of the Union or the Member States”. However, the provision cannot serve as a legal basis, since Section 31 BDSG also covers “non-automated decisions” without differentiation and regulates the “use”, not the “creation”, of a probability value.

An opening clause could result from Art. 6(2) or (3) GDPR, upon consideration of which the GA concludes,

that Member States may adopt more specific provisions if the processing is “necessary for compliance with a legal obligation to which the controller is subject” or “necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller”. These conditions have the effect of narrowly limiting the regulatory power of the Member States and thus preclude arbitrary recourse to the opening clauses provided for in the GDPR, which could frustrate the objective of harmonizing the law in the area of personal data protection.

There is no obligation under national law to establish a score value. It is true that a credit agency also acts in the public interest, for example by helping to protect consumers from overindebtedness, to ensure the stability of the financial system and to improve access to credit. However, this is not (as would be required) a matter of public welfare, such as public health or social security, i.e. classic tasks of the state. These opening clauses are therefore not relevant either.

Article 6(1)(f) of the GDPR, the legal basis for data processing based on a legitimate interest, does not contain an opening clause and does not allow the Member States to specify the legitimate interest, as happens under Section 31 BDSG.

In the opinion of the GA, it must therefore be assumed that national provisions such as Section 31 BDSG are not compatible with the GDPR. This is a justifiable position, which is also found in German doctrine. Despite the possible inapplicability of Section 31 BDSG, it can be assumed that its value judgments will continue to play an important role in the balancing of interests pursuant to Art. 6(1)(f) GDPR (in German practice).

Obiter dicta

Legal nature of Art. 22(1) GDPR

In addition, the GA comments in passing, and without necessity, on the question, disputed in the doctrine, of whether Art. 22(1) GDPR provides for a general prohibition or a right of objection:

Notwithstanding the terminology used, the application of Art. 22(1) GDPR does not require that the data subject actively invoke the right. Indeed, an interpretation in light of Recital 71 of the Regulation and taking into account the scheme of this provision, in particular its paragraph 2, which lists the cases in which automated processing is exceptionally allowed, leads to the conclusion that this provision establishes a general prohibition of decisions of the type described above.

Although this view corresponds to the (still) predominant doctrine, it has recently been rightly questioned in the literature. The reason for this is not only the wording, which also in other language versions speaks for a right of data subjects – and not a prohibition. The information obligations that refer to Art. 22(1) GDPR would make no sense in the case of a general prohibition: they would oblige the controller to provide information about a prohibited activity. In addition, the legislator’s link to the GDPR’s predecessor, Directive 95/46/EC, which provided for a right of data subjects, also argues for continuity of this legal nature. Finally, further arguments in favor of a right to object can be derived from the purpose of Art. 22(1) GDPR, which the GA omitted here.

Scope of the right to information concerning automated decisions

Also without necessity, the GA commented on the scope of the right to information in Art. 15(1)(h) GDPR, in particular on the meaning of “meaningful information about the logic involved”:

In my opinion, this provision should be interpreted as also covering, in principle, the calculation method used by a credit agency to determine a score value, provided that there are no conflicting interests worthy of protection.
In this respect, reference should be made to the 63rd recital of the GDPR, which states, inter alia, that “[the right of access] [should] not prejudice the rights and freedoms of other persons, such as trade secrets or intellectual property rights and in particular copyright in software”.

The GA thus opposes the European Data Protection Board, which in its guidelines on the right of access wants the reservation for trade secrets and intellectual property rights to apply only with regard to the right to a copy (cf. Art. 15(4) GDPR). However, since this reservation makes sense especially in the case of information about the logic involved, the GA must be agreed with on this point.

The reservation requires a balancing, whereby a minimum of information must be provided to the data subject. The protection of trade secrets or intellectual property is in principle a justified reason for a credit agency to refuse to disclose the algorithm used to calculate the data subject’s score value.

The credit agency therefore owes:

sufficiently detailed explanations on the method for calculating the score value and on the reasons […] that led to a certain result. In general, the controller should provide the data subject with general information, in particular on factors taken into account in the decision-making process and their weighting at an aggregated level, which is also useful for challenging “decisions” within the meaning of Art. 22(1) GDPR on the part of the data subject.

What this means exactly remains unclear. If the credit agency merely owes general information on the factors taken into account, it probably does not have to name all the factors. The “weightings at an aggregated level” also leave room for interpretation: the credit agency will have to make statements about the weighting, but does not have to state it specifically. Since the GA wants to protect the algorithm or the score formula, it should be sufficient to indicate which factors or factor categories are given more or less weight.
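One plausible reading of “weightings at an aggregated level” is a disclosure of the relative importance of factor categories rather than of the per-factor weights themselves. The following sketch illustrates this under invented category names and weights (nothing here reflects SCHUFA's actual model):

```python
# Hypothetical sketch: report the relative importance of factor
# *categories* without disclosing the exact per-factor weights or the
# scoring formula. All names and numbers are invented for illustration.
FACTOR_WEIGHTS = {
    ("payment history", "late_payments"): 2.0,
    ("payment history", "defaults"): 3.0,
    ("credit usage", "open_credit_lines"): 0.8,
    ("credit usage", "utilization"): 1.2,
    ("stability", "address_changes"): 0.5,
}

def aggregated_disclosure(weights):
    """Sum absolute weights per category and normalize to relative shares."""
    totals = {}
    for (category, _factor), w in weights.items():
        totals[category] = totals.get(category, 0.0) + abs(w)
    grand_total = sum(totals.values())
    return {cat: round(t / grand_total, 2) for cat, t in totals.items()}

print(aggregated_disclosure(FACTOR_WEIGHTS))
# → {'payment history': 0.67, 'credit usage': 0.27, 'stability': 0.07}
```

A disclosure of this kind would tell the data subject which categories weigh more or less heavily while keeping the formula itself secret — which seems to be the balance the GA has in mind.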

Concluding remarks

Opinions of a GA are not binding. It therefore remains to be hoped that the ECJ will consider the aspects disregarded by the GA, in particular on the first question referred, and decide differently. As a rule, however, the ECJ follows the GA’s opinion, which would not be surprising in the present case either. A corresponding decision would fit seamlessly into the ECJ’s previous, very data-protection-friendly case law.

The outcome of the proceedings will also be significant for Swiss law, because the nDSG, which enters into force on September 1, 2023, will for the first time subject automated decisions by private data controllers to special rules. The considerations of the GA and the pending decision of the ECJ cannot simply be transferred to Art. 21 nDSG without further examination. However, since the nDSG and the GDPR define the term “automated decision” in only slightly different terms and the legislator (as the materials show) based its terminology on the GDPR, the European case law should also influence the understanding in Swiss law.

Finally, it should be noted that on the same day the GA also delivered his Opinions in Cases C‑26/22 and C‑64/22, which likewise concern SCHUFA. The questions to be answered in those preliminary ruling proceedings relate in particular to the permissibility of the retention of data from public registers by a credit agency.
