Parliamentary Initiative 23.438: Amendment to the FADP regarding partially automated decisions

With parliamentary initiative 23.438 ("Adapt data protection legislation and supplement partially automated decisions based on artificial intelligence (AI)"), submitted on June 15, 2023, the new data protection law is to be supplemented with an obligation to provide information when AI is used.

Initial situation

The new Data Protection Act (nDSG) will enter into force on September 1, 2023. Among other things, Art. 21 nDSG introduces a specific duty to inform regarding automated individual decisions. The provision requires the controller to inform data subjects about a decision that is based exclusively on automated processing and that has a legal consequence for them or significantly affects them. The FADP thus provides for a specific disclosure obligation, and certain associated rights, only in the case of fully automated decisions. It is precisely this point that the text of the proposal describes as unsatisfactory, because in practice AI systems are used much more frequently in partially automated procedures, to which Art. 21 nDSG would then not apply. The parliamentary initiative submitted by the Green Group in the summer session is intended to address this problem with a new Art. 21bis nDSG.

Proposed Art. 21bis nDSG

According to the text of the proposal, the nDSG is to be amended as follows (emphasis by the author), although it is not entirely clear from the wording whether a new Article 21bis is to be inserted or whether Article 21 nDSG itself is to be amended:

Article 21bis Duty to provide information when using artificial intelligence

1 The data controller shall inform the data subject of a decision that is based to a large extent on artificial intelligence and that has a legal consequence for them or significantly affects them (AI-supported individual decision).

2 Upon request, it shall give the data subject the opportunity to state their position. The data subject may request that the AI-supported individual decision be reviewed by a natural person.

3 Paragraphs 1 and 2 do not apply if:

  1. the AI-supported individual decision is directly related to the conclusion or performance of a contract between the data controller and the data subject and their request is granted; or
  2. the data subject has expressly consented to the decision being made with AI support.

4 If the AI-supported individual decision is made by a federal body, it must mark the decision accordingly. Paragraph 2 does not apply if, under Article 30 paragraph 2 of the Administrative Procedure Act of 20 December 1968 (VwVG) or another federal law, the data subject does not have to be heard before the decision is taken.

Art. 21bis nDSG is practically a copy of Art. 21 nDSG. The difference lies mainly in the fact that "decision based solely on automated processing" is replaced by "decision that is based to a large extent on artificial intelligence", and that the proposed article consistently speaks of an "AI-supported individual decision".

Justification

The initiative is justified by the increasing automation of processes. In practice, AI systems are used much more often in partially automated processes, namely as decision support. However, Art. 21 nDSG would not be applicable in precisely these cases, which is unsatisfactory because a partially automated decision can of course have consequences just as serious as those of a fully automated decision. Yet the nDSG would only help a data subject if a fully automated AI system carried out every single step up to and including the decision itself, i.e. without human intervention.

The initiators are aware, moreover, that the term "artificial intelligence" is neither mentioned nor defined in the nDSG, and they therefore refer to the planned AI Regulation of the EU, which will contain a definition of "AI system". The Swiss legislator will probably (have to) follow this definition in the future, since the AI Regulation is known to have extraterritorial effect and the so-called "Brussels effect" will probably also materialize in Switzerland, because the country is closely linked to the EU internal market and depends on market access (cf.

Special feature of the proposal

The political debate surrounding this parliamentary initiative will be interesting in that federal politicians will now have to engage intensively with the issue of AI regulation and with the use and functioning of AI systems in practice. This is because the proposal is a so-called "parliamentary initiative", meaning that a committee must itself prepare a draft (if the Pa. Iv. is adopted in a first step). The previous proposals submitted at the federal level on the topic of AI were mainly motions, postulates, interpellations and questions addressed to the Federal Council. In view of the current circumstances (we are witnessing in real time the birth of a new generation of AI systems, especially in the area of generative AI), a reflective debate, conducted directly in the responsible committee, on how the use of AI should be regulated in Switzerland is desirable.

Outlook

It is not yet clear from the Swiss parliament's website when the parliamentary initiative will be dealt with, or in which National Council committee. Since the revision of the FADP was handled by the Political Institutions Committee (PIC) and data protection is one of its assigned areas of expertise, it seems obvious that the PIC will again be responsible for this matter.
