Motion Gysi (23.4492): Artificial intelligence in the workplace. Strengthening the participation rights of employees

Submitted text

The Federal Council is instructed to strengthen, at a statutory level, the participation rights of employees with regard to the use of artificial intelligence (AI) and algorithmic systems in the workplace where these systems are used for recommendations, forecasts, decisions, etc. that concern employees or use employee data. In particular, the amendments are intended to strengthen collective co-determination. To this end, the right to have a say is to be expanded, information rights strengthened, collective rights of action created and sanction options examined. The aim is to minimize the risks that arise for employees and to ensure that employees also benefit.

Justification

As new studies show, many employees in Switzerland fear for their jobs. This fear is often accompanied by uncertainty about which technologies are used in the workplace and what their data is used for. A lack of transparency and uncertainty are not conducive to a good working relationship and diminish employees' trust in the systems used. A lack of consultation can lead to injustice, as the consequences for the various affected parties are not fully taken into account, and to negative effects on the health of employees, especially in the case of automated monitoring.

A new legal opinion from the University of St. Gallen shows the need for action: the law on participation has various gaps and does not adequately protect the rights of employees. It is therefore important to strengthen participation rights. The law must define clear obligations for employers as to how employees are to be involved and how information rights are to be strengthened. Employees should have access to external specialists. Health-related systems should also be subject to even stricter participation obligations. Another problem is that both the data used and the effects on employees are often collective in nature. This is why collective co-determination options and collective rights of action are needed. Sanctions could be imposed on employers who violate the participation requirements.

Statement of the Federal Council of February 14, 2024

The Federal Council is aware that the increasing use of algorithmic systems in the workplace is associated with uncertainties. The Participation Act (SR 822.14) provides for a general right to information in this regard (Art. 9), which is supplemented by special co-determination rights, namely in the area of occupational health protection (Art. 10 para. 1 let. a in conjunction with Art. 48 para. 1 let. a of the Employment Act [ArG]). In addition to the right to information and the right to have a say, there are health protection regulations that prohibit the use of monitoring or control systems intended to monitor the behavior of employees in the workplace (Art. 26 para. 1 of Ordinance 3 to the Employment Act [ArGV 3]). The Gender Equality Act (GlG, SR 151.1), which prohibits gender-based discrimination in employment relationships under private and public law, also applies where the employer uses artificial intelligence (AI). For its part, the Data Protection Act (FADP, SR 235.1) guarantees comprehensive protection of employees' personal data. With the revision of the FADP, the duty to provide information was strengthened, particularly in the case of automated individual decisions, and the possibility of having such decisions reviewed by a natural person was created. Furthermore, Art. 22 FADP now also obliges the controller to carry out a data protection impact assessment if the processing may entail a high risk to the personality or fundamental rights of the data subject, which may result from the use of new technologies (Art. 22 para. 2 FADP) such as artificial intelligence. Finally, Articles 328 and 328b of the Code of Obligations guarantee the protection of employees' privacy.

The current legal framework also contains instruments for asserting these rights. The cantonal labor inspectorates are responsible for ensuring compliance with employment law regulations. Article 59 ArG in particular provides for criminal sanctions for violations of the health protection provisions. In the event of violations of the Participation Act, employees' associations can sue for a declaratory judgment (Art. 15 para. 2 of the Participation Act), and Article 7 GlG provides for the possibility of actions and complaints by organizations that have existed for at least two years and which, according to their statutes, promote equality between women and men or protect the interests of employees. In addition, in its Dispatch of December 10, 2021 on the amendment of the Swiss Code of Civil Procedure (representative action and collective settlement), the Federal Council proposed substantially strengthening collective redress; this bill is currently being discussed in parliament. On the basis of Article 49 paragraph 1 FADP, the Federal Data Protection and Information Commissioner (FDPIC) may open an investigation ex officio or upon notification if there are sufficient indications that data processing may violate data protection regulations. If necessary, the FDPIC can order that the processing be fully or partially adjusted, suspended or discontinued and that the personal data be fully or partially deleted or destroyed (Art. 51 FADP). The FADP also provides for penalties for breaches of the duty of care or the duty of confidentiality and for disregarding orders issued by the FDPIC. The cantons are responsible for prosecuting and adjudicating these offenses (Art. 60 ff. FADP).

AI is therefore not developing in a legal vacuum. Whether Swiss law is up to the challenges posed by AI is currently being examined. On November 22, 2023, the Federal Council instructed DETEC and the FDFA to draw up an overview of possible regulatory approaches for the use of artificial intelligence. The analysis, which should be available by the end of 2024, will also identify sector-specific regulatory requirements in connection with artificial intelligence. Based on this work, the Federal Council will decide whether there is a need for legislative action and, if so, how it should be addressed. The results of this work should not be prejudged.
