Postulate Béglé (16.3914): How to bring ethics to algorithms?

Submitted text

The Federal Council is instructed to examine what can be expected or required of algorithms, at home and abroad, from an ethical perspective. Algorithms are opaque, their accountability is fuzzy, and their duties are limited. How do algorithms work? Who can be contacted in case of misinformation? Are algorithms subject to Swiss law? The ever-growing influence of algorithms must be controlled without diminishing their usefulness.

Justification

Thanks to algorithms, we get hits in our Internet searches. An algorithm is a sequence of instructions, here used to rank the information consumed by billions of people. The central question is: which information appears first on our screen? The ordering reflects a value system, even a worldview.

Nowadays, private companies decide this. Information can be ranked according to four criteria: popularity (number of views), authority (referrals/hyperlinks), reputation (number of retweets/likes), and predicted behavior based on the traces a user leaves on the Internet. However, this ranking carries risks.
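
Before turning to those risks, a rough illustration of how such criteria might be combined: the sketch below scores items with a weighted sum. The weights, field names, and scaling are invented for this example and are not taken from any actual platform.

```python
import math
from dataclasses import dataclass

@dataclass
class Item:
    title: str
    views: int             # popularity: number of views
    inlinks: int           # authority: referrals/hyperlinks pointing here
    likes: int             # reputation: retweets/likes
    behavior_match: float  # predicted fit to the user's traces, 0..1

# Hypothetical weights; real platforms keep theirs secret, which is
# precisely the opacity the postulate criticizes.
W_POP, W_AUTH, W_REP, W_BEH = 0.3, 0.2, 0.2, 0.3

def score(item: Item) -> float:
    """Weighted sum of the four criteria (counts log-damped)."""
    return (W_POP * math.log1p(item.views)
            + W_AUTH * math.log1p(item.inlinks)
            + W_REP * math.log1p(item.likes)
            + W_BEH * 10 * item.behavior_match)  # scaled to a comparable range

def rank(items: list[Item]) -> list[Item]:
    # The highest-scoring item appears first on the user's screen.
    return sorted(items, key=score, reverse=True)
```

Every choice in this sketch, from the weights to the log damping to the behavior bonus, encodes a value judgment; that is exactly the sense in which a ranking reflects a worldview.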

1. Algorithms tend to interfere with the principle of freedom.

In order to keep Internet users engaged, they first display the most-accessed information and the information that matches the user’s own opinions, while leaving the rest aside. This creates so-called filter bubbles. The threat to democracy is all the greater because a growing number of citizens spurn quality media and get their information from social networks instead.

2. Algorithms exacerbate inequalities.

Algorithms globalize the opinion market and give the “best” excessive visibility. The rest, especially the “average,” is often forgotten: 95 percent of Internet users consume 0.03 percent of the content (according to sociologist Dominique Cardon).

3. This may lead to discrimination.

Dynamic pricing is based on algorithms and can penalize loyal, hurried, or dependent customers by offering them higher prices, while denying attractive offers to customers “without potential.”
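
A deliberately simplified sketch of how such pricing logic could penalize exactly these groups; every signal and coefficient is invented for illustration and describes no real pricing system:

```python
BASE_PRICE = 100.0

def quote(returning_customer: bool, urgency: float, alternatives: int) -> float:
    """Hypothetical dynamic price: the less a customer can or will
    walk away, the higher the quote."""
    price = BASE_PRICE
    if returning_customer:
        price *= 1.10                # loyal customers rarely compare offers
    price *= 1.0 + 0.20 * urgency    # urgency in 0..1, e.g. a last-minute need
    if alternatives == 0:
        price *= 1.15                # dependent customers have no fallback
    return round(price, 2)

# The loyal, hurried customer without alternatives pays about 52 percent
# more than the new, patient one with options:
print(quote(returning_customer=True, urgency=1.0, alternatives=0))   # 151.8
print(quote(returning_customer=False, urgency=0.0, alternatives=3))  # 100.0
```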

With greater transparency and a clearer definition of responsibilities, the political, economic, and social power of algorithms could be limited.

Statement of the Federal Council of 25.1.2018

Search, meeting, media, rating, and marketplace platforms on the Internet have become an indispensable part of modern life. They are changing every individual’s life and will increasingly shape society’s basic values. The basis for these systems and platforms are algorithms that enable the necessary data processing. In view of this development, the Federal Council shares the postulate’s assessment that the potential risks posed by algorithms must be investigated if their opportunities are to be exploited sustainably. However, algorithms must not be considered in isolation, but holistically, in the context of data processing and the intended functionality of the systems and platforms. Against this background, the Federal Council has taken various measures.

- The expert commission set up in implementation of Motion Rechsteiner Paul 13.3841, “Expert Commission on the Future of Data Processing and Data Security,” is investigating the topic of algorithms from various angles. In addition to specific topics, such as digital manipulation (big nudging, filter bubbles, etc.), it is pursuing overarching sets of issues. These include the questions of the extent to which ethical principles, in interaction with legal requirements, could prevent abuse in data processing in general and in algorithms in particular, and what those ethical principles might be. The report of the expert group, with corresponding recommendations, is expected in mid-2018.

- In the area of personal data, the ongoing revision of the Data Protection Act addresses various constellations in which personal data is processed by algorithms. An obligation to inform and hear the data subject is introduced where a decision concerning the data subject is based exclusively on automated data processing and has legal effects on, or significantly affects, the data subject (a minimal sketch of this trigger follows after this list). With the right to information, the data subject should also be able to demand further information about the result of the decision, how it came about, and its effects. The bill also contains measures on profiling, which is often based on the use of algorithms. Finally, data controllers are to be required to prepare a data protection impact assessment if the processing could lead to a violation of the data subject’s personality or fundamental rights; within this framework, measures to protect the data subject must also be examined.

- Research and education: In 2015, the Federal Council launched the national research program on Big Data (NRP 75). At the end of last year, various projects dealing with ethical issues in the area of data processing and Big Data were approved as part of Module 2 (societal, regulatory, and educational measures).
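
As flagged in the second point above, the trigger for the draft act’s duty to inform can be read as a simple predicate. This is a reading aid only; the function and its parameters are invented and paraphrase, rather than quote, the bill:

```python
def duty_to_inform(exclusively_automated: bool,
                   legal_effects: bool,
                   significant_effects: bool) -> bool:
    """Hypothetical paraphrase of the draft DPA trigger: inform (and
    hear) the data subject only for decisions that are exclusively
    automated AND have legal or otherwise significant effects."""
    return exclusively_automated and (legal_effects or significant_effects)

# A credit refusal computed without any human review would trigger the duty:
assert duty_to_inform(True, legal_effects=True, significant_effects=False)
# A human-reviewed decision would not, even if its effects are significant:
assert not duty_to_inform(False, legal_effects=True, significant_effects=True)
```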

The measures listed show that the topic of “Algorithms and Ethics on the Net” is already integrated into ongoing activities. For the Federal Council, continuing and strengthening these activities seems the most effective way to address the issue.
