datenrecht.ch

Federal Council: DSA à la Suisse – large communication platforms to be regulated

The Federal Council announced on April 5, 2023 that it intends to regulate large communication platforms such as Google, Facebook, YouTube and Twitter, i.e., "operators of large communications platforms (intermediaries)".

It has instructed DETEC, with the involvement of the Federal Office of Justice, to prepare a corresponding consultation draft by the end of March 2024.

In doing so, the Federal Council is guided ("where sensible" – one may be curious) by the EU's Digital Services Act (DSA). In particular, it wants to

strengthen the rights of users in Switzerland and require more transparency from platforms, without limiting the platforms' positive effects on freedom of expression.

Furthermore,

the large platforms […] are to designate a contact point and a legal representative in Switzerland.

Users whose content has been deleted or whose account has been blocked should be able to request a review of the measure directly from the platform. In addition, an independent Swiss arbitration board is to be created, financed by the platforms.

To create transparency, the major platforms are to mark advertising as such and, in the case of targeted advertising, publish the most important parameters according to which the advertising is delivered. This makes it possible to track who receives a particular advertisement and for what reasons.

Users should be able to report calls for hatred, depictions of violence or threats to the platforms in a simple manner. The platforms must review these reports and inform users of the result.

In this way, the Federal Council wants to respond to the finding that the platforms are "hardly regulated" today:

The systems that decide who gets to see what content are not transparent. Users also have a weak position vis-à-vis the platforms. This becomes apparent, for example, when a platform blocks a user's account or deletes content that users distribute. At present, users cannot defend themselves against such blockings and deletions, or can do so only inadequately.
