Takeaways (AI):
  • Political agreement on the Digital Services Act (DSA) to create uniform rules in the digital single market.
  • The DSA has extraterritorial effect, which also makes it relevant for Swiss providers.
  • The regulations distinguish between different types of intermediary services, such as access providers and hosting services.
  • The DSA includes extensive due diligence and transparency obligations for providers to moderate illegal content.
  • Organizational and compliance obligations become stricter for very large online platforms and search engines.

The European Parliament and the Council have reached a preliminary political agreement on the text of the Digital Services Act (“DSA”). The DSA is part of the EU strategy to ensure a uniform digital single market. The stated goal of the DSA is to create uniform rules for intermediary services. However, the scope of the DSA is defined more broadly than the German term “Vermittlungsdienste” (intermediary services) would suggest.

Scope of the DSA and relevance for Switzerland

The term “intermediary services” covers not only platforms that act as agents to broker contracts for products or services between different market participants. Rather, the DSA also addresses access providers, search engines, wireless local area networks and cloud infrastructure services. “Intermediary” therefore describes the personal scope of the DSA better than the term used in the (outdated) German draft.

Like the Data Act, the DSA has extraterritorial effect. For Swiss providers of intermediary services, the DSA is therefore also relevant if they do not have an establishment in the EEA but offer their services to recipients of services (hereinafter: “users”) who are located in the EEA. According to the DSA, this is to be assumed if the intermediary service has a significant number of EEA users or if the service is targeted at EEA users. Indications of such targeting may include in particular: (i) the currency used, (ii) delivery to EEA states, or (iii) offering an app in the national app store of an EEA state.
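For illustration only, the applicability test described above can be sketched as a simple decision rule. Note that the DSA does not quantify what a “significant number” of EEA users is; the 5% threshold and all names below are assumptions, not legal advice:

```python
# Illustrative sketch of the DSA applicability test described above.
# The "significant number" threshold is NOT defined in the DSA; the 5%
# figure and all field names are assumptions for illustration purposes.

from dataclasses import dataclass

@dataclass
class IntermediaryService:
    eea_user_share: float    # share of users located in the EEA
    eea_currency: bool       # (i) prices quoted in an EEA currency
    delivers_to_eea: bool    # (ii) delivery to EEA states
    in_eea_app_store: bool   # (iii) app offered in a national EEA app store

def dsa_may_apply(s: IntermediaryService, significant: float = 0.05) -> bool:
    """Significant number of EEA users, or targeting of EEA users."""
    targeted = s.eea_currency or s.delivers_to_eea or s.in_eea_app_store
    return s.eea_user_share >= significant or targeted
```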

It is also important to note that legal entities can also be users. The DSA is not a consumer protection law and also covers B2B services.

New and repackaged regulatory approaches – an overview

The DSA seeks to achieve the goal of creating a level playing field for intermediary services through new and repackaged regulatory approaches.

“Repackaged” are mainly the liability privileges for intermediary services, which were previously found in Art. 12–15 of Directive 2000/31/EC (“e‑Commerce Directive”). Accordingly, Art. 12–15 of the e‑Commerce Directive are replaced by the liability privileges in Chapter II of the DSA. In all other respects, the e‑Commerce Directive remains in force. As before, the liability privileges distinguish between access providers, caching providers and hosting providers. Put briefly, a provider of intermediary services benefits from the liability privilege if it limits itself to its intermediary role. However, if it assumes an active role such that it gains knowledge of or control over illegal information (e.g., by assuming editorial responsibility, deliberately cooperating with users to engage in illegal activities, or failing to act swiftly after becoming aware of illegal content), the liability privilege does not apply.

(Basically) old wine in new skins also describes the supervisory regime. What the national data protection authorities are under the General Data Protection Regulation (“GDPR”), the Digital Services Coordinators are under the DSA. The European Board for Digital Services is the counterpart of the European Data Protection Board (“EDPB”). Like the EDPB for data protection, the Board is intended to contribute to the consistent application of the DSA and, in particular, to develop guidelines. The accountability principle running through the DSA is also reminiscent of the GDPR.

What is new, however, is the large number of due diligence and transparency obligations that the DSA imposes on providers of intermediary services. Providers of intermediary services are still not obligated, though, to search for illegal content preventively or without cause.

The pyramid of duties of the DSA

The DSA draft is recognizably driven by the effort to place the obligations for providers of intermediary services in a reasonable relationship to the nature of the services in question. This results in a pyramid of obligations:

Certain basic obligations apply to all providers of intermediary services. In addition, there are further obligations for providers of hosting services, i.e. services that store information on behalf of users.

Providers of online platforms must reckon with even more comprehensive and stricter obligations. Online platforms are hosting services that disseminate information stored on behalf of users publicly (i.e. outside closed user groups). Instant messaging services or e-mail services are therefore not covered by the term “online platform”. Small and medium-sized enterprises (“SMEs”) are in principle exempt from the obligations applicable to online platforms. However, for SMEs through whose online platforms consumers can conclude distance contracts, the obligations applicable in this regard nevertheless apply. SMEs are defined as companies with (i) fewer than 250 employees and (ii) a maximum annual turnover of EUR 50 million or a maximum annual balance sheet total of EUR 43 million.

The strictest obligations then apply to very large online platforms and very large online search engines. Online platforms or search engines that have an average of at least 45 million active users per month are considered “very large”.
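For illustration, the size thresholds mentioned above (for the SME exemption and for “very large” platforms or search engines) can be summarized in a minimal sketch; the function names are assumptions, while the figures are those stated in the DSA:

```python
# Minimal sketch of the size thresholds stated above; names are illustrative.

def is_sme(employees: int, turnover_eur: float, balance_eur: float) -> bool:
    # SME: (i) fewer than 250 employees and (ii) annual turnover of at most
    # EUR 50 million or annual balance sheet total of at most EUR 43 million
    return employees < 250 and (turnover_eur <= 50e6 or balance_eur <= 43e6)

def is_very_large(avg_monthly_active_users: int) -> bool:
    # "Very large": average of at least 45 million active users per month
    return avg_monthly_active_users >= 45_000_000
```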

In the following, the duties most relevant in the author’s opinion are presented, staggered according to the pyramid levels. The presentation is therefore not exhaustive.

Obligations applicable to providers of all intermediary services

Providers without an establishment in an EEA state must first appoint a legal representative in an EEA state (Art. 11). In addition, all providers must designate a central contact point for the supervisory authorities (Art. 10). Unlike the legal representative, this contact point does not have to be physically located in an EEA state. The requirement to provide a “single point of contact” for users (Art. 10a) is likely already met by many providers.

Providers of intermediary services will also have to revise their general terms and conditions (“GTC”) (Art. 12). These must now include, in particular, information about restrictions on the information provided by users. This includes, in particular, information about the processes and methods used for “content moderation”, including algorithmic decision-making and internal complaint procedures. If a service is predominantly directed at minors or is heavily used by them, the GTC must be written in language understandable to them.

“Content moderation” refers to the activities of providers to identify, determine and combat illegal content, or content violating the GTC, provided by users. Examples of such activities include the removal of the content in question, the suspension of the service or the blocking of the user account. What constitutes illegal content must be determined in light of other EU law and the law of the Member States. Examples of illegal content include terrorist content, the sale of counterfeit products, the sharing of private images without consent, or breaches of consumer law in the provision of services.

Providers of intermediary services must publish annual transparency reports about these activities (and about any further obligations that apply to them as hosting service providers or as providers of – possibly very large – online platforms or search engines).

Additional obligations for hosting service providers

Hosting service providers are additionally obliged to establish notice and redress procedures for suspected illegal content (Art. 14). In practice, providers are likely to increasingly use an input mask in the future (instead of simply providing an e-mail address) in order to facilitate notices containing the elements required under Art. 14(2). The receipt of such a notice causes the provider to become aware of the presumed illegal content (Art. 14(3)). If the provider does not act quickly on the notice and the content actually proves to be illegal, it loses the liability privilege of Art. 5.
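A hypothetical input mask for such notices might capture fields along the following lines. The field names below merely paraphrase the elements of Art. 14(2) and are assumptions, not the statutory wording:

```python
# Hypothetical input mask for notices of suspected illegal content.
# Field names paraphrase the elements of Art. 14(2); they are assumptions,
# not the statutory wording.

from dataclasses import dataclass

@dataclass
class Notice:
    explanation: str      # why the content is considered illegal
    content_url: str      # exact electronic location of the content
    notifier_name: str    # name of the notifying person or entity
    notifier_email: str   # e-mail address of the notifier
    good_faith: bool      # confirmation that the notice is made in good faith

def triggers_knowledge(n: Notice) -> bool:
    """A complete notice makes the provider aware of the presumed
    illegal content in the sense of Art. 14(3)."""
    return all([n.explanation, n.content_url, n.notifier_name,
                n.notifier_email, n.good_faith])
```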

If the provider decides to (i) restrict the visibility of the content (e.g. remove, block or downgrade it in a ranking), (ii) restrict the monetization of the content, (iii) block or discontinue the service in whole or in part, or (iv) block or terminate the account of the affected user, it must generally give reasons for this (Art. 15; hereinafter all measures together the “content measures”). The statement of reasons must in particular state the relevant legal basis or GTC provision from which the inadmissibility of the content results and provide information on legal remedies (such as legal recourse to the courts).

In addition, hosting service providers must inform the competent authorities about possible crimes against life and limb (Art. 15a).

Additional obligations for online platform providers

For providers of online platforms, there are additional obligations on top of those applicable to hosting services. In particular, they are obliged to provide an internal complaint management system through which complaints about content measures (taken or not taken) can be escalated (Art. 17). The complaints procedure is regulated in detail. As a legal remedy against the provider’s complaint decisions, the DSA provides in particular for out-of-court dispute resolution before a body approved by the Digital Services Coordinator (Art. 18).

It is also noteworthy that, in the case of frequent and obvious uploads of illegal content, providers of online platforms will in future be obliged to suspend the service towards the conspicuous user for a reasonable time after prior warning (Art. 19(1)). In Switzerland, a proactive approach by hosting providers (which also include providers of online platforms) has so far only been explicitly regulated in Art. 39d URG (Swiss Copyright Act). In contrast to Art. 19(1) DSA, which provides for the suspension of the service, Art. 39d URG requires the provider “only” to prevent the re-upload of the relevant work.

Furthermore, providers of online platforms may not design user interfaces in a misleading manner (ban on “dark patterns”, Art. 23a). Users should be able to make voluntary and informed decisions. In particular, cancelling a service must be as easy as signing up.

In online advertising, transparency is also to be increased by means of additional labeling obligations (Art. 24). In the future, not only must advertising be identified as such, but the advertiser must also be stated. If the advertising is personalized, the most important parameters for the personalization must also be disclosed. Personalized advertising is not permitted (i) on the basis of particularly sensitive personal data and (ii) towards minors.
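For illustration, the per-ad disclosures described above could be modeled as metadata attached to each ad. All field names here are assumptions, not taken from the DSA text:

```python
# Illustrative per-ad disclosure record for the labeling obligations of
# Art. 24; all field names are assumptions.

ad_disclosure = {
    "labeled_as_ad": True,        # the ad must be identifiable as such
    "advertiser": "Example AG",   # the advertiser must be stated
    "personalized": True,
    "main_parameters": [          # most important personalization parameters
        "approximate location",
        "inferred interests",
    ],
}

# Not permitted: personalization based on particularly sensitive personal
# data, and personalized advertising towards minors.
```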

If the online platform prioritizes certain information (e.g., by selecting certain posts that the platform believes are of particular interest to the user) or otherwise uses a recommender system, further obligations are attached to this. For example, the general terms and conditions must state which criteria are used as the main basis for the recommendations.

If B2C distance contracts can be concluded via the online platform, the provider of the online platform is obliged to identify the traders of the brokered products/services (Art. 24c). Depending on the circumstances, a copy of an identity document or an extract from the commercial register must be requested for identification purposes. In addition, the provider of the online platform must make reasonable efforts to verify the information provided by the trader.

In addition, online platforms through which consumers can conclude distance contracts must be designed in such a way that traders are able, among other things, to provide the mandatory information required under Union law in relation to the products/services they offer (Art. 24d). Furthermore, the providers of the online platforms are obliged to verify to the best of their ability (“best effort”) whether the traders comply, for example, with their pre-contractual and product-related information requirements. Only then may traders offer their products or services via the online platform. At this point, it should be emphasized that the obligations relating to the brokering of distance contracts must also be observed by SME providers.

Additional obligations applicable to providers of very large online platforms or search engines

Even more extensive obligations apply to providers of very large online platforms or search engines, such as the obligation to

  • assess systemic risks annually (Art. 26) and take any risk mitigation measures (Art. 27);
  • have an audit carried out annually at their own expense (Art. 28);
  • offer at least one option in recommender systems that is not based on profiling (Art. 29);
  • appoint an independent (external or internal) compliance officer (Art. 32); and
  • summarize the GTC in a clear and understandable manner (Art. 12 para. 2a).

Another new feature is that very large online platforms or search engines will be regulated similarly to critical infrastructure and can be required by the EU Commission to take specific measures in the event of a crisis (Art. 27a). It is conceivable, for example, that providers will have to display certain warnings at the request of the Commission.

Sanctions and enforcement

The DSA is to be enforced by means of a users’ right of complaint to the Digital Services Coordinator (Art. 43) and through co-regulation. For example, “codes of conduct” are to be drawn up together with providers and other stakeholders to combat systemic risks or to address online advertising (Art. 35 ff.).

In addition, certain violations of the DSA may be punished with fines of up to 6% of worldwide annual turnover. Periodic penalty payments may amount to up to 5% of worldwide daily turnover or income.
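For a rough sense of scale, a worked example with a hypothetical turnover:

```python
# Worked example of the sanction ceilings stated above, using an assumed
# (hypothetical) worldwide annual turnover.

annual_turnover_eur = 200e6                    # assumed annual turnover
max_fine = 0.06 * annual_turnover_eur          # up to 6%: EUR 12 million

daily_turnover_eur = annual_turnover_eur / 365
max_daily_penalty = 0.05 * daily_turnover_eur  # up to 5% per day: ~EUR 27,400
```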

The competent authorities also receive extensive investigative and supervisory powers, such as an audit right or the right to take interim measures. For very large online platforms/search engines, the EU Commission is responsible.

Evaluation and practical advice

The first striking feature is the primarily public-law regulatory approach of the DSA. Unlike the GDPR, which regulates the rights of data subjects in detail (e.g., rights of access or deletion), the DSA is essentially limited to obligations enforced under supervisory law and does not provide for a (civil-law) catalog of claims by users, rights holders and/or other parties affected by unlawful content. These are governed by other national or EU law (such as unfair competition law, data protection law or intellectual property law).

The graduated catalog of duties according to the type of service, which also takes SMEs into account, is to be welcomed.

Excessive, however – and these obligations unfortunately also apply to SMEs – are, in the author’s view, the duties of online platform providers with regard to distance contracts. This applies in particular to the obligation of providers to check whether traders meet their pre-contractual information obligations (e.g. the consumer’s right of withdrawal) and any further information obligations (e.g. with regard to product safety). In this way, providers are moved away from being intermediaries and closer to being content providers who exercise control over information. The “best effort” standard for this verification requirement is well-intentioned, but does little to help the bottom line. Even with a “best effort” obligation, the provider must at least take action and deal with the information requirements applicable to the traders concerned. This is likely to involve a considerable amount of time and effort that SMEs in particular are unlikely to be able to cope with. This is all the more true as pre-contractual information requirements are constantly being expanded in EU law (as foreseen, for example, in the Commission’s proposal for a Data Act concerning networked devices).

Furthermore, the DSA could be better dovetailed with some other EU acts that are being introduced. These include, for example, the AI Act, which contains transparency and other obligations with regard to artificial intelligence systems. The overlap between the DSA and Art. 17 of the Copyright Directive EU 2019/790 (“DSM-RL”) is also striking. Art. 17(4)(c) DSM-RL requires service providers, upon receipt of a duly justified notice from a rightholder, to act expeditiously to block access to the notified copyrighted content or to remove it. It remains open whether such a notice is now only to be considered “sufficiently substantiated” if it contains the details set out in Art. 14(2) DSA. While under Art. 17(4)(c) DSM-RL the service provider is required to block or delete the suspected infringing content without undue delay in order not to incur liability itself, under Art. 17(3) DSA it may be required to reverse the blocking or deletion of the content as a result of a complaint decision. In contrast, the DSM-RL does not provide for an internal escalation mechanism by means of a complaint.

The practical application of the DSA is therefore likely to raise exciting questions for companies. Companies should keep in mind that the EU Commission has learned a lot since the GDPR. For example, the enforcement gap criticized in part under the GDPR in cases of inaction by the lead supervisory authority is addressed: under certain conditions, the EU Commission can cause the (so far inactive) Digital Services Coordinator to take investigative and supervisory measures (Art. 45a(3)).

Based on the statement of the Federal Council of August 25, 2021, an autonomous adoption of the DSA by Switzerland is rather not to be expected. There it can be read that the Federal Administration intends to take measures, if necessary, to prevent disadvantages for Switzerland arising from the DSA (for example with regard to market access restrictions). As far as can be seen, no such measures have been communicated to date.

AI-generated takeaways can be wrong.