Today, after some delay, the Federal Council submitted a draft law on the regulation of online platforms such as Instagram, X and Google for consultation (Federal Act on Communication Platforms and Search Engines, KomPG).

The consultation will last until February 16, 2026.

The KomPG is intended to strengthen the rights of users of communication platforms and search engines and to promote transparency (Art. 1). The draft is closely based on the EU's Digital Services Act (DSA) but is not a copy of it.

For example, the scope of the KomPG is limited to particularly large platforms and search engines such as X (Twitter), YouTube, Instagram or Google Search, which occupy a central position in public communication. The scope of application is deliberately narrow: only very large platforms and search engines are covered, i.e. those used by at least 10% of the Swiss population every month (Art. 2), which corresponds to roughly 900,000 people, regardless of where the providers are based. Small and medium-sized providers are exempt. Unlike the DSA, the KomPG does not apply to pass-through (mere conduit), hosting or caching services. Non-profit services without an economic purpose (e.g. Wikipedia) are also exempt.

The draft essentially provides for the following:

  • Providers of very large platforms and search engines must operate a notification procedure for suspected criminally relevant content (e.g. violence, hate speech, defamation), examine reports, give reasons for their decisions and inform those affected (Art. 4–5).
  • Users whose content is removed or whose accounts are blocked (“restrictive measures”) are entitled to a notification stating the reasons and to access to a free internal complaints procedure (Art. 6–7).
  • Disputes can also be submitted to an authorized arbitration board, whose procedure is mandatory for the platforms (Art. 8–12).
  • The general terms and conditions (GTC) must contain certain minimum content – in particular on moderation and recommendation systems – and be publicly accessible in three official languages, together with a summary. Significant changes must be actively communicated (Art. 13).
  • Duties of due diligence and transparency as well as a prohibition of discrimination apply to content moderation and to reporting and complaints procedures (Art. 14).
  • Third-party advertising must be visibly labeled. Users should have easy access to information about the personalization systems; large platforms must also maintain a public advertising archive (Art. 15–16).
  • For recommendation systems, information on the algorithms used (parameters, weighting) must be disclosed in the GTC. In addition, an option to use the service without profiling must be provided (Art. 18).
  • Providers are obliged to publish annual transparency reports (including on moderation and complaints) as well as a risk assessment of systemic impacts on democracy, public security and fundamental rights (Art. 19–20).
  • Platforms must designate a contact point in Switzerland and – if domiciled abroad – a legal representative (Art. 21–23).
  • There is an annual independent evaluation of providers' compliance with their obligations (Art. 24–25).
  • Researchers and civil society organizations are to be given access to platform data under certain conditions in order to investigate systemic risks (Art. 26).
  • Supervision is the responsibility of the Federal Office of Communications (OFCOM), which can order administrative measures up to and including the temporary blocking of services (Art. 27 ff.).
  • Sanctions of up to 6% of worldwide annual turnover are provided for (Art. 34).