Today, after some delay, the Federal Council submitted a draft law on the regulation of online platforms such as Instagram, X and Google for consultation (Federal Act on Communication Platforms and Search Engines, KomPG):
- Media release
- Draft law (preliminary draft, VE KomPG)
- Explanatory report VE-KomPG
The consultation will last until February 16, 2026.
The KomPG is intended to strengthen the rights of users of communication platforms and search engines and to promote transparency (Art. 1). The draft is closely based on the EU's Digital Services Act (DSA) but is not a copy of it.
For example, the scope of the KomPG is limited to particularly large platforms and search engines such as X (Twitter), YouTube, Instagram or Google Search, which occupy a central position in public communication. The scope of application is deliberately narrow: only very large platforms and search engines are affected, i.e. those used by at least 10% of the Swiss population every month (Art. 2), regardless of where the providers are based. Small and medium-sized providers are exempt. Unlike the DSA, the KomPG does not apply to mere conduit, hosting or caching services. Non-profit services without an economic purpose (e.g. Wikipedia) are also exempt.
The draft essentially provides for the following:
- Providers of very large platforms and search engines must provide a reporting procedure for suspected criminally relevant content (e.g. violence, hate speech, defamation), examine reports, substantiate their decisions and inform those affected (Art. 4–5).
- Users whose content is removed or whose accounts are blocked (“restrictive measures”) are entitled to a reasoned notification and access to a free internal complaints procedure (Art. 6–7).
- Disputes can also be submitted to an approved arbitration board, whose procedure is mandatory for platforms (Art. 8–12).
- The terms and conditions must contain certain minimum content – in particular on moderation and recommendation systems – and be publicly accessible in three official languages, together with a summary. Significant changes must be actively communicated (Art. 13).
- Requirements of due diligence and transparency as well as a prohibition of discrimination apply to content moderation and to the reporting and complaints procedures (Art. 14).
- Third-party advertising must be visibly labeled. Users must have easy access to information about the personalization systems used; very large platforms must also maintain a public advertising archive (Art. 15–16).
- For recommendation systems, information on the algorithms used (parameters, weighting) must be disclosed in the terms and conditions. In addition, a usage option without profiling must be offered (Art. 18).
- Providers are obliged to publish annual transparency reports (including on moderation and complaints) and a risk assessment of systemic impacts on democracy, public security and fundamental rights (Art. 19–20).
- Platforms must designate a contact point in Switzerland and – if domiciled abroad – a legal representative (Art. 21–23).
- Providers' compliance with the obligations is subject to an annual independent evaluation (Art. 24–25).
- Researchers and civil society organizations are to be given access to platform data, under certain conditions, in order to investigate systemic risks (Art. 26).
- Supervision is the responsibility of the Federal Office of Communications (OFCOM), which can order administrative measures up to and including the temporary blocking of services (Art. 27 ff.).
- Sanctions of up to 6% of worldwide annual turnover are provided for (Art. 34).