- Political agreement on the Digital Services Act (DSA) to create uniform rules in the digital single market.
- The DSA has extraterritorial effect, which makes it relevant for Swiss providers as well.
- The regulation distinguishes between different types of intermediary services, such as access providers and hosting services.
- The DSA imposes extensive due diligence and transparency obligations on providers with regard to the moderation of illegal content.
- Organizational and compliance obligations become stricter for very large online platforms and search engines.
The European Parliament and the Council have reached a preliminary political agreement on the text of the Digital Services Act (“DSA”). The DSA is part of the EU strategy to ensure a uniform digital single market. Its stated goal is to create uniform rules for intermediary services. The scope of the DSA is, however, defined more broadly than the term “intermediary services” might suggest.
Scope of the DSA and relevance for Switzerland
The term “intermediary services” covers not only platforms that act as agents to broker contracts for products or services between different market participants. Rather, the DSA also addresses access providers, search engines, wireless local area networks and cloud infrastructure services. “Intermediary” therefore describes the services covered by the DSA better than the term “Vermittlungsdienste” (brokering services) used in the (outdated) German draft version.
Like the Data Act, the DSA has extraterritorial effect. It is therefore also relevant for Swiss providers of intermediary services that have no establishment in the EEA but offer their services to recipients (hereinafter: “users”) located in the EEA. According to the DSA, this is to be assumed if the intermediary service has a significant number of EEA users or is targeted at EEA users. Indications of such targeting may include, in particular: (i) the currency used, (ii) delivery to EEA states, or (iii) the offering of an app in the national app store of an EEA state.
It is also important to note that legal entities can be users as well. The DSA is not a consumer protection law and also covers B2B services.
New and repackaged regulatory approaches – an overview
The DSA seeks to achieve the goal of creating a level playing field for intermediary services through new and repackaged regulatory approaches.
“Repackaged” are mainly the liability privileges for intermediary services, which were previously found in Art. 12–15 of Directive 2000/31/EC (“e-commerce Directive”). Accordingly, Art. 12–15 of the e-commerce Directive are replaced by the liability privileges in Chapter II of the DSA. In all other respects, the e-commerce Directive remains in force. As before, the liability privileges distinguish between access providers, caching providers and hosting providers. Put briefly, a provider of intermediary services benefits from the liability privilege if it limits itself to its intermediary role. However, if it assumes an active role such that it gains knowledge of or control over illegal information (e.g., by assuming editorial responsibility, deliberately cooperating with users to engage in illegal activities, or failing to act swiftly after becoming aware of illegal content), the liability privilege does not apply.
The supervisory regime is also (essentially) old wine in new skins. What the national data protection authorities are under the General Data Protection Regulation (“GDPR”), the Digital Services Coordinators are under the DSA. The European Board for Digital Services is the counterpart of the European Data Protection Board (“EDPB”). Like the EDPB in data protection, the Board is intended to contribute to the consistent application of the DSA and, in particular, to develop guidelines. The accountability principle running through the DSA is also reminiscent of the GDPR.
What is new, however, is the large number of due diligence and transparency obligations that the DSA imposes on providers of intermediary services. Even so, providers of intermediary services remain under no obligation to search for illegal content preventively or without cause.
The pyramid of duties of the DSA
The DSA draft is recognizably driven by the effort to place the obligations for providers of intermediary services in a reasonable relationship to the nature of the services in question. This results in a pyramid of obligations:
Certain basic obligations apply to all providers of intermediary services. In addition, there are further obligations for providers of hosting services, i.e. services that store information on behalf of the user.
Providers of online platforms must reckon with even more comprehensive and stricter obligations. Online platforms are hosting services that disseminate information stored on behalf of the user to the public (i.e. outside closed user groups). Instant messaging services or e-mail services are therefore not covered by the term “online platform”. Small and medium-sized enterprises (“SMEs”) are, in principle, exempt from the obligations applicable to online platforms. However, SMEs through whose online platforms consumers can conclude distance contracts remain subject to the obligations applicable in that regard. SMEs are defined as companies with (i) fewer than 250 employees and (ii) a maximum annual turnover of EUR 50 million or a maximum annual balance sheet total of EUR 43 million.
The strictest obligations then apply to very large online platforms and very large online search engines. Online platforms or search engines with an average of at least 45 million active users per month in the EU are considered “very large”.
In the following, the obligations that are, in the author’s opinion, most relevant are presented, tiered according to the levels of the pyramid. The presentation is therefore not exhaustive.
Obligations applicable to providers of all intermediary services
Providers without an establishment in an EEA state must first appoint a legal representative in an EEA state (Art. 11). In addition, all providers must establish a central contact point for the supervisory authorities (Art. 10). Unlike the legal representative, this contact point does not have to be physically located in an EEA state. The requirement to provide a “single point of contact” for users (Art. 10a) is likely already met by many providers.
Providers of intermediary services will also have to revise their general terms and conditions (“GTC”) (Art. 12). These must now include, in particular, information about restrictions imposed on the information provided by users. This includes, in particular, information about the processes and methods used for “content moderation”, including algorithmic decision-making and internal complaint procedures. If a service is predominantly directed at minors or heavily used by them, the GTC must be written in language understandable to them.
“Content moderation” refers to the activities of providers to identify, determine and combat content provided by users that is illegal or violates the GTC. Examples of such activities include the removal of the content in question, the suspension of the service or the blocking of the user account. What constitutes illegal content must be determined in light of other EU law and the law of the Member States. Examples of illegal content include terrorist content, the sale of counterfeit products, the sharing of private images without consent, or breaches of consumer law in the provision of services.
Providers of intermediary services must publish annual transparency reports on these activities (and on compliance with any further obligations applicable to them as hosting service providers or as providers of (possibly very large) online platforms or search engines).
Additional obligations for hosting service providers
Hosting service providers are additionally obliged to provide a notice and action procedure for suspected illegal content (Art. 14). In practice, providers are likely to increasingly use an input form (instead of simply providing an e-mail address) in order to facilitate reports containing the elements required under Art. 14(2). Receipt of such a notice means that the provider is deemed to be aware of the presumed illegal content (Art. 14(3)). If the provider does not act quickly on the notice and the content actually proves to be illegal, it loses the liability privilege of Art. 5.
If the provider decides to (i) restrict the visibility of the content (e.g. remove it, block it or downgrade it in a ranking), (ii) restrict the monetization of the content, (iii) block or discontinue the service in whole or in part, or (iv) block or terminate the account of the affected user, it must in principle give a statement of reasons (Art. 15; hereinafter all measures together the “content measures”). The statement of reasons must, in particular, indicate the relevant legal basis or GTC provision from which the inadmissibility of the content results and provide information on legal remedies (such as recourse to the courts).
In addition, hosting service providers must inform the competent authorities about possible crimes against life and limb (Art. 15a).
Additional obligations for online platform providers
For providers of online platforms, there are additional obligations on top of those applicable to hosting services. In particular, they are obliged to provide an internal complaint management system through which complaints about content measures (taken or not taken) can be escalated (Art. 17). The complaints procedure is regulated in detail. As a legal remedy against the provider’s complaint decisions, the DSA provides in particular for out-of-court dispute resolution before a body approved by the Digital Services Coordinator (Art. 18).
It is also noteworthy that, where a user frequently and manifestly uploads illegal content, providers of online platforms will in future be obliged to suspend the service for that user for a reasonable time after prior warning (Art. 19(1)). In Switzerland, a proactive approach by hosting providers (which also include providers of online platforms) has so far only been explicitly regulated in Art. 39d URG. In contrast to Art. 19(1) DSA, which provides for the suspension of the service, Art. 39d URG requires the provider “only” to prevent the re-upload of the work in question.
Furthermore, providers of online platforms may not design user interfaces in a misleading manner (ban on “dark patterns”, Art. 23a). Users should be able to make voluntary and informed decisions. In particular, cancelling a service must be as easy as signing up.
For online advertising, transparency is to be increased by means of additional labeling obligations (Art. 24). In the future, not only must advertising be identified as such, but the advertiser must also be stated. If the advertising is personalized, the most important parameters used for personalization must also be disclosed. Personalized advertising is not permitted (i) on the basis of particularly sensitive personal data or (ii) when directed at minors.
If the online platform prioritizes certain information (e.g. by selecting posts that the platform believes are of particular interest to the user) or otherwise uses a recommender system, further obligations attach. For example, the general terms and conditions must state which criteria are used as the main basis for the recommendations.
If B2C distance contracts can be concluded via the online platform, its provider must identify the traders of the brokered products/services (Art. 24c). Depending on the circumstances, a copy of an identity document or an extract from the commercial register must be requested for identification purposes. In addition, the provider of the online platform must make reasonable efforts to verify the information provided by the trader.
In addition, online platforms through which consumers can conclude distance contracts must be designed in such a way that traders are able, among other things, to provide the mandatory information required under Union law in relation to the products/services they offer (Art. 24d). The providers of such online platforms are further obliged to verify to the best of their ability (“best efforts”) whether the traders comply, for example, with their pre-contractual and product-related information obligations. Only then may traders offer their products or services via the online platform. It should be emphasized at this point that the obligations relating to the brokering of distance contracts must also be observed by SME providers.
Additional obligations applicable to providers of very large online platforms or search engines
Even more extensive obligations apply to providers of very large online platforms or search engines, such as the obligations to
- assess systemic risks annually (Art. 26) and take any necessary risk mitigation measures (Art. 27);
- have an audit carried out annually at their own expense (Art. 28);
- offer at least one option in recommender systems that is not based on profiling (Art. 29);
- appoint an independent (external or internal) compliance officer (Art. 32); and
- summarize the GTC in a clear and understandable manner (Art. 12(2a)).
Another new feature is that very large online platforms or search engines will be regulated similarly to critical infrastructure and can be required by the EU Commission to take specific measures in the event of a crisis (Art. 27a). It is conceivable, for example, that providers will have to display certain warnings at the request of the Commission.
Sanctions and enforcement
The DSA is to be enforced, among other things, by means of a users’ right of complaint to the Digital Services Coordinator (Art. 43) and through co-regulation. For example, “codes of conduct” are to be drawn up together with providers and other stakeholders to address systemic risks or online advertising (Art. 35 ff.).
In addition, certain violations of the DSA may be punished with fines of up to 6% of worldwide annual turnover. Periodic penalty payments may amount to up to 5% of average worldwide daily turnover or income.
The competent authorities are also given extensive investigative and supervisory powers, such as the right to conduct audits or to order interim measures. For very large online platforms/search engines, the EU Commission is the competent authority.
Evaluation and practical advice
The first striking feature is the DSA’s primarily public-law approach to regulation. Unlike the GDPR, which regulates the rights of data subjects in detail (e.g. rights of access or deletion), the DSA is essentially limited to supervisory obligations and does not provide for a (civil-law) catalog of claims for users, rights holders and/or other parties affected by unlawful content. Such claims are governed by other national or EU law (such as unfair competition law, data protection law or intellectual property law).
The graduated catalog of duties according to the type of service, which also takes SMEs into account, is to be welcomed.
In the author’s view, however, the obligations of online platform providers with regard to distance contracts are excessive (and these obligations unfortunately also apply to SMEs). This applies in particular to the obligation of providers to check whether traders meet their pre-contractual information obligations (e.g. the consumer’s right of withdrawal) and any further information obligations (e.g. with regard to product safety). In this way, providers are moved away from the role of intermediaries and closer to that of content providers who exercise control over information. The “best efforts” standard attached to this verification obligation is well-intentioned, but changes little in the end. Even under a “best efforts” obligation, the provider must at least take action and engage with the information requirements potentially applicable to the trader. This is likely to involve a considerable amount of time and effort that SMEs can hardly shoulder. This is all the more true as pre-contractual information requirements are constantly being expanded in EU law (as envisaged, for example, in the Commission’s proposal for a Data Act with regard to connected devices).
Further, the DSA could have been better dovetailed with certain other EU legislation. This includes, for example, the AI Act, which contains transparency and other obligations with regard to artificial intelligence systems. The overlap between the DSA and Art. 17 of the Copyright Directive EU 2019/790 (“DSM Directive”) is also striking. Art. 17(4)(c) DSM Directive requires service providers, upon receipt of a duly substantiated notice from a rightholder, to act expeditiously to block access to the notified copyright-protected content or to remove it. The DSA does not resolve this interplay: it remains open whether a notice is to be considered “sufficiently substantiated” only if it contains the details set out in Art. 14(2) DSA. Moreover, while under Art. 17(4)(c) DSM Directive the service provider must block or delete the allegedly infringing content without undue delay in order not to incur liability itself, under Art. 17(3) DSA it may be required to reverse the blocking or deletion as a result of a complaint decision. The DSM Directive, in contrast, does not provide for such an internal escalation mechanism by way of a complaint.
The practical application of the DSA is therefore likely to raise interesting questions for companies. Companies should keep in mind that the EU Commission has learned a lot since the GDPR. For example, the enforcement gap criticized in part under the GDPR is addressed for cases in which the lead supervisory authority remains inactive: under certain conditions, the EU Commission can cause the (hitherto inactive) Digital Services Coordinator to take investigative and supervisory measures (Art. 45a(3)).
Based on the Federal Council’s statement of August 25, 2021, an autonomous adoption of the DSA by Switzerland is rather not to be expected. According to that statement, the Federal Administration intends to take measures, if necessary, to prevent Switzerland from suffering disadvantages as a result of the DSA (for example with regard to market access restrictions). As far as can be seen, no such measures have been communicated to date.