What is the EU Digital Services Act?
The Digital Services Act (DSA) is an EU Regulation (2022/2065) that aims to ensure fairness, trust, and safety in the digital environment. It preserves and upgrades the liability exemptions for online intermediaries that have existed in the European framework since 2000. It exempts digital infrastructure-layer services, such as internet access providers, and application-layer services, such as social networks and file-hosting services, from liability for third-party content.
At the same time, the DSA imposes due diligence obligations concerning the design and operation of such services, in order to ensure a safe, transparent, and predictable online ecosystem. These obligations regulate the general design of services, content moderation practices, advertising, and transparency, including the sharing of information. They focus mainly on process and design rather than on the content itself, and generally scale with the size and social relevance of a service. Very large online platforms and very large online search engines are subject to the most extensive risk mitigation responsibilities, which are in turn subject to independent auditing.
Overview of the due diligence obligations
The real novelty of the DSA lies in the creation of fully fledged tiers of due diligence obligations (see the picture above). Instead of “liability”, the DSA insists on “responsibility” that grows with a service’s size and societal impact. These obligations are not preconditions of the liability exemptions. However, they are clearly meant to legislatively balance the liability assurances with societal responsibilities as to trust, safety, and fairness in these services. Like all due diligence obligations, they relate more to process than to substance.
The goal of the due diligence obligations is to strengthen the fundamental rights of individuals. The tiers of obligations apply depending on the type and size of the service. Some services are subject only to the universal layer of obligations, while others are subject to all four layers. For Very Large Online Platforms (VLOPs), the most regulated of all services, the due diligence obligations have a layered cake structure, with special obligations at the very top and universal obligations at the bottom. Chapter III of the DSA is structured around the four basic tiers of due diligence obligations depicted in the picture above.
Key due diligence obligations
- Content moderation: providers of digital services must disclose the restrictions they impose on user-generated content, designate a point of contact and offer submission interfaces for electronic notices, explain their individual decisions through statements of reasons, and sometimes also provide free-of-charge appeal mechanisms. Content moderation is understood broadly to include any visibility restrictions on user-generated content.
- Fair design practices: providers of digital services are prohibited from serving advertising based on profiling that uses sensitive personal data of adults, or any personal data of children; they must disclose who pays for advertising, collect information from those who sell to consumers on a platform, refrain from manipulative and aggressive design practices, and protect minors. Very large online platforms (VLOPs) must further publish advertising archives and allow users to opt out of profiling-based recommender systems in favour of ranking that is not based on profiling.
- Risk management: providers of digital services must protect minors, and if they attract more than 45 million monthly active users in the EU, they will be designated as very large online platforms (VLOPs). As a result, they must conduct annual risk assessments and audits, and assess the risks of any new features that can have a critical impact.
For a very short overview of the DSA, watch the recording of my talk at Stanford Cyberpolicy Center.
Who is regulated?
- Infrastructure Services: providers of internet access, content delivery networks, caching services, and other technical services are regulated only in a light-touch way.
- Hosting Services: providers of digital services that store user-generated content have obligations with respect to content moderation, such as prior disclosure of rules, explanation of individual decisions, and notification of crimes to authorities.
- Online Platforms: providers of digital services that store user-generated content and distribute it to the public are regulated more extensively. They have additional content moderation obligations (e.g., internal and external appeals), and fair design obligations (e.g., transparency obligations, protection of minors).
- Very Large Online Platforms (VLOPs): providers of digital services that store user-generated content, distribute it to the public, and have more than 45 million monthly active users in the EU have the most obligations. They must generally assess and mitigate risks on their platform and undergo annual audits.
- Very Large Online Search Engines (VLOSEs): providers of search engines with more than 45 million monthly active users in the EU are subject to obligations similar to those of very large online platforms.
Further resources
For further resources, consult https://husovec.eu/dsa/