Digital Decade: The Digital Services Act
Exploring the legislative landscape for digital platforms and services in the EU
Introduction
The Digital Services Act (the "DSA") will regulate how digital platforms and services handle illegal or potentially harmful online content by establishing a powerful transparency and accountability framework. The obligations placed on different online companies are designed to be proportionate to their role, size and impact in the online ecosystem.
Background and status update
On 5 July 2022, the European Parliament formally adopted the DSA alongside the Digital Markets Act (the "DMA") as part of the "Digital Services Package", which aims to improve online safety and to enhance users' fundamental rights online. The DSA was published in the Official Journal of the EU on 27 October 2022.
Effective date
The DSA came into force in the EU on 16 November 2022 and the majority of its provisions will apply from 17 February 2024.
Deep dive
Key provisions
- What is its primary purpose?
The DSA will regulate how digital platforms and services handle illegal or potentially harmful online content by establishing a powerful transparency and accountability framework.
- Who does it apply to?
The DSA is far-reaching in scope and applies to providers of "intermediary services".
Intermediary services consist of the following:
"Mere conduit" services | services that provide access to, or the transmission of, information over a communication network. Examples of mere conduit services include internet exchange points, virtual private networks, domain name registries and voice over IP. |
"Caching" services | services that provide for the automatic, intermediate and temporary storage of information for technical purposes to facilitate the efficient transmission of information to other users. Examples of caching services include content delivery networks, reverse proxies and content adaptation proxies. |
"Hosting" services | services that provide for the permanent storage of information at the request of a user. Examples of hosting services include cloud computing services, web hosting services or any other service which allows for the sharing of information or content online. |
Online search engines | services that perform searches of all websites in line with queries submitted by recipients of the service. |
- Territorial scope
The DSA applies to intermediary services whose users (i) are located in the EU or (ii) have their place of establishment in the EU, irrespective of where the provider of the intermediary service is itself established.
- Key provisions / requirements
The obligations imposed by the DSA attach to the specific service provided, as opposed to the service provider. As a result, an organisation could be subject to more stringent obligations in respect of certain services it provides and less stringent obligations in respect of other separate and distinguishable services.
Generally, the DSA imposes:
- new obligations to identify and remove illegal content, transparency obligations in relation to the steps taken to combat illegal content, and transparency requirements in relation to online advertisements;
- requirements for special protection measures where platforms are accessible by minors, to ensure their online safety; and
- a ban on the use of "dark patterns", and a ban on advertising targeted at minors on the basis of their personal data.
The obligations in the DSA are structured to match the size, role and impact of the specific online intermediary. In this respect, micro and small enterprises (namely companies with fewer than 50 employees and an annual turnover and/or annual balance sheet total not exceeding €10 million) benefit from certain exclusions under the DSA so as to avoid a disproportionate burden.
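For illustration only, the size-based exclusion reduces to a simple numerical check. The sketch below is a hypothetical Python helper (the function and parameter names are our own invention); it encodes the headcount and financial ceilings for micro and small enterprises under Commission Recommendation 2003/361/EC and is no substitute for the formal legal test.

```python
# Illustrative sketch only: encodes the micro/small enterprise ceilings
# described above. Function and parameter names are hypothetical.

def is_micro_or_small_enterprise(
    employees: int,
    annual_turnover_eur: float,
    balance_sheet_eur: float,
) -> bool:
    """Rough check against the 'small enterprise' ceilings.

    A small enterprise has fewer than 50 employees and an annual turnover
    and/or annual balance sheet total not exceeding EUR 10 million; micro
    enterprises fall below these ceilings by definition.
    """
    headcount_ok = employees < 50
    # The financial condition is met if either figure is within the ceiling.
    financials_ok = (
        annual_turnover_eur <= 10_000_000 or balance_sheet_eur <= 10_000_000
    )
    return headcount_ok and financials_ok


# Example: 30 employees, EUR 8m turnover, EUR 12m balance sheet.
print(is_micro_or_small_enterprise(30, 8_000_000, 12_000_000))  # True
```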
The DSA adopts a tiered approach to regulation by imposing varying obligations on different categories of intermediary service (subject to the exclusions referred to above); a simplified classification sketch follows the list below:
- Intermediary service providers: The DSA contains a number of general obligations that apply to all providers of intermediary services. Such obligations include, but are not limited to, requirements to designate single points of contact and/or legal representatives to facilitate communication with relevant authorities. Other requirements relate to the content and accessibility of the intermediary service's terms and conditions and the publication of annual reports on content moderation.
- Hosting services: The definition of "hosting service" encapsulates a number of sub-categories that are subject to varying requirements. Generally, however, all hosting services are subject to standard obligations intended to ensure that steps are taken to identify and remove illegal content, for example by implementing notification mechanisms for suspected illegal content and notifying law enforcement of suspected criminal offences that are taking place or are likely to take place. In addition, hosting services are required to provide users with a transparent statement of reasons where content is removed from the service, or a user's access to the service is restricted, on account of illegal content or activity, or of content or activity that contravenes the organisation's terms and conditions.
- Online platforms: Online platforms are a sub-category of hosting service that, as part of the service provided and at the request of the user, make information available to a potentially unlimited number of third parties – for example, social media networks, online marketplaces and app stores. The key differentiator from other hosting services is this dissemination of information to the public (provided such dissemination is more than a merely minor or ancillary feature or function of the service). In addition to increased obligations relating to complaint handling, advertising transparency and enabling users to make free and informed decisions, online platforms are subject to further requirements intended to facilitate the identification and removal of illegal content. For example, online platforms are required to prioritise notices submitted by "trusted flaggers" (i.e. independent entities that have been designated such status by the relevant authorities on the basis of their particular expertise and competence in tackling illegal content). In addition, online platforms are specifically required to put measures in place to ensure high levels of privacy, safety and security for minors.
- Online marketplaces: Online marketplaces (i.e. online platforms that allow consumers to conclude distance contracts with traders) are subject to further requirements, in addition to those specified above. Such additional requirements include the implementation of measures that are intended to ensure adequate traceability of traders and to assess the reliability of information provided by such traders, including in relation to the guarantee that the products and services offered on the marketplace comply with applicable laws.
- Very large online platforms and search engines: The most stringent requirements under the DSA apply to "very large online platforms" ("VLOPs") and "very large online search engines" ("VLOSEs"). The European Commission is responsible for designating entities as VLOPs or VLOSEs; only those services with at least 45 million average monthly active users in the EU will qualify. Under the DSA, VLOPs and VLOSEs must, among other requirements, identify, analyse and mitigate systemic risks, undergo annual audits and establish an independent compliance function to monitor and assess compliance with the DSA. On 25 April 2023, the European Commission designated 17 VLOPs and 2 VLOSEs: DSA Designations.
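As flagged above, the tiered structure can be visualised as a decision ladder. The following Python sketch is purely illustrative: the enum, function and input flags are hypothetical simplifications, and real classification (in particular VLOP/VLOSE status, which turns on formal designation by the European Commission) requires legal analysis of each service.

```python
# Illustrative sketch only: a simplified decision ladder mirroring the
# DSA's tiered categories described above. All names are hypothetical.

from enum import Enum


class DsaTier(Enum):
    INTERMEDIARY = "intermediary service (baseline obligations)"
    HOSTING = "hosting service"
    ONLINE_PLATFORM = "online platform"
    ONLINE_MARKETPLACE = "online marketplace"
    VLOP = "very large online platform (VLOP)"


VLOP_THRESHOLD = 45_000_000  # average monthly active users in the EU


def classify_service(
    stores_user_content: bool,
    disseminates_to_public: bool,
    enables_contracts_with_traders: bool,
    avg_monthly_active_eu_users: int,
) -> DsaTier:
    """Walk down the tiers; each tier adds to the obligations above it."""
    if not stores_user_content:
        return DsaTier.INTERMEDIARY  # e.g. mere conduit or caching services
    if not disseminates_to_public:
        return DsaTier.HOSTING  # e.g. private cloud storage
    if avg_monthly_active_eu_users >= VLOP_THRESHOLD:
        return DsaTier.VLOP  # designation itself rests with the Commission
    if enables_contracts_with_traders:
        return DsaTier.ONLINE_MARKETPLACE
    return DsaTier.ONLINE_PLATFORM


# Example: a social network with 50 million EU users sits in the VLOP tier.
print(classify_service(True, True, False, 50_000_000).value)
```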
- Regulatory supervision
Under the DSA, the European Commission is granted powers to directly supervise VLOPs and VLOSEs. In addition, the DSA requires each Member State to appoint a "Digital Services Coordinator", who will be responsible for supervising other intermediary service providers that fall within the regulatory ambit of the DSA.
- Enforcement and fines
Non-compliance with the DSA could lead to fines of up to 6% of global turnover or a ban on operating in the EU single market for repeat serious offenders.
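To put the headline figure in context, here is a minimal arithmetic sketch (the turnover figure is invented for the example):

```python
# Illustrative arithmetic only: the DSA caps fines at 6% of global annual
# turnover. The company turnover below is a hypothetical figure.

MAX_FINE_RATE = 0.06

global_annual_turnover_eur = 2_000_000_000  # hypothetical EUR 2bn turnover
max_fine_eur = global_annual_turnover_eur * MAX_FINE_RATE

print(f"Maximum possible fine: EUR {max_fine_eur:,.0f}")  # EUR 120,000,000
```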
Other regulatory developments to be aware of
A number of other regulatory developments, both in the EU and UK, are being rolled out to increase responsibilities for digital services and to enhance safeguards for users. Impacted businesses must therefore understand how these may affect their current and future digital platform endeavours. These developments include:
- The EU's Digital Markets Act – set to regulate the main services provided by the largest online platforms operating in the EU. It will require these platforms, known as "gatekeepers", to comply with a series of obligations and prohibitions in their daily operations, with the aim of ensuring fair and open digital markets.
- The UK Data Protection and Digital Information Bill – set to change (among other things) the rules on how personal data is processed by automated systems. Our data protection bulletin (published regularly on our hub here) is tracking the progress of this new legislation.
- The UK Online Safety Bill – seeking to regulate certain online services and to tackle illegal and harmful online content. It contains provisions broadly equivalent to those of the DSA in the EU.
What you should be doing now in preparation
In advance of February 2024, organisations should take steps to assess and understand the requirements to which they are subject under the DSA and to prepare for the new requirements that lie ahead. Actions to consider include:
- mapping the scope of services provided and taking steps to understand the specific obligations and requirements that attach to each category of service;
- implementing policies to ensure operational compliance with applicable requirements under the DSA; and
- carrying out risk assessments to determine whether adequate systems and procedures are in place to identify and remove illegal content.
Conclusion: our thoughts
With increased regulation of digital platforms and services, providers of intermediary services must adapt their practices to ensure the responsible and transparent management of online risks and to proactively manage illegal content. Given the significant risk of enforcement action for failure to comply, we encourage organisations to take proactive steps, including to:
- assess the extent to which the DSA will apply to them, and develop a roadmap for compliance;
- review existing policies and processes to ensure they are appropriate and update them where necessary for the new rules in the DSA;
- implement technology solutions that can aid compliance, such as content filtering and data management systems, and reporting, take down and transparency tools; and
- consider how the DSA rules may interrelate with other legislation applicable to the business, including the Digital Markets Act and Online Safety Bill, and pending regulation of the use of AI (such as under the EU's AI Act).