EU’s Digital Services Act Just Became Applicable: Outlining Ten Key Areas of Interplay with the GDPR
DSA: What’s in a Name?
The European Union’s (EU) Digital Services Act (DSA) is a first-of-its-kind regulatory framework, with which the bloc hopes to set an international benchmark for regulating online intermediaries and improving online safety. The DSA establishes a range of legal obligations, from content removal requirements and prohibitions on manipulative design and on displaying certain online advertising targeted at users profiled on the basis of sensitive characteristics, to sweeping accountability obligations requiring audits of algorithms and assessments of systemic risks for the largest platforms.
The DSA is part of the EU’s effort to expand its digital regulatory framework to address the challenges posed by online services. It reflects the EU’s regulatory approach of comprehensive legal frameworks which strive to protect fundamental rights, including in digital environments. The DSA should not be read by itself: it is applicable on top of the EU’s General Data Protection Regulation (GDPR), alongside the Digital Markets Act (DMA), as well as other regulations and directives of the EU’s Data Strategy legislative package.
The Act introduces strong protections against both individual and systemic harms online, and also places digital platforms under a unique new transparency and accountability framework. To address the varying levels of risks and responsibilities associated with different types of digital services, the Act distinguishes online intermediaries depending on the type of business service, size, and impact, setting up different levels of obligations.
Given the structural and “systemic” significance of certain firms in the digital services ecosystem, the regulation places stricter obligations on Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs). These firms will have to abide by higher transparency standards, provide access to (personal) data to competent authorities and researchers, and identify, analyze, assess, and mitigate systemic risks linked to their services. Such systemic risks have been classified into four different categories (Recitals 80-84): illegal content; fundamental rights (freedom of expression, media pluralism, children’s rights, consumer protection, and non-discrimination, inter alia); public security and electoral/democratic processes; and public health protection, with a specific focus on minors, physical and mental well-being, and gender-based violence.
The European Commission designated VLOPs and VLOSEs earlier this year (see Table 1), based on criteria laid out in the DSA and a threshold of 45 million average monthly active users across the EU. The DSA obligations for these designated online platforms became applicable on August 25, 2023, with the exception of a transparency database whose publication was postponed for a month following complaints. The full regulation becomes applicable to all covered entities on February 17, 2024.
Table 1 – Mapping the 19 Designated Companies by the European Commission (April 2023)
Type of service | Company | Digital Service | Type of designation |
Social Media | Alphabet | YouTube | VLOP |
Social Media | Meta | Facebook | VLOP |
Social Media | Meta | Instagram | VLOP |
Social Media | Bytedance | TikTok | VLOP |
Social Media | Microsoft | LinkedIn | VLOP |
Social Media | Snap | Snapchat | VLOP |
Social Media | Pinterest | Pinterest | VLOP |
Social Media | X (formerly known as Twitter) | X (formerly known as Twitter) | VLOP |
App Stores | Alphabet | Google App Store | VLOP |
App Stores | Apple | Apple App Store | VLOP |
Wiki | Wikimedia | Wikipedia | VLOP |
Marketplaces | Amazon* | Amazon Marketplace | VLOP |
Marketplaces | Alphabet | Google Shopping | VLOP |
Marketplaces | Alibaba | AliExpress | VLOP |
Marketplaces | Booking.com | Booking.com | VLOP |
Marketplaces | Zalando* | Zalando | VLOP |
Maps | Alphabet | Google Maps | VLOP |
Search | Alphabet | Google Search | VLOSE |
Search | Microsoft | Bing | VLOSE |
* Companies that have challenged their VLOP designation; the General Court of the EU will hear these challenges and determine whether the European Commission’s designation is upheld.
However, VLOPs and VLOSEs are not the only regulated entities. All intermediaries that offer their services to users based in the EU, including online platforms such as app stores, collaborative economy platforms, and social media platforms, fall within the scope of the regulation, regardless of their number of users. Notably, micro and small-sized enterprises that do not meet the VLOP/VLOSE criteria, as defined by EU law, are exempted from some of the legal obligations. While “regular” online platforms face scaled-down requirements compared to VLOPs/VLOSEs, their new legal obligations are nonetheless significant and include, among others, transparency regarding their recommender systems, setting up internal complaint-handling mechanisms, prohibitions on designing their platforms in a way that deceives or manipulates users, and prohibitions on presenting ads based on profiling using special categories of personal data or the personal data of minors.
All providers of intermediary services, including online platforms, covered by the DSA are also “controllers” under the GDPR to the extent that they process personal data and decide on the means and purposes of such processing. As a consequence, they have to comply with both these legal frameworks at the same time. While the DSA stipulates, pursuant to Recital 10, that the GDPR and the ePrivacy Directive serve as governing rules for personal data protection, some DSA provisions intertwine with GDPR obligations in complex ways, requiring further analysis. For instance, some of the key obligations in the DSA refer to “profiling” as defined by the GDPR, while others create a legal requirement for VLOPs and VLOSEs to give access to personal data to researchers or competent authorities.
After a brief overview of the scope of application of the DSA and a summary of its key obligations based on the type of covered entity (see Table 2), this blog maps out ten key areas where the DSA and the GDPR interact in consequential ways and reflects on the impact of this interaction on the enforcement of the DSA. The ten interplay areas we are highlighting are:
- Manipulative design in online interfaces;
- Targeted advertising based on sensitive data;
- Targeted advertising and protection of minors;
- Recommender systems and advertising transparency;
- Recommender systems free-of-profiling;
- Access to data for researchers and competent authorities;
- Takedown of illegal content;
- Risk Assessments;
- Compliance function and the DSA legal representative;
- Intermediary liability and the obligation to provide information.
The DSA Applies to Intermediary Services of Various Types and Sizes and Has Broad Extraterritorial Effect
The DSA puts in place a horizontal framework of layered responsibilities targeted at different types of online intermediary services, including:
(1) Intermediary services offering network infrastructure | Including “mere conduit services” (e.g. internet access, content delivery networks, WiFi hotspots); “caching services” (e.g. automatic, intermediate, and temporary storage of information); and “hosting services” (e.g. cloud and web-hosting services). |
(2) Online platform services | Providers bringing together sellers and consumers, such as online marketplaces, app stores, collaborative economy platforms, social media platforms, and providers that disseminate information to the public. |
(3) Very Large Online Platforms (VLOPs) | Reaching at least 45 million active recipients in the EU on a monthly basis (10% of the EU population). |
(4) Very Large Online Search Engines (VLOSEs) | Reaching at least 45 million active recipients in the EU on a monthly basis (10% of the EU population). |
Recitals 13 and 14 of the DSA highlight the importance of “disseminating information to the public” as a benchmark for which online platforms fall under the scope of the Regulation and the specific category of hosting services. For instance, Recital 14 explains that emails or private messaging services fall outside the definition of online platforms “as they are used for interpersonal communication between a finite number of persons determined by the sender of the communication.” However, the DSA obligations for online platforms may still apply to them if such services “allow the making available of information to a potentially unlimited number of recipients, … such as through public groups or open channels.”
Important carve-outs are made in the DSA for micro and small-sized enterprises, as defined by EU law, that do not meet the VLOP/VLOSE criteria. These firms are exempted from some of the legal obligations: in particular, from making available an annual report on the content moderation they engage in; from the more substantial additional obligations imposed on providers of online platforms in Articles 20 to 28, such as the prohibition on displaying ads based on profiling using special categories of personal data; and from the obligations in Articles 29 to 32 for platforms allowing consumers to conclude distance contracts with traders.
These carve-outs come in contrast with the broad applicability of the GDPR to entities of all sizes. This means, for instance, that even if micro and small-sized enterprises that are online platforms do not have to comply with the prohibitions related to displaying ads based on profiling using special categories of personal data and profiling of minors, they continue to fall under the scope of the GDPR and its requirements that impact such profiling.
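To make the carve-out concrete, here is a minimal Python sketch of how a provider might check whether it falls within the SME exemption, using the headcount and turnover thresholds of the EU SME definition (Recommendation 2003/361/EC). The function name and simplified thresholds are ours for illustration only; designation as a VLOP/VLOSE overrides size in any event.

```python
# Illustrative only: a rough eligibility check for the DSA's SME carve-out,
# using the EU SME Recommendation 2003/361 thresholds (micro: <10 staff and
# <=EUR 2M turnover; small: <50 staff and <=EUR 10M turnover). A provider
# designated as a VLOP/VLOSE loses the exemption regardless of size.
def exempt_from_platform_obligations(staff: int, turnover_eur: float,
                                     designated_vlop_or_vlose: bool) -> bool:
    is_micro = staff < 10 and turnover_eur <= 2_000_000
    is_small = staff < 50 and turnover_eur <= 10_000_000
    return (is_micro or is_small) and not designated_vlop_or_vlose

print(exempt_from_platform_obligations(8, 1_500_000, False))  # True
print(exempt_from_platform_obligations(8, 1_500_000, True))   # False
```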
The DSA has extra-territorial effect and global coverage, similar to the GDPR, since it captures companies regardless of whether they are established in the EU or not, as long as the recipients of their services have their place of establishment or are located in the EU (Article 2).
The DSA Just Became Applicable to VLOPs and VLOSEs and Will Continue to Roll Out to All Online Platforms
The Act requires that platforms and search engines publish their average monthly number of active users/recipients within the EU-27 (Article 24 – see the European Commission’s guidance on the matter). The first round of sharing those numbers was due on February 17, 2023. Based on the information shared through that exercise, the Commission designated the VLOPs and VLOSEs, which carry additional obligations because of the “systemic risks that they pose to consumers and society” (Article 33). The designation announcement was made public on April 25, 2023.
Four months after the designation, on August 25, 2023, the DSA provisions became applicable to VLOPs and VLOSEs through Article 92. This means that the designated platforms must already be implementing their obligations, such as conducting risk assessments, increasing the transparency of their recommender systems, and offering an alternative feed generated by recommender systems not based on profiling (see an overview of their obligations in Table 2).
As of February 17, 2024, all providers of intermediary services must comply with a set of general obligations (Articles 11-32), with certain exceptions for micro and small enterprises as explained above.
Table 2 – List of DSA Obligations as Distributed Among Different Categories of Intermediary Service Providers
Pillar obligations | Set of Obligations | Intermediary Services | Hosting Services | Online Platforms | VLOPs/VLOSEs |
Transparency measures | Transparency reporting (Article 15) | 🚩 | 🚩 | 🚩 | 🚩 |
| Requirements on terms and conditions with regard to fundamental rights (Article 14) | 🚩 | 🚩 | 🚩 | 🚩 |
| Statement of reasons (Article 17) | | 🚩 | 🚩 | 🚩 |
| Notice-and-action and obligation to provide information to users (Article 16) | | 🚩 | 🚩 | 🚩 |
| Recommender system transparency (Articles 27 and 38) | | | 🚩 | 🚩 |
| User-facing transparency of online advertising (Article 24) | | | 🚩 | 🚩 |
| Online advertising transparency (Article 39) | | | | 🚩 |
| User choice for access to information (Article 42) | | | | 🚩 |
Oversight structure to address the complexity of the online intermediary services ecosystem | Cooperation with national authorities following orders (Article 11) | 🚩 | 🚩 | 🚩 | 🚩 |
| Points of contact for recipients of service (Article 12) and, where necessary, legal representatives (Article 13) | 🚩 | 🚩 | 🚩 | 🚩 |
| Internal complaint-handling system (Article 20), redress mechanism (Article 32), and out-of-court dispute settlement (Article 21) | | | 🚩 | 🚩 |
| Independent auditing and public accountability (Article 37) | | | | 🚩 |
| Option for recommender systems not based on profiling (Article 38) | | | | 🚩 |
| Supervisory fee (Article 43) | | | | 🚩 |
| Crisis response mechanism and cooperation process (Article 36) | | | | 🚩 |
Manipulative Design | Online interface design and organization (Article 25) | | | 🚩 | 🚩 |
Measures to counter illegal goods, services, or content online | Trusted flaggers (Article 22) | | | 🚩 | 🚩 |
| Measures and protection against misuse (Article 23) | | | 🚩 | 🚩 |
| Targeted advertising based on sensitive data (Article 26) | | | 🚩 | 🚩 |
| Online protection of minors (Article 28) | | | 🚩 | 🚩 |
| Traceability of traders (Articles 30-32) | | | 🚩 | 🚩 |
| Reporting criminal offenses (Article 18) | | 🚩 | 🚩 | 🚩 |
| Risk management obligations and compliance officer (Article 41) | | | | 🚩 |
| Risk assessment and mitigation of risks (Articles 34-35) | | | | 🚩 |
| Codes of conduct (Articles 45-47) | | | | 🚩 |
Access to data for researchers | Data sharing with authorities and researchers (Article 40) | | | | 🚩 |
From Risk Assessments to Profiling and Transparency Requirements – Key Points of Interplay Between the DSA and GDPR
While the DSA and the GDPR serve different purposes at face value, both ultimately aim to protect fundamental rights in a data-driven economy and society, on the one hand, and to reinforce the European single market, on the other. The DSA aims to establish rules for digital services and their responsibilities toward content moderation and combating systemic risks, so as to ensure user safety, safeguard fairness and trust in the digital environment, and enhance a “single market for digital services.” Notably, providing digital services is inextricably linked to processing data, including personal data. The GDPR seeks to protect individuals in relation to how their personal data is processed, ensuring that such processing respects their fundamental rights, while also promoting the free movement of personal data within the EU.
While the two regulations do not have the same taxonomy of regulated actors, the GDPR’s definitions of “controller” and “processing of personal data” are so broad that all intermediaries covered by the DSA are also controllers under the GDPR in relation to any processing of personal data for which they establish the means and purposes. Some intermediaries might also be “processors” under the GDPR in specific situations, a fact that needs to be assessed on a case-by-case basis. Overall, this overlap triggers the application of both regulations, with the GDPR seemingly taking precedence over most of the DSA (Recital 10 of the DSA). The exception is the DSA’s intermediary liability rules, which update those of the eCommerce Directive and take precedence over the GDPR (Article 2(4) of the GDPR).
The DSA mentions the GDPR 19 times in its text across recitals and articles, with “profiling” as defined by the GDPR playing a prominent role in core obligations for all online platforms. These include the two prohibitions on displaying ads based on profiling that uses sensitive personal data or the data of minors, and the obligation for any VLOP or VLOSE that uses recommender systems to provide at least one option not based on profiling. The GDPR also supplies the definition of sensitive data (“special categories of data”) in its Article 9, to which the DSA specifically refers for the prohibition on displaying ads based on profiling of such data. Beyond these cross-references, where it will be essential to apply the two legal frameworks consistently, there are other areas of overlap that create complexity for compliance at a minimum, as well as risks of inconsistency (such as between the DSA risk assessment processes and the GDPR Data Protection Impact Assessment). Further overlaps may leave individuals confused about which legal framework is best to rely on for removing their personal data from online platforms: the DSA sets up a framework for takedown requests for illegal content that may also include personal data, while the GDPR provides individuals with the right to obtain erasure of their personal data in specific contexts.
In this complex web of legal provisions, here are the elements of interaction between the two legal frameworks that stand out. As the applicability of the DSA rolls out on top of GDPR compliance programs and mechanisms, other such areas may surface.
- Manipulative Design (or “Dark Patterns”) in Online Interfaces
These are practices that “materially distort or impair, either on purpose or in effect, the ability of recipients of the service to make autonomous and informed choices or decisions,” per Recital 67 DSA. Both the GDPR and the DSA address these practices, either directly or indirectly. The GDPR, on the one hand, offers protection against manipulative design in cases that involve processing of personal data. The protections are relevant for complying with provisions detailing lawful grounds for processing, requiring data minimization, setting out how valid consent can be obtained and withdrawn, or how controllers must apply Data Protection by Design and by Default when building their systems and processes.
Building on this ground, Article 25 of the DSA, read in conjunction with Recital 67, bans providers of online platforms from designing, organizing, or operating “their online interfaces in a way that deceives or manipulates the recipients of their service or in a way that otherwise materially distorts or impairs the ability of the recipients of their service to make free and informed decisions.” The ban seems to apply only to online platforms as defined in Article 3(i) of the DSA, a subcategory of the wide spectrum of intermediary services. Importantly, the DSA specifies that the ban on dark patterns does not apply to practices covered by the Unfair Commercial Practices Directive (UCPD) or the GDPR. Article 25(3) of the DSA empowers the Commission to issue guidelines on how the ban on manipulative design applies to specific practices, so further clarity is expected. And since the protection the GDPR vests against manipulative design will remain relevant and primarily applicable, it will be essential for consistency that these guidelines are developed in close collaboration with Data Protection Authorities (DPAs).
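As an illustration of the kind of practice Article 25 targets, the following minimal sketch scans the configuration of a hypothetical choice dialog for red flags echoed in Recital 67: pre-selected options, choices that are not equally easy to exercise, and re-prompting a user who has already refused. All field names and the dialog model are invented; this is a thought exercise, not a compliance test.

```python
# Illustrative only: a hypothetical pre-release check that scans a choice
# dialog's configuration for common manipulative-design red flags named in
# Recital 67 DSA. The dialog model and field names are invented.
from dataclasses import dataclass, field

@dataclass
class ChoiceDialog:
    options: list[str]                      # e.g. ["accept", "reject"]
    preselected: str | None = None          # option ticked by default
    clicks_to_reach: dict[str, int] = field(default_factory=dict)
    reprompt_after_refusal: bool = False    # ask again after the user said no

def dark_pattern_flags(dialog: ChoiceDialog) -> list[str]:
    flags = []
    if dialog.preselected is not None:
        flags.append("option pre-selected for the user")
    clicks = [dialog.clicks_to_reach.get(o, 1) for o in dialog.options]
    if max(clicks) > min(clicks):
        flags.append("choices are not equally easy to exercise")
    if dialog.reprompt_after_refusal:
        flags.append("user re-prompted after already refusing")
    return flags

dialog = ChoiceDialog(
    options=["accept", "reject"],
    preselected="accept",
    clicks_to_reach={"accept": 1, "reject": 3},
    reprompt_after_refusal=True,
)
print(dark_pattern_flags(dialog))  # prints all three red flags
```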
- Targeted Advertising Based on Sensitive Data
Article 26(3) and Recital 68 of the DSA prohibit providers of online platforms from “presenting” ads to users based on profiling them, as defined by Article 4(4) of the GDPR, using sensitive personal data, as defined by Article 9 of the GDPR. Such personal data includes race, religion, health status, and sexual orientation, among others on a limited list. However, it is important to mention that case law from the Court of Justice of the EU (CJEU) may further complicate the application of this provision. In particular, in Case C-184/20 OT, in a judgment published in August 2022, the Court expanded “special categories of personal data” under the GDPR to also cover any personal data from which a sensitive characteristic may be inferred. Additionally, the very recent CJEU judgment in Case C-252/21 Meta v. Bundeskartellamt makes important findings regarding how social media services, as a category of online platforms, can lawfully engage in profiling of their users pursuant to the GDPR, including for personalized ads. While the DSA prohibition is concerned with “presenting” ads based on profiling using sensitive data, rather than with the activity of profiling itself, it must be read in conjunction with the GDPR’s obligations for processing personal data for profiling and with the relevant CJEU case law. To this end, the European Data Protection Board has published relevant guidelines on automated decision-making and profiling in general, as well as specifically on targeting of social media users.
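To make the prohibition tangible, here is a deliberately simplified sketch of an ad-serving gate that declines to present a profiled ad when any targeting segment maps to an Article 9 category, including segments from which a sensitive trait could merely be inferred, in the spirit of C-184/20. The segment taxonomy and the mapping are invented for illustration.

```python
# Illustrative only: a hypothetical gate that refuses to serve a profiled ad
# when any targeting segment maps to a GDPR Article 9 special category.
ARTICLE_9_CATEGORIES = {
    "racial or ethnic origin", "political opinions", "religious beliefs",
    "trade union membership", "genetic data", "biometric data",
    "health", "sex life or sexual orientation",
}

# Invented mapping from ad-targeting segments to Article 9 categories,
# including segments that merely allow a sensitive *inference* (C-184/20).
SEGMENT_TO_CATEGORY = {
    "interested_in_diabetes_care": "health",
    "reads_party_x_news": "political opinions",   # inferred opinion
    "attends_church_events": "religious beliefs", # inferred belief
    "likes_running": None,                        # not sensitive
}

def may_present_profiled_ad(segments: list[str]) -> bool:
    """Return False if any segment is, or implies, Article 9 data."""
    for segment in segments:
        if SEGMENT_TO_CATEGORY.get(segment) in ARTICLE_9_CATEGORIES:
            return False
    return True

print(may_present_profiled_ad(["likes_running"]))                       # True
print(may_present_profiled_ad(["reads_party_x_news", "likes_running"])) # False
```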
- Targeted Advertising and Protection of Minors
Recital 71 of the GDPR already provides that solely automated decision-making, including profiling, with legal or similarly significant effects should not apply to children – a rule that is relevant in any context, such as educational services, and not only for online platforms. The DSA enhances this protection for online platforms, prohibiting the presentation of ads on their interface based on profiling using personal data of users “when they are aware with reasonable certainty that the recipient of the service is a minor” (Article 28 of the DSA). Additionally, in line with the principle of data minimization in Article 5(1) of the GDPR, this DSA prohibition should not lead the provider of the online platform to “maintain, acquire or process” more personal data than it already has in order to assess if the recipient of the service is a minor. While this provision addresses all online platforms, VLOPs and VLOSEs are expected to take “targeted measures to protect the rights of the child, including age verification and parental control tools” as part of their obligation in Article 35(1)(j) to put in place mitigation measures tailored to the specific systemic risks identified through the risk assessment process. As highlighted in a recent FPF infographic and report on age assurance technology, age verification measures may require processing more personal data than the functioning of the online service itself requires, which could be at odds with the data minimization principle in the absence of additional safeguards. This is an example of where the two regulations complement each other.
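A minimal sketch of how Article 28(2) could shape an ad-serving decision appears below. It relies only on signals the platform already holds, consistent with data minimization, and falls back to non-profiled (e.g., contextual) advertising when the provider is reasonably certain the user is a minor. The field names and the certainty threshold are assumptions, not values drawn from the DSA.

```python
# Illustrative only: a hypothetical decision rule for Article 28(2) DSA,
# using only signals the platform already holds (no extra data collection).
from dataclasses import dataclass

@dataclass
class KnownSignals:
    declared_age: int | None   # age the user stated at sign-up, if any
    minor_likelihood: float    # platform's existing estimate, 0.0-1.0

REASONABLE_CERTAINTY = 0.9     # assumed threshold, not from the DSA

def choose_ad_pipeline(signals: KnownSignals) -> str:
    aware_user_is_minor = (
        (signals.declared_age is not None and signals.declared_age < 18)
        or signals.minor_likelihood >= REASONABLE_CERTAINTY
    )
    # No ads based on profiling when the provider is aware with reasonable
    # certainty that the recipient is a minor.
    return "contextual_ads" if aware_user_is_minor else "profiled_ads"

print(choose_ad_pipeline(KnownSignals(declared_age=15, minor_likelihood=0.4)))
print(choose_ad_pipeline(KnownSignals(declared_age=None, minor_likelihood=0.2)))
```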
In recent years, DPAs have increasingly focused on the processing of minors’ personal data. For instance, in the EU, the Irish Data Protection Commission published its Fundamentals for a Child-Oriented Approach to Data Processing, the Italian Garante often includes the protection of children in its high-profile enforcement decisions (see, for instance, the TikTok and ChatGPT cases), and France’s CNIL published recommendations to enhance the protection of children online and launched several initiatives on children’s digital rights. This is another area where collaboration with DPAs will be very important for consistent application of the DSA.
- Recommender Systems and Advertising Transparency
A significant area of overlap between the DSA and the GDPR relates to transparency. A key purpose of the DSA is to increase overall transparency around online platforms, manifested in several obligations, while transparency about how one’s personal data is processed is an overarching principle of the GDPR. The GDPR anchors this principle in Article 5, gives it effect through extensive notice obligations in Articles 13 and 14 and data access obligations in Article 15, and underpins it with modalities for communicating with individuals in Article 12. Two of the DSA obligations that increase transparency are laid out in Article 27, which imposes on providers of online platforms transparency about how their recommender systems work, and in Article 26, which imposes transparency around advertising on online platforms. To implement the latter obligation, the DSA requires, per Recital 68, that the “recipients of a service should have information directly accessible from the online interface where the advertisement is presented, on the main parameters used for determining that a specific advertisement is presented to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling.”
As for transparency related to recommender systems, Recital 70 of the DSA explains that online platforms should consistently ensure that users are appropriately informed about how recommender systems impact the way information is displayed and can influence how information is presented to them. “They should clearly present the parameters for such recommender systems in an easily comprehensible manner” to ensure that the users “understand how information is prioritized for them,” including where information is prioritized “based on profiling and their online behavior.” Notably, Articles 13(2)(f) and 14(2)(g) of the GDPR require that notices to individuals whose personal data is processed include “meaningful information about the logic involved, as well as the significance and the envisaged consequences” of automated decision-making, including profiling. These provisions should be read and applied together, complementing each other, to ensure consistency. This is another area where collaboration between DPAs and the enforcers of the DSA would be desirable. To understand the way in which DPAs have been applying this requirement so far, this case-law overview on automated decision-making under the GDPR published by the Future of Privacy Forum last year is helpful.
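The following sketch shows what a per-ad transparency record, surfacing the “main parameters” directly from the interface, might look like. The schema, field names, and settings endpoint are invented; the point is only that the explanation travels with the ad, as Recital 68 of the DSA and Articles 13 and 14 of the GDPR each suggest in their own way.

```python
# Illustrative only: a hypothetical record attached to each displayed ad so
# that the "main parameters" behind it are available from the interface.
# The schema is invented for the sketch.
import json

def ad_transparency_record(advertiser: str, paid_by: str,
                           based_on_profiling: bool,
                           main_parameters: list[str]) -> str:
    return json.dumps({
        "advertiser": advertiser,            # on whose behalf the ad is shown
        "paid_by": paid_by,                  # who paid for the ad
        "based_on_profiling": based_on_profiling,
        "main_parameters": main_parameters,  # why *this* user sees the ad
        "how_to_change_parameters": "/settings/ads",  # hypothetical endpoint
    }, indent=2)

print(ad_transparency_record(
    advertiser="Example Shoes Ltd",
    paid_by="Example Shoes Ltd",
    based_on_profiling=True,
    main_parameters=["approximate location: Belgium", "age range: 25-34"],
))
```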
- Recommender Systems Free-of-Profiling
“Profiling” as defined by the GDPR also plays an important role in one of the key obligations of VLOPs and VLOSEs: to offer users an alternative feed of content not based on profiling. Technically, this stems from an obligation in Article 38 of the DSA for VLOPs and VLOSEs to “provide at least one option for each of their recommender systems which is not based on profiling.” The DSA explains in Recital 70 that a core part of an online platform’s business is the manner in which information is prioritized and presented on its online interface to facilitate and optimize access to information for users: “This is done, for example, by algorithmically suggesting, ranking and prioritizing information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients.”
The DSA text further explains that “such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online, including to facilitate the search of relevant information,” as well as playing an important role “in the amplification of certain messages, the viral dissemination of information and the stimulation of online behavior.” Additionally, as part of their obligations to assess and mitigate risks on their platforms, VLOPs and VLOSEs may need to adjust the design of their recommender systems. Recital 94 of the DSA explains that they could achieve this “by taking measures to prevent or minimize biases that lead to the discrimination of persons in vulnerable situations, in particular where such adjustment is in accordance with Article 9 of the GDPR,” where Article 9 establishes conditions for processing sensitive personal data.
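Below is a minimal sketch of the Article 38 option: the same content inventory served through two interchangeable rankers, one personalized via profiling and one based purely on recency. The feed model and function names are invented for illustration.

```python
# Illustrative only: a VLOP's recommender must expose at least one ranking
# that does not rely on profiling; here, reverse-chronological order.
from datetime import datetime, timezone

POSTS = [
    {"id": 1, "posted": datetime(2023, 8, 20, tzinfo=timezone.utc), "affinity": 0.9},
    {"id": 2, "posted": datetime(2023, 8, 25, tzinfo=timezone.utc), "affinity": 0.1},
]

def profiled_feed(posts):
    # Personalized ranking: uses a per-user affinity score derived from
    # profiling in the sense of GDPR Article 4(4).
    return sorted(posts, key=lambda p: p["affinity"], reverse=True)

def non_profiled_feed(posts):
    # Profiling-free alternative: same inventory, ranked only by recency.
    return sorted(posts, key=lambda p: p["posted"], reverse=True)

def build_feed(posts, user_opted_out_of_profiling: bool):
    ranker = non_profiled_feed if user_opted_out_of_profiling else profiled_feed
    return [p["id"] for p in ranker(posts)]

print(build_feed(POSTS, user_opted_out_of_profiling=False))  # [1, 2]
print(build_feed(POSTS, user_opted_out_of_profiling=True))   # [2, 1]
```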
- Access to Data for Researchers and Competent Authorities
Article 40 of the DSA obliges VLOPs and VLOSEs to provide competent authorities (the Digital Services Coordinator designated at the national level in the EU Member State of their establishment, or the European Commission) with access to the data necessary to monitor their compliance with the regulation. This includes access to data related to algorithms, based on a reasoned request and within a reasonable period specified in the request. Additionally, they have an obligation to provide access to vetted researchers, following a request of their Digital Services Coordinator of establishment, “for the sole purpose of conducting research that contributes to the detection, identification, and understanding of systemic risks” in the EU and “to the assessment of the adequacy, efficiency, and impacts of the risk mitigation measures.” This obligation presupposes that platforms may be required to explain the design, the logic of the functioning, and the testing of their algorithmic systems, in accordance with Article 40 and its corresponding Recital 96.
Providing access to online platforms’ data entails, in virtually all cases, providing access to personal data as well, which brings this processing under the scope of the GDPR and triggers its obligations. Recital 98 of the DSA highlights that providers and researchers alike should pay particular attention to safeguarding the rights of individuals related to the processing of personal data granted by the GDPR. Recital 98 adds that “providers should anonymize or pseudonymize personal data except in those cases that would render impossible the research purpose pursued.” Notably, the data access obligations in the DSA are subject to further specification through delegated acts, to be adopted by the European Commission. These acts are expected to “lay down the specific conditions under which such sharing of data with researchers can take place” in compliance with the GDPR, as well as “relevant objective indicators, procedures and, where necessary, independent advisory mechanisms in support of sharing of data.” This is another area where the DPAs and the DSA enforcers should closely collaborate.
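As an illustration of the kind of safeguard Recital 98 contemplates, the following sketch pseudonymizes user identifiers with a provider-held keyed hash before a research export, so researchers can link events from the same account without seeing who it belongs to. Key management and the export schema are invented; pseudonymized data remains personal data under the GDPR.

```python
# Illustrative only: pseudonymizing user identifiers before a research
# export. A keyed hash (HMAC) replaces the raw ID; the key stays with the
# provider, so re-identification is possible only on the provider's side.
import hmac, hashlib

SECRET_KEY = b"provider-held-key-rotated-per-export"  # hypothetical

def pseudonymize(user_id: str) -> str:
    return hmac.new(SECRET_KEY, user_id.encode(), hashlib.sha256).hexdigest()[:16]

def export_for_researchers(records: list[dict]) -> list[dict]:
    # Drop direct identifiers, keep a stable pseudonym so researchers can
    # still link events belonging to the same (unidentified) account.
    return [
        {"pid": pseudonymize(r["user_id"]), "event": r["event"], "ts": r["ts"]}
        for r in records
    ]

print(export_for_researchers([
    {"user_id": "alice@example.com", "event": "viewed_post", "ts": "2023-08-25"},
    {"user_id": "alice@example.com", "event": "shared_post", "ts": "2023-08-26"},
]))
```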
- Takedown of Illegal Content
Core to the DSA are obligations for hosting services, including online platforms, to remove illegal content: Article 16 of the DSA outlines this obligation based on a notice-and-action mechanism initiated at the notification of any individual or entity. The GDPR confers rights on individuals to request erasure of their personal data (Article 17 of the GDPR) under certain conditions, as well as the right to request rectification of their data (Article 16 of the GDPR). These rights of the “data subject” under the GDPR aim to strengthen individuals’ control over how their personal data is collected, used, and disseminated. Article 3(h) of the DSA defines “illegal content” as “any information that, in itself or in relation to an activity … is not in compliance with Union law or the law of any Member State…, irrespective of the precise subject matter or nature of that law.” As a result, to the extent that “illegal content” as defined by the DSA is also personal data, an individual may potentially use either of the avenues, depending on how the overlap of the two provisions is further clarified in practice. Notably, one of the grounds for obtaining erasure of personal data is if “the personal data has been unlawfully processed,” and therefore processed not in compliance with the GDPR, which is Union law.
Article 16 of the DSA requires hosting services, including online platforms, to put mechanisms in place to facilitate the submission of sufficiently precise and adequately substantiated notices. Article 12 of the GDPR, on the other hand, requires controllers to facilitate the exercise of data subject rights, including erasure, and to communicate information on the action taken without undue delay and in any event within one month of receiving the request. The DSA does not prescribe a specific timeline for dealing with notices for removal of illegal content, other than “without undue delay.” All hosting services and online platforms whose activity falls under the GDPR have internal processes set up to respond to data subject requests, which could potentially be leveraged in setting up mechanisms to remove illegal content pursuant to notices under the DSA. However, a key differentiator is that under the DSA content removal requests can also come from authorities (see Article 9 of the DSA) and from “trusted flaggers” (Article 22), in addition to any individual or entity – each of these situations under its own conditions. In contrast, erasure requests under the GDPR can only be submitted by data subjects (individuals whose personal data is processed), either directly or through intermediaries acting on their behalf. DPAs may also impose the erasure of personal data, but only as a measure pursuant to an enforcement action.
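The intake side of a notice-and-action mechanism could be sketched as a simple validation step mirroring the elements Article 16(2) of the DSA requires a notice to contain: an explanation of why the information is allegedly illegal, its exact electronic location, the notifier’s name and email address (with exceptions for certain offenses), and a good-faith statement. The field names below are invented for illustration.

```python
# Illustrative only: a hypothetical intake check mirroring the elements
# that Article 16(2) DSA expects a notice to contain. Field names invented.
REQUIRED_FIELDS = {
    "explanation": "why the information is allegedly illegal content",
    "exact_location": "exact electronic location, such as a URL",
    "notifier_name": "name of the individual or entity submitting the notice",
    "notifier_email": "email address of the notifier",
    "good_faith_statement": "confirmation the notice is accurate and complete",
}

def validate_notice(notice: dict) -> list[str]:
    """Return a list of missing elements; an empty list means processable."""
    return [
        f"missing {field}: {why}"
        for field, why in REQUIRED_FIELDS.items()
        if not notice.get(field)
    ]

notice = {
    "explanation": "Counterfeit listing of a trademarked product.",
    "exact_location": "https://platform.example/item/123",
    "good_faith_statement": True,
}
for problem in validate_notice(notice):
    print(problem)  # flags the missing notifier name and email
```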
VLOPs/VLOSEs will additionally have to design mitigation measures addressing their content moderation processes, including the speed and quality with which they process notices related to specific types of illegal content and remove it expeditiously.
- Risk Assessments
The DSA, pursuant to Article 34, obliges VLOPs/VLOSEs to conduct a risk assessment at least once per year to identify, analyze, and assess “systemic risks stemming from the design or functioning of their service and its related systems,” including algorithmic systems. The same entities are very likely subject to the obligation to conduct a Data Protection Impact Assessment (DPIA) under Article 35 of the GDPR, as at least some of their processing operations, like using personal data for recommender systems or profiling users based on personal data to display online advertising, meet the criteria that trigger the DPIA obligation. A DPIA is required in particular where processing of personal data “using new technologies, and taking into account the nature, scope, context, and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons.”
There are four systemic risks that the DSA requires the risk assessment to cover: the dissemination of illegal content; any actual or foreseeable negative effects on the exercise of specific fundamental rights, among which the right to respect for private life and the right to the protection of personal data are mentioned; any actual or foreseeable negative effects on civic discourse, electoral processes, and public security; and any actual or foreseeable negative effects in relation to gender-based violence, the protection of public health and minors, and serious negative consequences to a person’s physical and mental well-being.
Among the elements that a DPIA under the GDPR must include is “an assessment of the risks to the rights and freedoms of data subjects” that may arise from how controllers process personal data through new technologies, such as algorithmic systems. Other elements that must be included are the measures envisaged to address these risks, similar to how Article 35 of the DSA requires VLOPs/VLOSEs to put mitigation measures in place tailored to the identified risks. The EDPB has also published guidelines on how to conduct DPIAs.
When conducting the risk assessments required by the DSA, VLOPs/VLOSEs must take into account whether and how specific factors enumerated in Article 34(2) influence any of the systemic risks mentioned. Most factors to consider are linked to how VLOPs/VLOSEs process personal data, such as the design of their algorithmic systems, the systems for selecting and presenting advertisements, and generally their data-related practices.
Both DSA risk assessments and DPIAs are ex-ante risk assessment obligations, and both involve some level of engagement with supervisory authorities. The scope of the assessments differs: the DSA focuses on systemic risks, including risks that go beyond impacts on fundamental rights, while the GDPR’s DPIA focuses on any risks that novel processing of personal data may pose to fundamental rights and freedoms and on assessments unique to data protection. However, they also have areas of clear overlap where processing of personal data is involved. DPIAs can potentially feed into DSA risk assessments, and the two processes should be implemented consistently.
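One way to operationalize that consistency is a simple coverage check: confirm that each of the four Article 34 risk categories has at least one documented entry, and that personal-data risks already identified in a DPIA are carried over into the fundamental-rights category rather than re-derived. The record layout below is invented for illustration.

```python
# Illustrative only: a hypothetical coverage check for the four systemic
# risk categories of Article 34(1) DSA, reusing findings from a GDPR DPIA.
SYSTEMIC_RISK_CATEGORIES = [
    "dissemination of illegal content",
    "negative effects on fundamental rights",
    "negative effects on civic discourse, elections and public security",
    "gender-based violence, public health, minors, physical/mental well-being",
]

def assessment_gaps(assessed: dict[str, list[str]],
                    dpia_findings: list[str]) -> list[str]:
    gaps = [c for c in SYSTEMIC_RISK_CATEGORIES if not assessed.get(c)]
    # DPIA overlap: personal-data risks found under GDPR Article 35 should
    # surface in the fundamental-rights category of the DSA assessment.
    carried_over = assessed.get("negative effects on fundamental rights", [])
    gaps += [f"DPIA finding not carried over: {f}"
             for f in dpia_findings if f not in carried_over]
    return gaps

assessed = {
    "dissemination of illegal content": ["counterfeit goods in marketplace"],
    "negative effects on fundamental rights": ["profiling chills free expression"],
}
dpia = ["profiling chills free expression", "excessive retention of location data"]
print(assessment_gaps(assessed, dpia))
```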
- Compliance Function and the DSA Legal Representative
Under the DSA, in accordance with Article 41, the designated VLOPs/VLOSEs will be obliged to establish a “compliance function,” which can be composed of several compliance officers. This function must be (i) independent from their operational functions; (ii) allocated with sufficient authority, stature and resources; and must have (iii) access to the management body of the provider to monitor the compliance of that provider with the DSA. On top of that, the compliance function will have to cooperate with the Digital Services Coordinator of the establishment, ensure that all risks are identified through the risk assessments and that the mitigation measures are effective, as well as inform and advise the management and employees of the provider in relation to DSA obligations.
All providers of services designated as VLOPs and VLOSEs that are also controllers under the GDPR are under an obligation to appoint a Data Protection Officer (DPO), as they very likely meet the criteria of Article 37 of the GDPR given the nature and scope of their processing activities involving personal data. There are similarities between the compliance function and the DPO, including their independence, their reporting to the highest management level, their key task of monitoring compliance with the whole regulation that creates their role, and their task of cooperating with the competent supervisory authorities. Appointing two independent roles that hold powerful internal positions and whose remits may overlap to a certain extent will require consistency and coordination, which can be supported by further guidance from DPAs and DSA supervisory authorities.
Another role in the application of the two regulations with many similarities is that of a “representative” in the EU, required in situations where the DSA and the GDPR apply extraterritorially to entities that do not have an establishment in the EU. In the DSA, this obligation pertains to all providers of intermediary services without an EU establishment, pursuant to Article 13. If such a provider processes personal data in the context of targeting its services to individual recipients in the EU, or monitors the recipients’ behavior, it triggers the extraterritorial application of the GDPR as well. In that case, it also needs to appoint a GDPR representative, in accordance with Article 27. Under the GDPR, the representative acts as a mere “postal box” or point of correspondence between the non-EU controller or processor on the one hand and DPAs or data subjects on the other, with liability that does not go beyond its own statutory obligations. In contrast, Article 13(3) of the DSA suggests that the “legal representative” could be held liable for failures of the intermediary service provider to comply with the DSA. Providers must mandate their legal representatives for the purpose of being addressed “in addition to or instead of” them by competent authorities, per Article 13(2) of the DSA.
Recital 44 of the DSA clarifies that the obligation to appoint a “sufficiently mandated” legal representative “should allow for the effective oversight and, where necessary, enforcement of this regulation in relation to those providers.” The legal representative must have “the necessary powers and resources to cooperate with the relevant authorities” and the DSA envisages that there may be situations where providers even appoint in this role “a subsidiary undertaking of the same group as the provider, or its parent undertaking, if that subsidiary or parent undertaking is established in the Union.” Recital 44 of the DSA also clarifies that the legal representative may also only function as a point of contact, “provided the relevant requirements of this regulation are complied with.” This could mean that if other structures are in place to ensure an entity on behalf of the provider can be held liable for non-compliance by a provider with the DSA, the representative can also function just as a “postal box.”
- Intermediary Liability and the Obligation to Provide Information
Finally, the GDPR and the DSA intersect in areas where data protection, privacy, and intermediary liability overlap.
The GDPR, per Article 2(4), provides that it applies without prejudice to the e-Commerce Directive (2000/31/EC), in particular to “the liability rules of intermediary service providers in Articles 12 to 15 of that Directive.” However, pursuant to Article 89 of the DSA, Articles 12 to 15 of the e-Commerce Directive are deleted, and “references to Articles 12 to 15 of Directive 2000/31/EC shall be construed as references to Articles 4, 5, 6 and 8 of this Regulation, respectively.”
The DSA deals with the liability of providers of intermediary services chiefly through Articles 4 to 10. With respect to Article 10, which addresses orders to provide information, the DSA envisages strong cooperation between intermediary service providers, national authorities, and the Digital Services Coordinators as enforcers. This could potentially involve the sharing of information, including in certain cases already-collected personal data, in order to combat illegal content online. The GDPR expressly passes the baton on intermediary liability to the DSA, but where data sharing and processing occur, intermediary service providers must ensure that they comply with the protections of the GDPR (in particular Sections 2 and 3). This overlap signals yet another instance where the two regulations will be complementary, this time regarding intermediary liability and the obligation to provide information.
The DSA Will Be Enforced Through a Complex Web of Authorities, And The Interplay With The GDPR Complicates It
Enforcement in such a complex space will be challenging. In a departure from the approach of the GDPR, where enforcement happens primarily at the national level, with the One Stop Shop mechanism for cross-border cases coordinated through the European Data Protection Board, the DSA centralizes enforcement at the EU level when it comes to VLOPs and VLOSEs, leaving it in the hands of the European Commission.
However, Member States will also play a role in enforcing the DSA against intermediary service providers that are not VLOPs or VLOSEs. Each Member State must designate one or more competent authorities for the enforcement of the DSA and, if it designates more than one, must appoint one of them as its Digital Services Coordinator (DSC). The deadline to designate DSCs is February 2024. Challenges arise because the designation of national competent authorities is left to the Member States, and there seems to be no consistent approach as to which type of authority is best positioned to enforce the Act. Not all Member States have appointed their DSCs yet, and the broad spectrum of enforcers they plan to rely on is creating a scattered landscape.
Table 3 – Authorities Designated or Considered for Designation as Digital Services Coordinators Across the EU Member States (Source: Euractiv)
Digital Services Coordinators | Member States |
Media Regulator | Belgium, Hungary, Ireland and Slovakia |
Consumer Protection Authority | Finland and the Netherlands |
Telecoms Regulator | Czech Republic, Germany, Greece, Italy, Poland, Slovenia and Sweden |
Competition Authority | Spain |
The Digital Services Coordinators will closely collaborate and coordinate with the European Board for Digital Services, which has an advisory role (Articles 61-63 of the DSA), to ensure consistent cross-border enforcement. Member States are also tasked with adopting national rules on penalties applicable to infringements of the DSA, including fines of up to 6% of the annual worldwide turnover of the provider of intermediary services concerned in the preceding financial year (Article 52 of the DSA). Complaints can be submitted to DSCs by recipients of the services and by any body, organization, or association mandated to exercise the rights conferred by the DSA on recipients. With respect to VLOPs and VLOSEs, the European Commission can issue fines not exceeding 6% of their annual worldwide turnover in the preceding financial year, following decisions of non-compliance, which can also require platforms to take the measures necessary to remedy the infringements. Moreover, the Commission can order interim measures before an investigation is completed, where there is urgency due to the risk of serious damage to the recipients of the service.
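For a sense of scale, the 6% ceiling is straightforward arithmetic on the preceding financial year’s worldwide turnover; the turnover figure below is invented.

```python
# Illustrative only: the DSA caps penalties at 6% of the provider's total
# worldwide annual turnover in the preceding financial year, both for
# Member State fines (Article 52) and for Commission fines on VLOPs/VLOSEs.
def max_dsa_fine(worldwide_annual_turnover_eur: float) -> float:
    """Upper bound of a DSA fine under the 6% ceiling."""
    return 0.06 * worldwide_annual_turnover_eur

print(f"EUR {max_dsa_fine(10_000_000_000):,.0f}")  # EUR 600,000,000
```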
Recipients of the service, including users of online platforms, also have a right to seek compensation from providers of intermediary services for damage or loss suffered due to infringements of the DSA (Article 54 of the DSA). The DSA further provides for out-of-court dispute settlement with regard to decisions of online platforms related to illegal content (Article 21 of the DSA), independent audits of how VLOPs/VLOSEs comply with their obligations (Article 37 of the DSA), and voluntary codes of conduct adopted at the Union level to tackle various systemic risks (Article 45), including codes of conduct for online advertising (Article 46) and for accessibility of online services (Article 47).
The newly established European Centre for Algorithmic Transparency (ECAT) also plays a role in this enforcement equation. The ECAT will support the Commission in its assessment of VLOPs/VLOSEs with regard to risk management and mitigation obligations. Moreover, it will be particularly relevant to issues pertaining to recommender systems, information retrieval, and search engines. The ECAT will use a principles-based approach to assessing fairness, accountability, and transparency. However, the DSA is not the only regulation relevant to the use of algorithms and AI by platforms: the GDPR, the Digital Markets Act, and the upcoming EU AI Act and European Data Act add to this complicated landscape.
The various areas of interplay between the DSA and the GDPR outlined above require consistent interpretation and application of the law. However, the DSA’s enforcement and oversight structure recognizes no formal cooperation or coordination role for DPAs, the European Data Protection Board, or the European Data Protection Supervisor. This should not be an impediment to setting up processes for such cooperation and coordination within their respective competencies, as the rollout of the DSA will likely reveal the complexity of the interplay between the two legislative frameworks even beyond the ten areas outlined above.
Editor: Alexander Thompson