From the DMCA to the DSA: A Transatlantic Dialogue on Online Platform Regulation and Copyright

Verfassungsblog, 2024

Copyright, DMCA, DSA, Online platforms

Bibtex

@online{nokey, title = {From the DMCA to the DSA: A Transatlantic Dialogue on Online Platform Regulation and Copyright}, author = {Quintais, J.}, url = {https://verfassungsblog.de/from-the-dmca-to-the-dsa/?s=09}, year = {2024}, date = {2024-02-19}, journal = {Verfassungsblog}, keywords = {Copyright, DMCA, DSA, Online platforms}, }

Using Terms and Conditions to apply Fundamental Rights to Content Moderation

German Law Journal, 2023

Abstract

Large online platforms provide an unprecedented means for exercising freedom of expression online and wield enormous power over public participation in the online democratic space. However, it is increasingly clear that their systems, where (automated) content moderation decisions are taken based on a platformʼs terms and conditions (T&Cs), are fundamentally broken. Content moderation systems have been said to undermine freedom of expression, especially where important public interest speech ends up suppressed, such as speech by minority and marginalized groups. Indeed, these content moderation systems have been criticized for their overly vague rules of operation, inconsistent enforcement, and an overdependence on automation. Therefore, in order to better protect freedom of expression online, international human rights bodies and civil society organizations have argued that platforms “should incorporate directly” principles of fundamental rights law into their T&Cs. Under EU law, and apart from a rule in the Terrorist Content Regulation, platforms had until recently no explicit obligation to incorporate fundamental rights into their T&Cs. However, an important provision in the Digital Services Act (DSA) will change this. Crucially, Article 14 DSA lays down new rules on how platforms can enforce their T&Cs, including that platforms must have “due regard” to the “fundamental rights” of users under the EU Charter of Fundamental Rights. In this article, we critically examine the topic of enforceability of fundamental rights via T&Cs through the prism of Article 14 DSA. We ask whether this provision requires platforms to apply EU fundamental rights law and to what extent this may curb the power of Big Tech over online speech. We conclude that Article 14 will make it possible, in principle, to establish the indirect horizontal effect of fundamental rights in the relationship between online platforms and their users. But in order for the application and enforcement of T&Cs to take due regard of fundamental rights, Article 14 must be operationalized within the framework of the international and European fundamental rights standards. If this is possible, Article 14 may fulfil its revolutionary potential.

Content moderation, Digital services act, Freedom of expression, Online platforms, platform regulation, terms and conditions

Bibtex

@article{nokey, title = {Using Terms and Conditions to apply Fundamental Rights to Content Moderation}, author = {Quintais, J. and Appelman, N. and Fahy, R.}, doi = {10.1017/glj.2023.53}, year = {2023}, date = {2023-07-11}, journal = {German Law Journal}, abstract = {Large online platforms provide an unprecedented means for exercising freedom of expression online and wield enormous power over public participation in the online democratic space. However, it is increasingly clear that their systems, where (automated) content moderation decisions are taken based on a platformʼs terms and conditions (T&Cs), are fundamentally broken. Content moderation systems have been said to undermine freedom of expression, especially where important public interest speech ends up suppressed, such as speech by minority and marginalized groups. Indeed, these content moderation systems have been criticized for their overly vague rules of operation, inconsistent enforcement, and an overdependence on automation. Therefore, in order to better protect freedom of expression online, international human rights bodies and civil society organizations have argued that platforms “should incorporate directly” principles of fundamental rights law into their T&Cs. Under EU law, and apart from a rule in the Terrorist Content Regulation, platforms had until recently no explicit obligation to incorporate fundamental rights into their T&Cs. However, an important provision in the Digital Services Act (DSA) will change this. Crucially, Article 14 DSA lays down new rules on how platforms can enforce their T&Cs, including that platforms must have “due regard” to the “fundamental rights” of users under the EU Charter of Fundamental Rights. In this article, we critically examine the topic of enforceability of fundamental rights via T&Cs through the prism of Article 14 DSA. We ask whether this provision requires platforms to apply EU fundamental rights law and to what extent this may curb the power of Big Tech over online speech. We conclude that Article 14 will make it possible, in principle, to establish the indirect horizontal effect of fundamental rights in the relationship between online platforms and their users. But in order for the application and enforcement of T&Cs to take due regard of fundamental rights, Article 14 must be operationalized within the framework of the international and European fundamental rights standards. If this is possible, Article 14 may fulfil its revolutionary potential.}, keywords = {Content moderation, Digital services act, Freedom of expression, Online platforms, platform regulation, terms and conditions}, }

Improving Data Access for Researchers in the Digital Services Act

Dergacheva, D., Katzenbach, C., Schwemer, S. & Quintais, J.
2023

Abstract

Joint submission in response to the Call for Evidence on the Delegated Regulation on data access provided for in the Digital Services Act (DSA). Article 40 DSA is a crucial provision to operationalize the regulation’s risk mitigation provisions vis-a-vis very large online platforms (VLOPs) and very large search engines (VLOSEs). In essence, Article 40 DSA enables data access to Digital Services Coordinators (DSCs) or the Commission, “vetted researchers” and other researchers, provided certain conditions are met. Our submission is predominantly concerned with the data access for vetted researchers and researchers in relation to VLOPs.

academic research, data access, Digital services act, Online platforms

Bibtex

@online{nokey, title = {Improving Data Access for Researchers in the Digital Services Act}, author = {Dergacheva, D. and Katzenbach, C. and Schwemer, S. and Quintais, J.}, url = {https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4465846}, year = {2023}, date = {2023-06-01}, abstract = {Joint submission in response to the Call for Evidence on the Delegated Regulation on data access provided for in the Digital Services Act (DSA). Article 40 DSA is a crucial provision to operationalize the regulation’s risk mitigation provisions vis-a-vis very large online platforms (VLOPs) and very large search engines (VLOSEs). In essence, Article 40 DSA enables data access to Digital Services Coordinators (DSCs) or the Commission, “vetted researchers” and other researchers, provided certain conditions are met. Our submission is predominantly concerned with the data access for vetted researchers and researchers in relation to VLOPs.}, keywords = {academic research, data access, Digital services act, Online platforms}, }

An end to shadow banning? Transparency rights in the Digital Services Act between content moderation and curation

Computer Law & Security Review, vol. 48, 2023

Abstract

This paper offers a legal perspective on the phenomenon of shadow banning: content moderation sanctions which are undetectable to those affected. Drawing on recent social science research, it connects current concerns about shadow banning to novel visibility management techniques in content moderation, such as delisting and demotion. Conventional moderation techniques such as outright content removal or account suspension can be observed by those affected, but these new visibility remedies often cannot. This lends newfound significance to the legal question of moderation transparency rights. The EU Digital Services Act (DSA) is analysed in this light, as the first major legislation to regulate transparency of visibility remedies. In effect, its due process framework prohibits shadow banning with only limited exceptions. In doing so, the DSA surfaces tensions between two competing models for content moderation: as rule-bound administration or as adversarial security conflict. I discuss possible interpretations and trade-offs for this regime, and then turn to a more fundamental problem: how to define visibility reduction as a category of content moderation actions. The concept of visibility reduction or ‘demotions’ is central to both the shadow banning imaginary and to the DSA's safeguards, but its meaning is far from straightforward. Responding to claims that demotion is entirely relative, and therefore not actionable as a category of content moderation sanctions, I show how visibility reduction can still be regulated when defined as ex post adjustments to engagement-based relevance scores. Still, regulating demotion in this way will not cover all exercises of ranking power, since it manifests not only in individual cases of moderation but also through structural acts of content curation; not just by reducing visibility, but by producing visibility.
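
As a purely illustrative aside (ours, not the paper's), the definition of demotion as an ex post adjustment to an engagement-based relevance score can be sketched in a few lines of Python; the signal names and weights below are assumptions invented for the example.

# Illustrative sketch only, not the paper's model: demotion as an ex post
# adjustment to an engagement-based relevance score. Signal names and
# weights are assumptions for the example.

def relevance(signals: dict[str, float]) -> float:
    """Toy engagement-based relevance score: a weighted sum of signals."""
    weights = {"likes": 1.0, "shares": 2.0, "watch_time": 0.5}
    return sum(weights.get(name, 0.0) * value for name, value in signals.items())

def moderated_score(signals: dict[str, float], demotion_factor: float = 1.0) -> float:
    """Apply an ex post demotion: scale the organic score after it is computed.
    A factor below 1.0 reduces visibility without removing the content,
    which is what makes the sanction hard for the affected user to observe."""
    return relevance(signals) * demotion_factor

post = {"likes": 120.0, "shares": 30.0, "watch_time": 400.0}
print(moderated_score(post))                       # organic score: 380.0
print(moderated_score(post, demotion_factor=0.2))  # demoted score: 76.0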

content curation, Content moderation, DSA, Online platforms, Transparency

Bibtex

@article{nokey, title = {An end to shadow banning? Transparency rights in the Digital Services Act between content moderation and curation}, author = {Leerssen, P.}, url = {https://www.ivir.nl/nl/publications/comment-an-end-to-shadow-banning-transparency-rights-in-the-digital-services-act-between-content-moderation-and-curation/endtoshadowbanning/}, doi = {10.1016/j.clsr.2023.105790}, year = {2023}, date = {2023-04-11}, journal = {Computer Law & Security Review}, volume = {48}, abstract = {This paper offers a legal perspective on the phenomenon of shadow banning: content moderation sanctions which are undetectable to those affected. Drawing on recent social science research, it connects current concerns about shadow banning to novel visibility management techniques in content moderation, such as delisting and demotion. Conventional moderation techniques such as outright content removal or account suspension can be observed by those affected, but these new visibility remedies often cannot. This lends newfound significance to the legal question of moderation transparency rights. The EU Digital Services Act (DSA) is analysed in this light, as the first major legislation to regulate transparency of visibility remedies. In effect, its due process framework prohibits shadow banning with only limited exceptions. In doing so, the DSA surfaces tensions between two competing models for content moderation: as rule-bound administration or as adversarial security conflict. I discuss possible interpretations and trade-offs for this regime, and then turn to a more fundamental problem: how to define visibility reduction as a category of content moderation actions. The concept of visibility reduction or ‘demotions’ is central to both the shadow banning imaginary and to the DSA's safeguards, but its meaning is far from straightforward. Responding to claims that demotion is entirely relative, and therefore not actionable as a category of content moderation sanctions, I show how visibility reduction can still be regulated when defined as ex post adjustments to engagement-based relevance scores. Still, regulating demotion in this way will not cover all exercises of ranking power, since it manifests not only in individual cases of moderation but also through structural acts of content curation; not just by reducing visibility, but by producing visibility.}, keywords = {content curation, Content moderation, DSA, Online platforms, Transparency}, }

Copyright Content Moderation in the EU: Conclusions and Recommendations

Quintais, J., Katzenbach, C., Schwemer, S., Dergacheva, D., Riis, T., Mezei, P. & Harkai, I.
2023

Abstract

This report is a deliverable in the reCreating Europe project. The report describes and summarizes the results of our research on the mapping of the EU legal framework and intermediaries’ practices on copyright content moderation and removal. In particular, this report summarizes the results of our previous deliverables and tasks, namely: (1) our Final Report on mapping of EU legal framework and intermediaries’ practices on copyright content moderation and removal; and (2) our Final Evaluation and Measuring Report - impact of moderation practices and technologies on access and diversity. Our previous reports contain a detailed description of the legal and empirical methodology underpinning our research and findings. This report focuses on bringing together these findings in a concise format and advancing policy recommendations.

Content moderation, Copyright, Digital services act, Digital Single Market, intermediaries, Online platforms, terms and conditions

Bibtex

@techreport{nokey, title = {Copyright Content Moderation in the EU: Conclusions and Recommendations}, author = {Quintais, J. and Katzenbach, C. and Schwemer, S. and Dergacheva, D. and Riis, T. and Mezei, P. and Harkai, I.}, url = {https://www.ivir.nl/nl/publications/copyright-content-moderation-in-the-eu-conclusions-and-recommendations/ssrn-id4403423/}, year = {2023}, date = {2023-03-30}, abstract = {This report is a deliverable in the reCreating Europe project. The report describes and summarizes the results of our research on the mapping of the EU legal framework and intermediaries’ practices on copyright content moderation and removal. In particular, this report summarizes the results of our previous deliverables and tasks, namely: (1) our Final Report on mapping of EU legal framework and intermediaries’ practices on copyright content moderation and removal; and (2) our Final Evaluation and Measuring Report - impact of moderation practices and technologies on access and diversity. Our previous reports contain a detailed description of the legal and empirical methodology underpinning our research and findings. This report focuses on bringing together these findings in a concise format and advancing policy recommendations.}, keywords = {Content moderation, Copyright, Digital services act, Digital Single Market, intermediaries, Online platforms, terms and conditions}, }

Impact of content moderation practices and technologies on access and diversity

Schwemer, S., Katzenbach, C., Dergacheva, D., Riis, T. & Quintais, J.
2023

Abstract

This Report presents the results of research carried out as part of Work Package 6 “Intermediaries: Copyright Content Moderation and Removal at Scale in the Digital Single Market: What Impact on Access to Culture?” of the project “ReCreating Europe”, particularly on Tasks 6.3 (Evaluating Legal Frameworks on the Different Levels (EU vs. national, public vs. private)) and 6.4 (Measuring the impact of moderation practices and technologies on access and diversity). This work centers on a normative analysis of the existing public and private legal frameworks with regard to intermediaries and cultural diversity, and on the actual impact of intermediaries’ content moderation on diversity.

Content moderation, Copyright, Digital services act, Digital Single Market, intermediaries, Online platforms, terms and conditions

Bibtex

@techreport{nokey, title = {Impact of content moderation practices and technologies on access and diversity}, author = {Schwemer, S. and Katzenbach, C. and Dergacheva, D. and Riis, T. and Quintais, J.}, url = {https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4380345}, year = {2023}, date = {2023-03-23}, abstract = {This Report presents the results of research carried out as part of Work Package 6 “Intermediaries: Copyright Content Moderation and Removal at Scale in the Digital Single Market: What Impact on Access to Culture?” of the project “ReCreating Europe”, particularly on Tasks 6.3 (Evaluating Legal Frameworks on the Different Levels (EU vs. national, public vs. private)) and 6.4 (Measuring the impact of moderation practices and technologies on access and diversity). This work centers on a normative analysis of the existing public and private legal frameworks with regard to intermediaries and cultural diversity, and on the actual impact of intermediaries’ content moderation on diversity.}, keywords = {Content moderation, Copyright, Digital services act, Digital Single Market, intermediaries, Online platforms, terms and conditions}, }

How platforms govern users’ copyright-protected content: Exploring the power of private ordering and its implications

Quintais, J., De Gregorio, G. & Magalhães, J.C.
Computer Law & Security Review, vol. 48, 2023

Abstract

Online platforms provide primary points of access to information and other content in the digital age. They foster users’ ability to share ideas and opinions while offering opportunities for cultural and creative industries. In Europe, ownership and use of such expressions is partly governed by a complex web of legislation, sectoral self- and co-regulatory norms. To an important degree, it is also governed by private norms defined by contractual agreements and informal relationships between users and platforms. By adopting policies usually defined as Terms of Service and Community Guidelines, platforms almost unilaterally set use, moderation and enforcement rules, structures and practices (including through algorithmic systems) that govern the access and dissemination of protected content by their users. This private governance of essential means of access, dissemination and expression to (and through) creative content is hardly equitable, though. In fact, it is an expression of how platforms control what users – including users-creators – can say and disseminate online, and how they can monetise their content. As platform power grows, EU law is adjusting by moving towards enhancing the responsibility of platforms for content they host. One crucial example of this is Article 17 of the new Copyright Directive (2019/790), which fundamentally changes the regime and liability of “online content-sharing service providers” (OCSSPs). This complex regime, complemented by rules in the Digital Services Act, sets out a new environment for OCSSPs to design and carry out content moderation, as well as to define their contractual relationship with users, including creators. The latter relationship is characterized by significant power imbalance in favour of platforms, calling into question whether the law can and should do more to protect users-creators. This article addresses the power of large-scale platforms in EU law over their users’ copyright-protected content and its effects on the governance of that content, including on its exploitation and some of its implications for freedom of expression. Our analysis combines legal and empirical methods. We carry out doctrinal legal research to clarify the complex legal regime that governs platforms’ contractual obligations to users and content moderation activities, including the space available for private ordering, with a focus on EU law. From the empirical perspective, we conducted a thematic analysis of most versions of the Terms of Service published over time by the three largest social media platforms by number of users – Facebook, Instagram and YouTube – so as to identify and examine the rules these companies have established to regulate user-generated content, and the ways in which such provisions shifted in the past two decades. In so doing, we unveil how foundational this sort of regulation has always been to platforms’ functioning and how it contributes to defining a system of content exploitation.

CDSM Directive, Content moderation, Copyright, creators, Digital services act, online content, Online platforms, platform regulation, private ordering, terms of service

Bibtex

@article{nokey, title = {How platforms govern users’ copyright-protected content: Exploring the power of private ordering and its implications}, author = {Quintais, J. and De Gregorio, G. and Magalhães, J.C.}, url = {https://www.ivir.nl/nl/publications/how-platforms-govern-users-copyright-protected-content-exploring-the-power-of-private-ordering-and-its-implications/computer_law_and_security_review_2023/}, doi = {10.1016/j.clsr.2023.105792}, year = {2023}, date = {2023-02-24}, journal = {Computer Law & Security Review}, volume = {48}, abstract = {Online platforms provide primary points of access to information and other content in the digital age. They foster users’ ability to share ideas and opinions while offering opportunities for cultural and creative industries. In Europe, ownership and use of such expressions is partly governed by a complex web of legislation, sectoral self- and co-regulatory norms. To an important degree, it is also governed by private norms defined by contractual agreements and informal relationships between users and platforms. By adopting policies usually defined as Terms of Service and Community Guidelines, platforms almost unilaterally set use, moderation and enforcement rules, structures and practices (including through algorithmic systems) that govern the access and dissemination of protected content by their users. This private governance of essential means of access, dissemination and expression to (and through) creative content is hardly equitable, though. In fact, it is an expression of how platforms control what users – including users-creators – can say and disseminate online, and how they can monetise their content. As platform power grows, EU law is adjusting by moving towards enhancing the responsibility of platforms for content they host. One crucial example of this is Article 17 of the new Copyright Directive (2019/790), which fundamentally changes the regime and liability of “online content-sharing service providers” (OCSSPs). This complex regime, complemented by rules in the Digital Services Act, sets out a new environment for OCSSPs to design and carry out content moderation, as well as to define their contractual relationship with users, including creators. The latter relationship is characterized by significant power imbalance in favour of platforms, calling into question whether the law can and should do more to protect users-creators. This article addresses the power of large-scale platforms in EU law over their users’ copyright-protected content and its effects on the governance of that content, including on its exploitation and some of its implications for freedom of expression. Our analysis combines legal and empirical methods. We carry out doctrinal legal research to clarify the complex legal regime that governs platforms’ contractual obligations to users and content moderation activities, including the space available for private ordering, with a focus on EU law. From the empirical perspective, we conducted a thematic analysis of most versions of the Terms of Service published over time by the three largest social media platforms by number of users – Facebook, Instagram and YouTube – so as to identify and examine the rules these companies have established to regulate user-generated content, and the ways in which such provisions shifted in the past two decades. In so doing, we unveil how foundational this sort of regulation has always been to platforms’ functioning and how it contributes to defining a system of content exploitation.}, keywords = {CDSM Directive, Content moderation, Copyright, creators, Digital services act, online content, Online platforms, platform regulation, private ordering, terms of service}, }

Using Terms and Conditions to Apply Fundamental Rights to Content Moderation

German Law Journal (forthcoming), 2022

Abstract

Large online platforms provide an unprecedented means for exercising freedom of expression online and wield enormous power over public participation in the online democratic space. However, it is increasingly clear that their systems, where (automated) content moderation decisions are taken based on a platform's terms and conditions (T&Cs), are fundamentally broken. Content moderation systems have been said to undermine freedom of expression, especially where important public interest speech ends up suppressed, such as speech by minority and marginalized groups. Indeed, these content moderation systems have been criticized for their overly vague rules of operation, inconsistent enforcement, and an overdependence on automation. Therefore, in order to better protect freedom of expression online, international human rights bodies and civil society organizations have argued that platforms “should incorporate directly” principles of fundamental rights law into their T&Cs. Under EU law, and apart from a rule in the Terrorist Content Regulation, platforms had until recently no explicit obligation to incorporate fundamental rights into their T&Cs. However, an important provision in the Digital Services Act (DSA) will change this. Crucially, Article 14 DSA lays down new rules on how platforms can enforce their T&Cs, including that platforms must have “due regard” to the “fundamental rights” of users under the EU Charter of Fundamental Rights. In this article, we critically examine the topic of enforceability of fundamental rights via T&Cs through the prism of Article 14 DSA. We ask whether this provision requires platforms to apply EU fundamental rights law and to what extent this may curb the power of Big Tech over online speech. We conclude that Article 14 will make it possible, in principle, to establish the indirect horizontal effect of fundamental rights in the relationship between online platforms and their users. But in order for the application and enforcement of T&Cs to take due regard of fundamental rights, Article 14 must be operationalized within the framework of the international and European fundamental rights standards, thereby allowing Article 14 to fulfil its revolutionary potential.

Content moderation, Digital services act, Freedom of expression, Online platforms, platform regulation, terms and conditions

Bibtex

@article{nokey, title = {Using Terms and Conditions to Apply Fundamental Rights to Content Moderation}, author = {Quintais, J. and Appelman, N. and Fahy, R.}, url = {https://osf.io/f2n7m/}, year = {2022}, date = {2022-11-25}, journal = {German Law Journal (forthcoming)}, abstract = {Large online platforms provide an unprecedented means for exercising freedom of expression online and wield enormous power over public participation in the online democratic space. However, it is increasingly clear that their systems, where (automated) content moderation decisions are taken based on a platform's terms and conditions (T&Cs), are fundamentally broken. Content moderation systems have been said to undermine freedom of expression, especially where important public interest speech ends up suppressed, such as speech by minority and marginalized groups. Indeed, these content moderation systems have been criticized for their overly vague rules of operation, inconsistent enforcement, and an overdependence on automation. Therefore, in order to better protect freedom of expression online, international human rights bodies and civil society organizations have argued that platforms “should incorporate directly” principles of fundamental rights law into their T&Cs. Under EU law, and apart from a rule in the Terrorist Content Regulation, platforms had until recently no explicit obligation to incorporate fundamental rights into their T&Cs. However, an important provision in the Digital Services Act (DSA) will change this. Crucially, Article 14 DSA lays down new rules on how platforms can enforce their T&Cs, including that platforms must have “due regard” to the “fundamental rights” of users under the EU Charter of Fundamental Rights. In this article, we critically examine the topic of enforceability of fundamental rights via T&Cs through the prism of Article 14 DSA. We ask whether this provision requires platforms to apply EU fundamental rights law and to what extent this may curb the power of Big Tech over online speech. We conclude that Article 14 will make it possible, in principle, to establish the indirect horizontal effect of fundamental rights in the relationship between online platforms and their users. But in order for the application and enforcement of T&Cs to take due regard of fundamental rights, Article 14 must be operationalized within the framework of the international and European fundamental rights standards, thereby allowing Article 14 to fulfil its revolutionary potential.}, keywords = {Content moderation, Digital services act, Freedom of expression, Online platforms, platform regulation, terms and conditions}, }

Using Terms and Conditions to apply Fundamental Rights to Content Moderation: Is Article 12 DSA a Paper Tiger?

Verfassungsblog, 2021

Digital services act, DSA, Fundamental rights, Online platforms, terms and conditions

Bibtex

@online{Appelman2021, title = {Using Terms and Conditions to apply Fundamental Rights to Content Moderation: Is Article 12 DSA a Paper Tiger?}, author = {Appelman, N. and Quintais, J. and Fahy, R.}, url = {https://verfassungsblog.de/power-dsa-dma-06/}, doi = {10.17176/20210901-233103-0}, year = {2021}, date = {2021-09-01}, journal = {Verfassungsblog}, keywords = {Digital services act, DSA, Fundamental rights, Online platforms, terms and conditions}, }

The Interplay between the Digital Services Act and Sector Regulation: How Special is Copyright?

Quintais, J. & Schwemer, S.
European Journal of Risk Regulation, vol. 13, iss. : 2, pp: 191-217, 2022

Abstract

On 15 December 2020, the European Commission published its proposal for a Regulation on a Single Market for Digital Services (Digital Services Act). It carries out a regulatory overhaul of the 21-year-old horizontal rules on intermediary liability in the e-Commerce Directive and introduces new due diligence obligations for intermediary services. Our analysis illuminates an important point that has so far received little attention: how would the Digital Services Act’s rules interact with existing sector-specific lex specialis rules? In this paper, we look specifically at the intersection of the Digital Services Act with the regime for online content sharing service providers (OCSSPs) set forth in art. 17 of Directive (EU) 2019/790 on copyright in the Digital Single Market (CDSM Directive). At first glance, these regimes do not appear to overlap as the rules on copyright are lex specialis to the Digital Services Act. A closer look shows a more complex and nuanced picture. Our analysis concludes that the DSA will apply to OCSSPs insofar as it contains rules that regulate matters not covered by art. 17 CDSM Directive, as well as specific rules on matters where art. 17 leaves a margin of discretion to Member States. This includes, to varying degrees, rules in the DSA relating to the liability of intermediary providers and to due diligence obligations for online platforms of different sizes. Importantly, we consider that such rules apply even where art. 17 CDSM Directive contains specific (but less precise) regulation on the matter. From a normative perspective, this might be a desirable outcome, to the extent that the DSA aims to establish “uniform rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected”. Based on our analysis, we suggest a number of clarifications that might help achieve that goal.

Art. 17 CDSM Directive, Content moderation, Copyright, Digital services act, Online platforms

Bibtex

@article{Quintais2021e, title = {The Interplay between the Digital Services Act and Sector Regulation: How Special is Copyright?}, author = {Quintais, J. and Schwemer, S.}, url = {https://www.ivir.nl/ejrr_2022/}, doi = {10.1017/err.2022.1}, year = {2022}, date = {2022-03-14}, journal = {European Journal of Risk Regulation}, volume = {13}, issue = {2}, pages = {191-217}, abstract = {On 15 December 2020, the European Commission published its proposal for a Regulation on a Single Market for Digital Services (Digital Services Act). It carries out a regulatory overhaul of the 21-year-old horizontal rules on intermediary liability in the e-Commerce Directive and introduces new due diligence obligations for intermediary services. Our analysis illuminates an important point that has so far received little attention: how would the Digital Services Act’s rules interact with existing sector-specific lex specialis rules? In this paper, we look specifically at the intersection of the Digital Services Act with the regime for online content sharing service providers (OCSSPs) set forth in art. 17 of Directive (EU) 2019/790 on copyright in the Digital Single Market (CDSM Directive). At first glance, these regimes do not appear to overlap as the rules on copyright are lex specialis to the Digital Services Act. A closer look shows a more complex and nuanced picture. Our analysis concludes that the DSA will apply to OCSSPs insofar as it contains rules that regulate matters not covered by art. 17 CDSM Directive, as well as specific rules on matters where art. 17 leaves a margin of discretion to Member States. This includes, to varying degrees, rules in the DSA relating to the liability of intermediary providers and to due diligence obligations for online platforms of different sizes. Importantly, we consider that such rules apply even where art. 17 CDSM Directive contains specific (but less precise) regulation on the matter. From a normative perspective, this might be a desirable outcome, to the extent that the DSA aims to establish “uniform rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected”. Based on our analysis, we suggest a number of clarifications that might help achieve that goal.}, keywords = {Art. 17 CDSM Directive, Content moderation, Copyright, Digital services act, Online platforms}, }