“Must-carry”, Special Treatment and Freedom of Expression on Online Platforms: A European Story

Kuczerawy, A. & Quintais, J.
2024

Abstract

This paper examines the role of "must-carry" obligations in the regulation of online platforms, arguing that these obligations are better understood as special treatment rules rather than direct analogues of traditional broadcasting regulation. By analysing the development of such rules within the European Union, particularly through the Digital Services Act (DSA) and the European Media Freedom Act (EMFA), the paper explores how these provisions aim to safeguard freedom of expression, ensure access to trustworthy information, enhance media pluralism, and regulate platform behaviour. The analysis extends to national-level laws and court decisions in Germany, the Netherlands, the United Kingdom, and Poland, illustrating how these countries have grappled with similar challenges in applying and contextualizing special treatment rules. Through a detailed examination of these frameworks, the paper critiques the risks of these rules, including their potential to entrench power imbalances, amplify state narratives, and complicate efforts to counter disinformation. Additionally, the paper highlights the broader implications of granting privileged status to legacy media and political actors, questioning whether such measures align with democratic principles and the rule of law. Ultimately, the paper argues that while these rules may offer a response to platform dominance, their implementation risks undermining the equality of speech and shifting the focus of freedom of expression toward a privilege for select groups.

Content moderation, Digital services act, EU law, European Media Freedom Act, must carry, platform regulation

Bibtex

@misc{Kuczerawy2024MustCarry,
  title    = {“Must-carry”, Special Treatment and Freedom of Expression on Online Platforms: A European Story},
  author   = {Kuczerawy, A. and Quintais, J.},
  url      = {https://papers.ssrn.com/sol3/papers.cfm?abstract_id=5064244},
  year     = {2024},
  date     = {2024-12-19},
  keywords = {Content moderation, Digital services act, EU law, European Media Freedom Act, must carry, platform regulation},
}

Contesting personalized recommender systems: a cross-country analysis of user preferences

Starke, C., Metikoš, L., Helberger, N. & Vreese, C.H. de
Information, Communication & Society, 2024

Abstract

Very Large Online Platforms (VLOPs) such as Instagram, TikTok, and YouTube wield substantial influence over digital information flows using sophisticated algorithmic recommender systems (RS). As these systems curate personalized content, concerns have emerged about their propensity to amplify polarizing or inappropriate content, spread misinformation, and infringe on users’ privacy. To address these concerns, the European Union (EU) has recently introduced a new regulatory framework through the Digital Services Act (DSA). These proposed policies are designed to bolster user agency by offering contestability mechanisms against personalized RS. As their effectiveness ultimately requires individual users to take specific actions, this empirical study investigates users’ intention to contest personalized RS. The results of a pre-registered survey across six countries – Brazil, Germany, Japan, South Korea, the UK, and the USA – involving 6,217 respondents yield key insights: (1) Approximately 20% of users would opt out of using personalized RS, (2) the intention for algorithmic contestation is associated with individual characteristics such as users’ attitudes towards and awareness of personalized RS as well as their privacy concerns, (3) German respondents are particularly inclined to contest personalized RS. We conclude that amending Art. 38 of the DSA may contribute to leveraging its effectiveness in fostering accessible user contestation and algorithmic transparency.

Algorithmic contestation, Digital services act, Personalisation, recommender systems

Bibtex

@article{Starke2024Contesting,
  title    = {Contesting personalized recommender systems: a cross-country analysis of user preferences},
  author   = {Starke, C. and Metikoš, L. and Helberger, N. and Vreese, C.H. de},
  url      = {https://www.tandfonline.com/doi/full/10.1080/1369118X.2024.2363926},
  doi      = {10.1080/1369118X.2024.2363926},
  year     = {2024},
  date     = {2024-07-03},
  journal  = {Information, Communication \& Society},
  keywords = {Algorithmic contestation, Digital services act, Personalisation, recommender systems},
}

Copyright Content Moderation in the European Union: State of the Art, Ways Forward and Policy Recommendations

Quintais, J., Katzenbach, C., Schwemer, S., Dergacheva, D., Riis, T., Mezei, P., Harkai, I. & Magalhães, J.C.
IIC, vol. 55, pp: 157-177, 2024

Abstract

This Opinion describes and summarises the results of the interdisciplinary research carried out by the authors during the course of a three-year project on intermediaries’ practices regarding copyright content moderation. This research includes the mapping of the EU legal framework and intermediaries’ practices regarding copyright content moderation, the evaluation and measuring of the impact of moderation practices and technologies on access and diversity, and a set of policy recommendations. Our recommendations touch on the following topics: the definition of “online content-sharing service provider”; the recognition and operationalisation of user rights; the complementary nature of complaint and redress safeguards; the scope of permissible preventive filtering; the clarification of the relationship between Art. 17 of the new Copyright Directive and the Digital Services Act; monetisation and restrictive content moderation actions; recommender systems and copyright content moderation; transparency and data access for researchers; trade secret protection and transparency of content moderation systems; the relationship between the copyright acquis, the Digital Services Act and the upcoming Artificial Intelligence Act; and human competences in copyright content moderation.

Content moderation, Copyright, Digital services act, Digital Single Market, intermediaries, Platforms

Bibtex

@article{Quintais2024CopyrightModeration,
  title    = {Copyright Content Moderation in the European Union: State of the Art, Ways Forward and Policy Recommendations},
  author   = {Quintais, J. and Katzenbach, C. and Schwemer, S. and Dergacheva, D. and Riis, T. and Mezei, P. and Harkai, I. and Magalhães, J.C.},
  url      = {https://link.springer.com/article/10.1007/s40319-023-01409-5},
  doi      = {10.1007/s40319-023-01409-5},
  year     = {2024},
  date     = {2024-01-01},
  journal  = {IIC},
  volume   = {55},
  pages    = {157--177},
  keywords = {Content moderation, Copyright, Digital services act, Digital Single Market, intermediaries, Platforms},
}

Using Terms and Conditions to apply Fundamental Rights to Content Moderation

Quintais, J., Appelman, N. & Fahy, R.
German Law Journal, 2023

Abstract

Large online platforms provide an unprecedented means for exercising freedom of expression online and wield enormous power over public participation in the online democratic space. However, it is increasingly clear that their systems, where (automated) content moderation decisions are taken based on a platform's terms and conditions (T&Cs), are fundamentally broken. Content moderation systems have been said to undermine freedom of expression, especially where important public interest speech ends up suppressed, such as speech by minority and marginalized groups. Indeed, these content moderation systems have been criticized for their overly vague rules of operation, inconsistent enforcement, and an overdependence on automation. Therefore, in order to better protect freedom of expression online, international human rights bodies and civil society organizations have argued that platforms “should incorporate directly” principles of fundamental rights law into their T&Cs. Under EU law, and apart from a rule in the Terrorist Content Regulation, platforms had until recently no explicit obligation to incorporate fundamental rights into their T&Cs. However, an important provision in the Digital Services Act (DSA) will change this. Crucially, Article 14 DSA lays down new rules on how platforms can enforce their T&Cs, including that platforms must have “due regard” to the “fundamental rights” of users under the EU Charter of Fundamental Rights. In this article, we critically examine the topic of enforceability of fundamental rights via T&Cs through the prism of Article 14 DSA. We ask whether this provision requires platforms to apply EU fundamental rights law and to what extent this may curb the power of Big Tech over online speech. We conclude that Article 14 will make it possible, in principle, to establish the indirect horizontal effect of fundamental rights in the relationship between online platforms and their users. But in order for the application and enforcement of T&Cs to take due regard of fundamental rights, Article 14 must be operationalized within the framework of the international and European fundamental rights standards. If this is possible, Article 14 may fulfil its revolutionary potential.

Content moderation, Digital services act, Freedom of expression, Online platforms, platform regulation, terms and conditions

Bibtex

@article{Quintais2023TermsConditions,
  title    = {Using Terms and Conditions to apply Fundamental Rights to Content Moderation},
  author   = {Quintais, J. and Appelman, N. and Fahy, R.},
  doi      = {10.1017/glj.2023.53},
  year     = {2023},
  date     = {2023-07-11},
  journal  = {German Law Journal},
  keywords = {Content moderation, Digital services act, Freedom of expression, Online platforms, platform regulation, terms and conditions},
}

Improving Data Access for Researchers in the Digital Services Act

Dergacheva, D., Katzenbach, C., Schwemer, S. & Quintais, J.
2023

Abstract

Joint submission in response to the Call for Evidence on the Delegated Regulation on data access provided for in the Digital Services Act (DSA). Article 40 DSA is a crucial provision to operationalize the regulation’s risk mitigation provisions vis-a-vis very large online platforms (VLOPs) and very large search engines (VLOSEs). In essence, Article 40 DSA enables data access to Digital Services Coordinators (DSCs) or the Commission, “vetted researchers” and other researchers, provided certain conditions are met. Our submission is predominantly concerned with data access for vetted researchers and researchers in relation to VLOPs.

academic research, data access, Digital services act, Online platforms

Bibtex

@misc{Dergacheva2023DataAccess,
  title    = {Improving Data Access for Researchers in the Digital Services Act},
  author   = {Dergacheva, D. and Katzenbach, C. and Schwemer, S. and Quintais, J.},
  url      = {https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4465846},
  year     = {2023},
  date     = {2023-06-01},
  keywords = {academic research, data access, Digital services act, Online platforms},
}

Copyright Content Moderation in the EU: Conclusions and Recommendations

Quintais, J., Katzenbach, C., Schwemer, S., Dergacheva, D., Riis, T., Mezei, P. & Harkai, I.
2023

Abstract

This report is a deliverable in the reCreating Europe project. The report describes and summarizes the results of our research on the mapping of the EU legal framework and intermediaries’ practices on copyright content moderation and removal. In particular, this report summarizes the results of our previous deliverables and tasks, namely: (1) our Final Report on mapping of EU legal framework and intermediaries’ practices on copyright content moderation and removal; and (2) our Final Evaluation and Measuring Report - impact of moderation practices and technologies on access and diversity. Our previous reports contain a detailed description of the legal and empirical methodology underpinning our research and findings. This report focuses on bringing together these findings in a concise format and advancing policy recommendations.

Content moderation, Copyright, Digital services act, Digital Single Market, intermediaries, Online platforms, terms and conditions

Bibtex

@techreport{Quintais2023Conclusions,
  title    = {Copyright Content Moderation in the EU: Conclusions and Recommendations},
  author   = {Quintais, J. and Katzenbach, C. and Schwemer, S. and Dergacheva, D. and Riis, T. and Mezei, P. and Harkai, I.},
  url      = {https://www.ivir.nl/publications/copyright-content-moderation-in-the-eu-conclusions-and-recommendations/ssrn-id4403423/},
  year     = {2023},
  date     = {2023-03-30},
  keywords = {Content moderation, Copyright, Digital services act, Digital Single Market, intermediaries, Online platforms, terms and conditions},
}

Impact of content moderation practices and technologies on access and diversity

Schwemer, S., Katzenbach, C., Dergacheva, D., Riis, T. & Quintais, J.
2023

Abstract

This Report presents the results of research carried out as part of Work Package 6 “Intermediaries: Copyright Content Moderation and Removal at Scale in the Digital Single Market: What Impact on Access to Culture?” of the project “ReCreating Europe”, particularly on Tasks 6.3 (Evaluating Legal Frameworks on the Different Levels (EU vs. national, public vs. private)) and 6.4 (Measuring the impact of moderation practices and technologies on access and diversity). This work centers on a normative analysis of the existing public and private legal frameworks with regard to intermediaries and cultural diversity, and on the actual impact of intermediaries’ content moderation on diversity.

Content moderation, Copyright, Digital services act, Digital Single Market, intermediaries, Online platforms, terms and conditions

Bibtex

@techreport{Schwemer2023Impact,
  title    = {Impact of content moderation practices and technologies on access and diversity},
  author   = {Schwemer, S. and Katzenbach, C. and Dergacheva, D. and Riis, T. and Quintais, J.},
  url      = {https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4380345},
  year     = {2023},
  date     = {2023-03-23},
  keywords = {Content moderation, Copyright, Digital services act, Digital Single Market, intermediaries, Online platforms, terms and conditions},
}

How platforms govern users’ copyright-protected content: Exploring the power of private ordering and its implications

Quintais, J., De Gregorio, G. & Magalhães, J.C.
Computer Law & Security Review, vol. 48, 2023

Abstract

Online platforms provide primary points of access to information and other content in the digital age. They foster users’ ability to share ideas and opinions while offering opportunities for cultural and creative industries. In Europe, ownership and use of such expressions is partly governed by a complex web of legislation, sectoral self- and co-regulatory norms. To an important degree, it is also governed by private norms defined by contractual agreements and informal relationships between users and platforms. By adopting policies usually defined as Terms of Service and Community Guidelines, platforms almost unilaterally set use, moderation and enforcement rules, structures and practices (including through algorithmic systems) that govern the access and dissemination of protected content by their users. This private governance of essential means of access, dissemination and expression to (and through) creative content is hardly equitable, though. In fact, it is an expression of how platforms control what users – including users-creators – can say and disseminate online, and how they can monetise their content. As platform power grows, EU law is adjusting by moving towards enhancing the responsibility of platforms for content they host. One crucial example of this is Article 17 of the new Copyright Directive (2019/790), which fundamentally changes the regime and liability of “online content-sharing service providers” (OCSSPs). This complex regime, complemented by rules in the Digital Services Act, sets out a new environment for OCSSPs to design and carry out content moderation, as well as to define their contractual relationship with users, including creators. The latter relationship is characterized by significant power imbalance in favour of platforms, calling into question whether the law can and should do more to protect users-creators. This article addresses the power of large-scale platforms in EU law over their users’ copyright-protected content and its effects on the governance of that content, including on its exploitation and some of its implications for freedom of expression. Our analysis combines legal and empirical methods. We carry out doctrinal legal research to clarify the complex legal regime that governs platforms’ contractual obligations to users and content moderation activities, including the space available for private ordering, with a focus on EU law. From the empirical perspective, we conducted a thematic analysis of most versions of the Terms of Service published over time by the three largest social media platforms in number of users – Facebook, Instagram and YouTube – so as to identify and examine the rules these companies have established to regulate user-generated content, and the ways in which such provisions shifted in the past two decades. In so doing, we unveil how foundational this sort of regulation has always been to platforms’ functioning and how it contributes to defining a system of content exploitation.

CDSM Directive, Content moderation, Copyright, creators, Digital services act, online content, Online platforms, platform regulation, private ordering, terms of service

Bibtex

@article{Quintais2023PrivateOrdering,
  title    = {How platforms govern users’ copyright-protected content: Exploring the power of private ordering and its implications},
  author   = {Quintais, J. and De Gregorio, G. and Magalhães, J.C.},
  url      = {https://www.ivir.nl/publications/how-platforms-govern-users-copyright-protected-content-exploring-the-power-of-private-ordering-and-its-implications/computer_law_and_security_review_2023/},
  doi      = {10.1016/j.clsr.2023.105792},
  year     = {2023},
  date     = {2023-02-24},
  journal  = {Computer Law \& Security Review},
  volume   = {48},
  keywords = {CDSM Directive, Content moderation, Copyright, creators, Digital services act, online content, Online platforms, platform regulation, private ordering, terms of service},
}

Using Terms and Conditions to Apply Fundamental Rights to Content Moderation

Quintais, J., Appelman, N. & Fahy, R.
German Law Journal (forthcoming), 2022

Abstract

Large online platforms provide an unprecedented means for exercising freedom of expression online and wield enormous power over public participation in the online democratic space. However, it is increasingly clear that their systems, where (automated) content moderation decisions are taken based on a platform's terms and conditions (T&Cs), are fundamentally broken. Content moderation systems have been said to undermine freedom of expression, especially where important public interest speech ends up suppressed, such as speech by minority and marginalized groups. Indeed, these content moderation systems have been criticized for their overly vague rules of operation, inconsistent enforcement, and an overdependence on automation. Therefore, in order to better protect freedom of expression online, international human rights bodies and civil society organizations have argued that platforms “should incorporate directly” principles of fundamental rights law into their T&Cs. Under EU law, and apart from a rule in the Terrorist Content Regulation, platforms had until recently no explicit obligation to incorporate fundamental rights into their T&Cs. However, an important provision in the Digital Services Act (DSA) will change this. Crucially, Article 14 DSA lays down new rules on how platforms can enforce their T&Cs, including that platforms must have “due regard” to the “fundamental rights” of users under the EU Charter of Fundamental Rights. In this article, we critically examine the topic of enforceability of fundamental rights via T&Cs through the prism of Article 14 DSA. We ask whether this provision requires platforms to apply EU fundamental rights law and to what extent this may curb the power of Big Tech over online speech. We conclude that Article 14 will make it possible, in principle, to establish the indirect horizontal effect of fundamental rights in the relationship between online platforms and their users. But in order for the application and enforcement of T&Cs to take due regard of fundamental rights, Article 14 must be operationalized within the framework of the international and European fundamental rights standards, thereby allowing Article 14 to fulfil its revolutionary potential.

Content moderation, Digital services act, Freedom of expression, Online platforms, platform regulation, terms and conditions

Bibtex

@article{Quintais2022TermsConditions,
  title    = {Using Terms and Conditions to Apply Fundamental Rights to Content Moderation},
  author   = {Quintais, J. and Appelman, N. and Fahy, R.},
  url      = {https://osf.io/f2n7m/},
  year     = {2022},
  date     = {2022-11-25},
  journal  = {German Law Journal (forthcoming)},
  keywords = {Content moderation, Digital services act, Freedom of expression, Online platforms, platform regulation, terms and conditions},
}

European Copyright Society – Comment on Copyright and the Digital Services Act Proposal

Peukert, A., Husovec, M., Kretschmer, M., Mezei, P. & Quintais, J.
IIC - International Review of Intellectual Property and Competition Law, vol. 53, iss. 3, pp: 358-376, 2022

Auteursrecht, Digital services act, european copyright society, frontpage

Bibtex

@article{Peukert2022ECSComment,
  title    = {European Copyright Society – Comment on Copyright and the Digital Services Act Proposal},
  author   = {Peukert, A. and Husovec, M. and Kretschmer, M. and Mezei, P. and Quintais, J.},
  url      = {https://www.ivir.nl/iic_2022/},
  doi      = {10.1007/s40319-022-01154-1},
  year     = {2022},
  date     = {2022-03-14},
  journal  = {IIC - International Review of Intellectual Property and Competition Law},
  volume   = {53},
  number   = {3},
  pages    = {358--376},
  keywords = {Auteursrecht, Digital services act, european copyright society, frontpage},
}