How platforms govern users’ copyright-protected content: Exploring the power of private ordering and its implications

Quintais, J., De Gregorio, G. & Magalhães, J.C.
Computer Law & Security Review, vol. 48, 2023

Abstract

Online platforms provide primary points of access to information and other content in the digital age. They foster users’ ability to share ideas and opinions while offering opportunities for cultural and creative industries. In Europe, ownership and use of such expressions is partly governed by a complex web of legislation, sectoral self- and co-regulatory norms. To an important degree, it is also governed by private norms defined by contractual agreements and informal relationships between users and platforms. By adopting policies usually defined as Terms of Service and Community Guidelines, platforms almost unilaterally set use, moderation and enforcement rules, structures and practices (including through algorithmic systems) that govern the access and dissemination of protected content by their users. This private governance of essential means of access, dissemination and expression to (and through) creative content is hardly equitable, though. In fact, it is an expression of how platforms control what users – including users-creators – can say and disseminate online, and how they can monetise their content. As platform power grows, EU law is adjusting by moving towards enhancing the responsibility of platforms for content they host. One crucial example of this is Article 17 of the new Copyright Directive (2019/790), which fundamentally changes the regime and liability of “online content-sharing service providers” (OCSSPs). This complex regime, complemented by rules in the Digital Services Act, sets out a new environment for OCSSPs to design and carry out content moderation, as well as to define their contractual relationship with users, including creators. The latter relationship is characterized by significant power imbalance in favour of platforms, calling into question whether the law can and should do more to protect users-creators. This article addresses the power of large-scale platforms in EU law over their users’ copyright-protected content and its effects on the governance of that content, including on its exploitation and some of its implications for freedom of expression. Our analysis combines legal and empirical methods. We carry out doctrinal legal research to clarify the complex legal regime that governs platforms’ contractual obligations to users and content moderation activities, including the space available for private ordering, with a focus on EU law. From the empirical perspective, we conducted a thematic analysis of most versions of the Terms of Service published over time by the three largest social media platforms by number of users – Facebook, Instagram and YouTube – so as to identify and examine the rules these companies have established to regulate user-generated content, and the ways in which such provisions shifted in the past two decades. In so doing, we unveil how foundational this sort of regulation has always been to platforms’ functioning and how it contributes to defining a system of content exploitation.

CDSM Directive, Content moderation, Copyright, creators, Digital services act, online content, Online platforms, platform regulation, private ordering, terms of service

Bibtex

@article{nokey, title = {How platforms govern users’ copyright-protected content: Exploring the power of private ordering and its implications}, author = {Quintais, J. and De Gregorio, G. and Magalhães, J.C.}, url = {https://www.ivir.nl/publications/how-platforms-govern-users-copyright-protected-content-exploring-the-power-of-private-ordering-and-its-implications/computer_law_and_security_review_2023/}, doi = {10.1016/j.clsr.2023.105792}, year = {2023}, date = {2023-02-24}, journal = {Computer Law & Security Review}, volume = {48}, abstract = {Online platforms provide primary points of access to information and other content in the digital age. They foster users’ ability to share ideas and opinions while offering opportunities for cultural and creative industries. In Europe, ownership and use of such expressions is partly governed by a complex web of legislation, sectoral self- and co-regulatory norms. To an important degree, it is also governed by private norms defined by contractual agreements and informal relationships between users and platforms. By adopting policies usually defined as Terms of Service and Community Guidelines, platforms almost unilaterally set use, moderation and enforcement rules, structures and practices (including through algorithmic systems) that govern the access and dissemination of protected content by their users. This private governance of essential means of access, dissemination and expression to (and through) creative content is hardly equitable, though. In fact, it is an expression of how platforms control what users – including users-creators – can say and disseminate online, and how they can monetise their content. As platform power grows, EU law is adjusting by moving towards enhancing the responsibility of platforms for content they host. One crucial example of this is Article 17 of the new Copyright Directive (2019/790), which fundamentally changes the regime and liability of “online content-sharing service providers” (OCSSPs). This complex regime, complemented by rules in the Digital Services Act, sets out a new environment for OCSSPs to design and carry out content moderation, as well as to define their contractual relationship with users, including creators. The latter relationship is characterized by significant power imbalance in favour of platforms, calling into question whether the law can and should do more to protect users-creators. This article addresses the power of large-scale platforms in EU law over their users’ copyright-protected content and its effects on the governance of that content, including on its exploitation and some of its implications for freedom of expression. Our analysis combines legal and empirical methods. We carry out doctrinal legal research to clarify the complex legal regime that governs platforms’ contractual obligations to users and content moderation activities, including the space available for private ordering, with a focus on EU law. From the empirical perspective, we conducted a thematic analysis of most versions of the Terms of Service published over time by the three largest social media platforms by number of users – Facebook, Instagram and YouTube – so as to identify and examine the rules these companies have established to regulate user-generated content, and the ways in which such provisions shifted in the past two decades. In so doing, we unveil how foundational this sort of regulation has always been to platforms’ functioning and how it contributes to defining a system of content exploitation.}, keywords = {CDSM Directive, Content moderation, Copyright, creators, Digital services act, online content, Online platforms, platform regulation, private ordering, terms of service}, }

Using Terms and Conditions to Apply Fundamental Rights to Content Moderation

Quintais, J., Appelman, N. & Fahy, R.
German Law Journal (forthcoming), 2022

Abstract

Large online platforms provide an unprecedented means for exercising freedom of expression online and wield enormous power over public participation in the online democratic space. However, it is increasingly clear that their systems, where (automated) content moderation decisions are taken based on a platform's terms and conditions (T&Cs), are fundamentally broken. Content moderation systems have been said to undermine freedom of expression, especially where important public interest speech ends up suppressed, such as speech by minority and marginalized groups. Indeed, these content moderation systems have been criticized for their overly vague rules of operation, inconsistent enforcement, and an overdependence on automation. Therefore, in order to better protect freedom of expression online, international human rights bodies and civil society organizations have argued that platforms “should incorporate directly” principles of fundamental rights law into their T&Cs. Under EU law, and apart from a rule in the Terrorist Content Regulation, platforms had until recently no explicit obligation to incorporate fundamental rights into their T&Cs. However, an important provision in the Digital Services Act (DSA) will change this. Crucially, Article 14 DSA lays down new rules on how platforms can enforce their T&Cs, including that platforms must have “due regard” to the “fundamental rights” of users under the EU Charter of Fundamental Rights. In this article, we critically examine the topic of enforceability of fundamental rights via T&Cs through the prism of Article 14 DSA. We ask whether this provision requires platforms to apply EU fundamental rights law and to what extent this may curb the power of Big Tech over online speech. We conclude that Article 14 will make it possible, in principle, to establish the indirect horizontal effect of fundamental rights in the relationship between online platforms and their users. But in order for the application and enforcement of T&Cs to take due regard of fundamental rights, Article 14 must be operationalized within the framework of international and European fundamental rights standards, thereby allowing it to fulfil its revolutionary potential.

Content moderation, Digital services act, Freedom of expression, Online platforms, platform regulation, terms and conditions

Bibtex

@article{nokey, title = {Using Terms and Conditions to Apply Fundamental Rights to Content Moderation}, author = {Quintais, J. and Appelman, N. and Fahy, R.}, url = {https://osf.io/f2n7m/}, year = {2022}, date = {2022-11-25}, journal = {German Law Journal (forthcoming)}, abstract = {Large online platforms provide an unprecedented means for exercising freedom of expression online and wield enormous power over public participation in the online democratic space. However, it is increasingly clear that their systems, where (automated) content moderation decisions are taken based on a platform's terms and conditions (T&Cs), are fundamentally broken. Content moderation systems have been said to undermine freedom of expression, especially where important public interest speech ends up suppressed, such as speech by minority and marginalized groups. Indeed, these content moderation systems have been criticized for their overly vague rules of operation, inconsistent enforcement, and an overdependence on automation. Therefore, in order to better protect freedom of expression online, international human rights bodies and civil society organizations have argued that platforms “should incorporate directly” principles of fundamental rights law into their T&Cs. Under EU law, and apart from a rule in the Terrorist Content Regulation, platforms had until recently no explicit obligation to incorporate fundamental rights into their T&Cs. However, an important provision in the Digital Services Act (DSA) will change this. Crucially, Article 14 DSA lays down new rules on how platforms can enforce their T&Cs, including that platforms must have “due regard” to the “fundamental rights” of users under the EU Charter of Fundamental Rights. In this article, we critically examine the topic of enforceability of fundamental rights via T&Cs through the prism of Article 14 DSA. We ask whether this provision requires platforms to apply EU fundamental rights law and to what extent this may curb the power of Big Tech over online speech. We conclude that Article 14 will make it possible, in principle, to establish the indirect horizontal effect of fundamental rights in the relationship between online platforms and their users. But in order for the application and enforcement of T&Cs to take due regard of fundamental rights, Article 14 must be operationalized within the framework of international and European fundamental rights standards, thereby allowing it to fulfil its revolutionary potential.}, keywords = {Content moderation, Digital services act, Freedom of expression, Online platforms, platform regulation, terms and conditions}, }

Article 12 DSA: Will platforms be required to apply EU fundamental rights in content moderation decisions?

Quintais, J., Appelman, N. & Fahy, R.
DSA Observatory, 2021

Content moderation, Digital services act, DSA, frontpage, Fundamental rights

Bibtex

@online{Quintais2021f, title = {Article 12 DSA: Will platforms be required to apply EU fundamental rights in content moderation decisions?}, author = {Quintais, J. and Appelman, N. and Fahy, R.}, url = {https://dsa-observatory.eu/2021/05/31/article-12-dsa-will-platforms-be-required-to-apply-eu-fundamental-rights-in-content-moderation-decisions/}, year = {2021}, date = {2021-05-31}, keywords = {Content moderation, Digital services act, DSA, frontpage, Fundamental rights}, }

The Interplay between the Digital Services Act and Sector Regulation: How Special is Copyright?

Quintais, J. & Schwemer, S.
European Journal of Risk Regulation, vol. 13, iss. 2, pp: 191-217, 2022

Abstract

On 15 December 2020, the European Commission published its proposal for a Regulation on a Single Market for Digital Services (Digital Services Act). It carries out a regulatory overhaul of the 21-year-old horizontal rules on intermediary liability in the E-Commerce Directive and introduces new due diligence obligations for intermediary services. Our analysis illuminates an important point that has so far received little attention: how would the Digital Services Act’s rules interact with existing sector-specific lex specialis rules? In this paper, we look specifically at the intersection of the Digital Services Act with the regime for online content sharing service providers (OCSSPs) set forth in art. 17 of Directive (EU) 2019/790 on copyright in the Digital Single Market (CDSM Directive). At first glance, these regimes do not appear to overlap as the rules on copyright are lex specialis to the Digital Services Act. A closer look shows a more complex and nuanced picture. Our analysis concludes that the DSA will apply to OCSSPs insofar as it contains rules that regulate matters not covered by art. 17 CDSM Directive, as well as specific rules on matters where art. 17 leaves a margin of discretion to Member States. This includes, to varying degrees, rules in the DSA relating to the liability of intermediary providers and to due diligence obligations for online platforms of different sizes. Importantly, we consider that such rules apply even where art. 17 CDSM Directive contains specific (but less precise) regulation on the matter. From a normative perspective, this might be a desirable outcome, to the extent that the DSA aims to establish “uniform rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected”. Based on our analysis, we suggest a number of clarifications that might help achieve that goal.

Art. 17 CDSM Directive, Content moderation, Copyright, Digital services act, frontpage, Online platforms

Bibtex

@article{Quintais2021e, title = {The Interplay between the Digital Services Act and Sector Regulation: How Special is Copyright?}, author = {Quintais, J. and Schwemer, S.}, url = {https://www.ivir.nl/ejrr_2022/}, doi = {10.1017/err.2022.1}, year = {2022}, date = {2022-03-14}, journal = {European Journal of Risk Regulation}, volume = {13}, issue = {2}, pages = {191-217}, abstract = {On 15 December 2020, the European Commission published its proposal for a Regulation on a Single Market for Digital Services (Digital Services Act). It carries out a regulatory overhaul of the 21-year-old horizontal rules on intermediary liability in the E-Commerce Directive and introduces new due diligence obligations for intermediary services. Our analysis illuminates an important point that has so far received little attention: how would the Digital Services Act’s rules interact with existing sector-specific lex specialis rules? In this paper, we look specifically at the intersection of the Digital Services Act with the regime for online content sharing service providers (OCSSPs) set forth in art. 17 of Directive (EU) 2019/790 on copyright in the Digital Single Market (CDSM Directive). At first glance, these regimes do not appear to overlap as the rules on copyright are lex specialis to the Digital Services Act. A closer look shows a more complex and nuanced picture. Our analysis concludes that the DSA will apply to OCSSPs insofar as it contains rules that regulate matters not covered by art. 17 CDSM Directive, as well as specific rules on matters where art. 17 leaves a margin of discretion to Member States. This includes, to varying degrees, rules in the DSA relating to the liability of intermediary providers and to due diligence obligations for online platforms of different sizes. Importantly, we consider that such rules apply even where art. 17 CDSM Directive contains specific (but less precise) regulation on the matter. From a normative perspective, this might be a desirable outcome, to the extent that the DSA aims to establish “uniform rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected”. Based on our analysis, we suggest a number of clarifications that might help achieve that goal.}, keywords = {Art. 17 CDSM Directive, Content moderation, Copyright, Digital services act, frontpage, Online platforms}, }

Intermediary Liability and Trade Mark Infringement – Proliferation of Filter Obligations in Civil Law Jurisdictions?

Senftleben, M.
pp: 381-403, 2020

Abstract

The erosion of the safe harbour for hosting in the EU Directive on Copyright in the Digital Single Market (CDSM Directive) leads to a remarkable climate change in the field of EU copyright law and the civil law jurisdictions of continental EU Member States. Inevitably, it raises the question of potential repercussions on the safe harbour for hosting and filtering standards in trademark cases. Even though online marketplaces are explicitly exempted from the new copyright rules and the CDSM Directive is not intended to neutralize the safe harbour for hosting in trademark cases, the adoption of a more restrictive approach in copyright law may quicken the appetite of trademark proprietors for similar measures in trademark law. The extension of the new copyright approach to trademark cases, however, is unlikely to yield satisfactory results. Due to the different conceptual contours of trademark rights, a system mimicking the filtering obligations following from the CDSM Directive would give trademark proprietors excessive control over the use of their trademarks in the digital environment. Such an overbroad system of automated, algorithmic filtering would encroach upon the fundamental guarantee of freedom of expression and freedom of competition. It is likely to have a chilling effect on legitimate descriptive use of trademarks, comparative advertising, advertising by resellers, information about alternative offers in the marketplace, and use criticizing or commenting upon trademarked products. As a result, consumers would receive less diverse information on goods and services and the free movement of goods and services in the internal market would be curtailed. The reliability of the internet as an independent source of trademark-related information would be put at risk. The analysis, thus, leads to the insight that a proliferation of the new filtering obligations in copyright law is undesirable and should be avoided.

algorithmic enforcement, confusion, Content moderation, descriptive use, dilution, exhaustion of trademark rights, filtering obligations, free movement of goods and services, freedom of commercial expression, freedom of competition, frontpage, market transparency, Merkenrecht, parallel imports, platform economy

Bibtex

@incollection{Senftleben2020g, title = {Intermediary Liability and Trade Mark Infringement – Proliferation of Filter Obligations in Civil Law Jurisdictions?}, author = {Senftleben, M.}, url = {https://www.ivir.nl/publicaties/download/Intermediary_Liability_and_Trade_Mark_Infringement.pdf https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3736919 https://www.oxfordhandbooks.com/view/10.1093/oxfordhb/9780198837138.001.0001/oxfordhb-9780198837138}, year = {2020}, date = {2020-11-26}, abstract = {The erosion of the safe harbour for hosting in the EU Directive on Copyright in the Digital Single Market (CDSM Directive) leads to a remarkable climate change in the field of EU copyright law and the civil law jurisdictions of continental EU Member States. Inevitably, it raises the question of potential repercussions on the safe harbour for hosting and filtering standards in trademark cases. Even though online marketplaces are explicitly exempted from the new copyright rules and the CDSM Directive is not intended to neutralize the safe harbour for hosting in trademark cases, the adoption of a more restrictive approach in copyright law may quicken the appetite of trademark proprietors for similar measures in trademark law. The extension of the new copyright approach to trademark cases, however, is unlikely to yield satisfactory results. Due to the different conceptual contours of trademark rights, a system mimicking the filtering obligations following from the CDSM Directive would give trademark proprietors excessive control over the use of their trademarks in the digital environment. Such an overbroad system of automated, algorithmic filtering would encroach upon the fundamental guarantee of freedom of expression and freedom of competition. It is likely to have a chilling effect on legitimate descriptive use of trademarks, comparative advertising, advertising by resellers, information about alternative offers in the marketplace, and use criticizing or commenting upon trademarked products. As a result, consumers would receive less diverse information on goods and services and the free movement of goods and services in the internal market would be curtailed. The reliability of the internet as an independent source of trademark-related information would be put at risk. The analysis, thus, leads to the insight that a proliferation of the new filtering obligations in copyright law is undesirable and should be avoided.}, keywords = {algorithmic enforcement, confusion, Content moderation, descriptive use, dilution, exhaustion of trademark rights, filtering obligations, free movement of goods and services, freedom of commercial expression, freedom of competition, frontpage, market transparency, Merkenrecht, parallel imports, platform economy}, }

The Odyssey of the Prohibition on General Monitoring Obligations on the Way to the Digital Services Act: Between Article 15 of the E-Commerce Directive and Article 17 of the Directive on Copyright in the Digital Single Market

Senftleben, M. & Angelopoulos, C.
2020

Abstract

EU law provides explicitly that intermediaries may not be obliged to monitor their service in a general manner in order to detect and prevent the illegal activity of their users. However, a misunderstanding of the difference between monitoring specific content and monitoring FOR specific content is a recurrent theme in the debate on intermediary liability and a central driver of the controversy surrounding it. Rightly understood, a prohibited general monitoring obligation arises whenever content – no matter how specifically it is defined – must be identified among the totality of the content on a platform. The moment platform content must be screened in its entirety, the monitoring obligation acquires an excessive, general nature. Against this background, a content moderation duty can only be deemed permissible if it is specific in respect of both the protected subject matter and potential infringers. This requirement of 'double specificity' is of particular importance because it prevents encroachments upon fundamental rights. The jurisprudence of the Court of Justice of the European Union has shed light on the anchorage of the general monitoring ban in primary EU law, in particular the right to the protection of personal data, the freedom of expression and information, the freedom to conduct a business, and the free movement of goods and services in the internal market. Due to their higher rank in the norm hierarchy, these legal guarantees constitute common ground for the application of the general monitoring prohibition in secondary EU legislation, namely Article 15(1) of the E-Commerce Directive ('ECD') and Article 17(8) of the Directive on Copyright in the Digital Single Market ('CDSMD'). With regard to the Digital Services Act (‘DSA’), this result of the analysis implies that any further manifestation of the general monitoring ban in the DSA would have to be construed and applied – in the light of applicable CJEU case law – as a safeguard against encroachments upon the aforementioned fundamental rights and freedoms. If the final text of the DSA does not contain a reiteration of the prohibition of general monitoring obligations known from Article 15(1) ECD and Article 17(8) CDSMD, the regulation of internet service provider liability, duties of care and injunctions would still have to avoid inroads into the aforementioned fundamental rights and freedoms and observe the principle of proportionality. The double specificity requirement plays a central role in this respect.

algorithmic enforcement, Auteursrecht, censorship, Content moderation, Copyright, defamation, Digital services act, filtering, Freedom of expression, frontpage, general monitoring, hosting service, injunctive relief, intermediary liability, notice and stay down, notice and take down, safe harbour, trade mark, user-generated content

Bibtex

@techreport{Senftleben2020e, title = {The Odyssey of the Prohibition on General Monitoring Obligations on the Way to the Digital Services Act: Between Article 15 of the E-Commerce Directive and Article 17 of the Directive on Copyright in the Digital Single Market}, author = {Senftleben, M. and Angelopoulos, C.}, url = {https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3717022}, year = {2020}, date = {2020-10-29}, abstract = {EU law provides explicitly that intermediaries may not be obliged to monitor their service in a general manner in order to detect and prevent the illegal activity of their users. However, a misunderstanding of the difference between monitoring specific content and monitoring FOR specific content is a recurrent theme in the debate on intermediary liability and a central driver of the controversy surrounding it. Rightly understood, a prohibited general monitoring obligation arises whenever content – no matter how specifically it is defined – must be identified among the totality of the content on a platform. The moment platform content must be screened in its entirety, the monitoring obligation acquires an excessive, general nature. Against this background, a content moderation duty can only be deemed permissible if it is specific in respect of both the protected subject matter and potential infringers. This requirement of 'double specificity' is of particular importance because it prevents encroachments upon fundamental rights. The jurisprudence of the Court of Justice of the European Union has shed light on the anchorage of the general monitoring ban in primary EU law, in particular the right to the protection of personal data, the freedom of expression and information, the freedom to conduct a business, and the free movement of goods and services in the internal market. Due to their higher rank in the norm hierarchy, these legal guarantees constitute common ground for the application of the general monitoring prohibition in secondary EU legislation, namely Article 15(1) of the E-Commerce Directive ('ECD') and Article 17(8) of the Directive on Copyright in the Digital Single Market ('CDSMD'). With regard to the Digital Services Act (‘DSA’), this result of the analysis implies that any further manifestation of the general monitoring ban in the DSA would have to be construed and applied – in the light of applicable CJEU case law – as a safeguard against encroachments upon the aforementioned fundamental rights and freedoms. If the final text of the DSA does not contain a reiteration of the prohibition of general monitoring obligations known from Article 15(1) ECD and Article 17(8) CDSMD, the regulation of internet service provider liability, duties of care and injunctions would still have to avoid inroads into the aforementioned fundamental rights and freedoms and observe the principle of proportionality. The double specificity requirement plays a central role in this respect.}, keywords = {algorithmic enforcement, Auteursrecht, censorship, Content moderation, Copyright, defamation, Digital services act, filtering, Freedom of expression, frontpage, general monitoring, hosting service, injunctive relief, intermediary liability, notice and stay down, notice and take down, safe harbour, trade mark, user-generated content}, }

Selected Aspects of Implementing Article 17 of the Directive on Copyright in the Digital Single Market into National Law – Comment of the European Copyright Society external link

Metzger, A., Senftleben, M., Derclaye, E., Dreier, T., Geiger, C., Griffiths, J., Hilty, R., Hugenholtz, P., Riis, T., Rognstad, O.A., Strowel, A.M., Synodinou, T. & Xalabarder, R.
2020

Abstract

The national implementation of Article 17 of the Directive on Copyright in the Digital Single Market (DSMD) poses particular challenges. Article 17 is one of the most complex – and most controversial – provisions of the new legislative package which EU Member States must transpose into national law by 7 June 2021. Seeking to contribute to the debate on implementation options, the European Copyright Society addresses several core aspects of Article 17 that may play an important role in the national implementation process. It deals with the concept of online content-sharing service providers (OCSSPs) before embarking on a discussion of the licensing and content moderation duties which OCSSPs must fulfil in accordance with Article 17(1) and (4). The analysis also focuses on the copyright limitations mentioned in Article 17(7) that support the creation and dissemination of transformative user-generated content (UGC). It also discusses the appropriate configuration of complaint and redress mechanisms set forth in Article 17(9) that seek to reduce the risk of unjustified content removals. Finally, the European Copyright Society addresses the possibility of implementing direct remuneration claims for authors and performers, and explores the private international law aspect of applicable law – an impact factor that is often overlooked in the debate.

algorithmic enforcement, applicable law, collective copyright management, content hosting, Content moderation, copyright contract law, EU copyright law, filtering mechanisms, Freedom of expression, Licensing, notice-and-takedown, private international law, transformative use, user-generated content

Bibtex

@article{Metzger2020, title = {Selected Aspects of Implementing Article 17 of the Directive on Copyright in the Digital Single Market into National Law – Comment of the European Copyright Society}, author = {Metzger, A. and Senftleben, M. and Derclaye, E. and Dreier, T. and Geiger, C. and Griffiths, J. and Hilty, R. and Hugenholtz, P. and Riis, T. and Rognstad, O.A. and Strowel, A.M. and Synodinou, T. and Xalabarder, R.}, url = {https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3589323}, year = {2020}, date = {2020-05-07}, abstract = {The national implementation of Article 17 of the Directive on Copyright in the Digital Single Market (DSMD) poses particular challenges. Article 17 is one of the most complex – and most controversial – provisions of the new legislative package which EU Member States must transpose into national law by 7 June 2021. Seeking to contribute to the debate on implementation options, the European Copyright Society addresses several core aspects of Article 17 that may play an important role in the national implementation process. It deals with the concept of online content-sharing service providers (OCSSPs) before embarking on a discussion of the licensing and content moderation duties which OCSSPs must fulfil in accordance with Article 17(1) and (4). The analysis also focuses on the copyright limitations mentioned in Article 17(7) that support the creation and dissemination of transformative user-generated content (UGC). It also discusses the appropriate configuration of complaint and redress mechanisms set forth in Article 17(9) that seek to reduce the risk of unjustified content removals. Finally, the European Copyright Society addresses the possibility of implementing direct remuneration claims for authors and performers, and explores the private international law aspect of applicable law – an impact factor that is often overlooked in the debate.}, keywords = {algorithmic enforcement, applicable law, collective copyright management, content hosting, Content moderation, copyright contract law, EU copyright law, filtering mechanisms, Freedom of expression, Licensing, notice-and-takedown, private international law, transformative use, user-generated content}, }