How the EU Outsources the Task of Human Rights Protection to Platforms and Users: The Case of UGC Monetization

Senftleben, M., Quintais, J. & Meiring, A.
Berkeley Technology Law Journal, vol. 38, iss. 3, pp. 933-1010, 2024

Abstract

With the shift from the traditional safe harbor for hosting to statutory content filtering and licensing obligations, EU copyright law has substantially curtailed the freedom of users to upload and share their content creations. Seeking to avoid overbroad inroads into freedom of expression, EU law obliges online platforms and the creative industry to take into account human rights when coordinating their content filtering actions. Platforms must also establish complaint and redress procedures for users. The European Commission will initiate stakeholder dialogues to identify best practices. These “safety valves” in the legislative package, however, are mere fig leaves. Instead of safeguarding human rights, the EU legislator outsources human rights obligations to the platform industry. At the same time, the burden of policing content moderation systems is imposed on users who are unlikely to bring complaints in each individual case. The new legislative design in the EU will thus “conceal” human rights violations instead of bringing them to light. Nonetheless, the DSA rests on the same – highly problematic – approach. Against this background, the paper discusses the weakening – and potential loss – of fundamental freedoms as a result of the departure from the traditional notice-and-takedown approach. Adding a new element to the ongoing debate on content licensing and filtering, the analysis will devote particular attention to the fact that EU law, for the most part, has left untouched the private power of platforms to determine the “house rules” governing the most popular copyright-owner reaction to detected matches between protected works and content uploads: the (algorithmic) monetization of that content. Addressing the “legal vacuum” in the field of content monetization, the analysis explores outsourcing and concealment risks in this unregulated space. Focusing on large-scale platforms for user-generated content, such as YouTube, Instagram and TikTok, two normative problems come to the fore: (1) the fact that rightholders, when opting for monetization, de facto monetize not only their own rights but also the creative input of users; (2) the fact that user creativity remains unremunerated as long as the monetization option is only available to rightholders. As a result of this configuration, the monetization mechanism disregards users’ right to (intellectual) property and discriminates against user creativity. Against this background, we discuss whether the DSA provisions that seek to ensure transparency of content moderation actions and terms and conditions offer useful sources of information that could empower users. Moreover, we raise the question whether the detailed regulation of platform actions in the DSA may resolve the described human rights dilemmas to some extent.
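
As a purely editorial illustration of the mechanism criticised in this abstract, the sketch below shows, in deliberately simplified form, how a match-and-monetize rule can route the entire ad revenue of an upload to the rightholder, leaving the uploader's creative contribution unremunerated. All names, fields and the all-or-nothing revenue split are assumptions made for illustration; they do not describe any specific platform's system.

```python
# Hypothetical, highly simplified sketch of the revenue-routing logic the
# article criticises. All names, fields and the all-or-nothing split are
# assumptions for illustration, not a description of any real platform's
# system (e.g. YouTube's Content ID).
from dataclasses import dataclass

@dataclass
class Upload:
    uploader: str
    ad_revenue: float                  # revenue generated by this upload
    matched_rightholder: str | None    # result of automated content matching
    rightholder_policy: str | None     # e.g. "block" or "monetize"

def route_revenue(upload: Upload) -> dict[str, float]:
    """Return who receives the ad revenue of an upload."""
    if upload.matched_rightholder and upload.rightholder_policy == "monetize":
        # The rightholder monetizes the *entire* upload, including the
        # uploader's own creative contribution (remix, commentary, parody).
        return {upload.matched_rightholder: upload.ad_revenue,
                upload.uploader: 0.0}
    # No match or no monetization claim: revenue stays with the uploader.
    return {upload.uploader: upload.ad_revenue}

print(route_revenue(Upload("remix_creator", 100.0, "record_label", "monetize")))
# -> {'record_label': 100.0, 'remix_creator': 0.0}
```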

Artificial intelligence, Content moderation, Copyright, derivative works, discrimination, Freedom of expression, Human rights, liability, proportionality, user-generated content

Bibtex

Article{nokey, title = {How the EU Outsources the Task of Human Rights Protection to Platforms and Users: The Case of UGC Monetization}, author = {Senftleben, M. and Quintais, J. and Meiring, A.}, url = {https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4421150}, year = {2024}, date = {2024-01-23}, journal = {Berkeley Technology Law Journal}, volume = {38}, issue = {3}, pages = {933-1010}, abstract = {With the shift from the traditional safe harbor for hosting to statutory content filtering and licensing obligations, EU copyright law has substantially curtailed the freedom of users to upload and share their content creations. Seeking to avoid overbroad inroads into freedom of expression, EU law obliges online platforms and the creative industry to take into account human rights when coordinating their content filtering actions. Platforms must also establish complaint and redress procedures for users. The European Commission will initiate stakeholder dialogues to identify best practices. These “safety valves” in the legislative package, however, are mere fig leaves. Instead of safeguarding human rights, the EU legislator outsources human rights obligations to the platform industry. At the same time, the burden of policing content moderation systems is imposed on users who are unlikely to bring complaints in each individual case. The new legislative design in the EU will thus “conceal” human rights violations instead of bringing them to light. Nonetheless, the DSA rests on the same – highly problematic – approach. Against this background, the paper discusses the weakening – and potential loss – of fundamental freedoms as a result of the departure from the traditional notice-and-takedown approach. Adding a new element to the ongoing debate on content licensing and filtering, the analysis will devote particular attention to the fact that EU law, for the most part, has left untouched the private power of platforms to determine the “house rules” governing the most popular copyright-owner reaction to detected matches between protected works and content uploads: the (algorithmic) monetization of that content. Addressing the “legal vacuum” in the field of content monetization, the analysis explores outsourcing and concealment risks in this unregulated space. Focusing on large-scale platforms for user-generated content, such as YouTube, Instagram and TikTok, two normative problems come to the fore: (1) the fact that rightholders, when opting for monetization, de facto monetize not only their own rights but also the creative input of users; (2) the fact that user creativity remains unremunerated as long as the monetization option is only available to rightholders. As a result of this configuration, the monetization mechanism disregards users’ right to (intellectual) property and discriminates against user creativity. Against this background, we discuss whether the DSA provisions that seek to ensure transparency of content moderation actions and terms and conditions offer useful sources of information that could empower users. Moreover, we raise the question whether the detailed regulation of platform actions in the DSA may resolve the described human rights dilemmas to some extent.}, keywords = {Artificial intelligence, Content moderation, Copyright, derivative works, discrimination, Freedom of expression, Human rights, liability, proportionality, user-generated content}, }

Outsourcing Human Rights Obligations and Concealing Human Rights Deficits: The Example of Monetizing User-Generated Content Under the CDSM Directive and the Digital Services Act

Senftleben, M., Quintais, J. & Meiring, A.

Abstract

With the shift from the traditional safe harbor for hosting to statutory content filtering and licensing obligations, EU copyright law has substantially curtailed the freedom of users to upload and share their content creations. Seeking to avoid overbroad inroads into freedom of expression, EU law obliges online platforms and the creative industry to take into account human rights when coordinating their content filtering actions. Platforms must also establish complaint and redress procedures for users. The European Commission will initiate stakeholder dialogues to identify best practices. These “safety valves” in the legislative package, however, are mere fig leaves. Instead of safeguarding human rights, the EU legislator outsources human rights obligations to the platform industry. At the same time, the burden of policing content moderation systems is imposed on users who are unlikely to bring complaints in each individual case. The new legislative design in the EU will thus “conceal” human rights violations instead of bringing them to light. Nonetheless, the DSA rests on the same – highly problematic – approach. Against this background, the paper discusses the weakening – and potential loss – of fundamental freedoms as a result of the departure from the traditional notice-and-takedown approach. Adding a new element to the ongoing debate on content licensing and filtering, the analysis will devote particular attention to the fact that EU law, for the most part, has left untouched the private power of platforms to determine the “house rules” governing the most popular copyright-owner reaction to detected matches between protected works and content uploads: the (algorithmic) monetization of that content. Addressing the “legal vacuum” in the field of content monetization, the analysis explores outsourcing and concealment risks in this unregulated space. Focusing on large-scale platforms for user-generated content, such as YouTube, Instagram and TikTok, two normative problems come to the fore: (1) the fact that rightholders, when opting for monetization, de facto monetize not only their own rights but also the creative input of users; (2) the fact that user creativity remains unremunerated as long as the monetization option is only available to rightholders. As a result of this configuration, the monetization mechanism disregards users’ right to (intellectual) property and discriminates against user creativity. Against this background, we discuss whether the DSA provisions that seek to ensure transparency of content moderation actions and terms and conditions offer useful sources of information that could empower users. Moreover, we raise the question whether the detailed regulation of platform actions in the DSA may resolve the described human rights dilemmas to some extent.

Artificial intelligence, Content moderation, Copyright, derivative works, discrimination, Freedom of expression, Human rights, liability, user-generated content

Bibtex

Online publication{nokey, title = {Outsourcing Human Rights Obligations and Concealing Human Rights Deficits: The Example of Monetizing User-Generated Content Under the CDSM Directive and the Digital Services Act}, author = {Senftleben, M. and Quintais, J. and Meiring, A.}, url = {https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4421150}, abstract = {With the shift from the traditional safe harbor for hosting to statutory content filtering and licensing obligations, EU copyright law has substantially curtailed the freedom of users to upload and share their content creations. Seeking to avoid overbroad inroads into freedom of expression, EU law obliges online platforms and the creative industry to take into account human rights when coordinating their content filtering actions. Platforms must also establish complaint and redress procedures for users. The European Commission will initiate stakeholder dialogues to identify best practices. These “safety valves” in the legislative package, however, are mere fig leaves. Instead of safeguarding human rights, the EU legislator outsources human rights obligations to the platform industry. At the same time, the burden of policing content moderation systems is imposed on users who are unlikely to bring complaints in each individual case. The new legislative design in the EU will thus “conceal” human rights violations instead of bringing them to light. Nonetheless, the DSA rests on the same – highly problematic – approach. Against this background, the paper discusses the weakening – and potential loss – of fundamental freedoms as a result of the departure from the traditional notice-and-takedown approach. Adding a new element to the ongoing debate on content licensing and filtering, the analysis will devote particular attention to the fact that EU law, for the most part, has left untouched the private power of platforms to determine the “house rules” governing the most popular copyright-owner reaction to detected matches between protected works and content uploads: the (algorithmic) monetization of that content. Addressing the “legal vacuum” in the field of content monetization, the analysis explores outsourcing and concealment risks in this unregulated space. Focusing on large-scale platforms for user-generated content, such as YouTube, Instagram and TikTok, two normative problems come to the fore: (1) the fact that rightholders, when opting for monetization, de facto monetize not only their own rights but also the creative input of users; (2) the fact that user creativity remains unremunerated as long as the monetization option is only available to rightholders. As a result of this configuration, the monetization mechanism disregards users’ right to (intellectual) property and discriminates against user creativity. Against this background, we discuss whether the DSA provisions that seek to ensure transparency of content moderation actions and terms and conditions offer useful sources of information that could empower users. Moreover, we raise the question whether the detailed regulation of platform actions in the DSA may resolve the described human rights dilemmas to some extent.}, keywords = {Artificial intelligence, Content moderation, Copyright, derivative works, discrimination, Freedom of expression, Human rights, liability, user-generated content}, }

The Odyssey of the Prohibition on General Monitoring Obligations on the Way to the Digital Services Act: Between Article 15 of the E-Commerce Directive and Article 17 of the Directive on Copyright in the Digital Single Market

Senftleben, M. & Angelopoulos, C.
2020

Abstract

EU law provides explicitly that intermediaries may not be obliged to monitor their service in a general manner in order to detect and prevent the illegal activity of their users. However, a misunderstanding of the difference between monitoring specific content and monitoring FOR specific content is a recurrent theme in the debate on intermediary liability and a central driver of the controversy surrounding it. Rightly understood, a prohibited general monitoring obligation arises whenever content – no matter how specifically it is defined – must be identified among the totality of the content on a platform. The moment platform content must be screened in its entirety, the monitoring obligation acquires an excessive, general nature. Against this background, a content moderation duty can only be deemed permissible if it is specific in respect of both the protected subject matter and potential infringers. This requirement of 'double specificity' is of particular importance because it prevents encroachments upon fundamental rights. The jurisprudence of the Court of Justice of the European Union has shed light on the anchorage of the general monitoring ban in primary EU law, in particular the right to the protection of personal data, the freedom of expression and information, the freedom to conduct a business, and the free movement of goods and services in the internal market. Due to their higher rank in the norm hierarchy, these legal guarantees constitute common ground for the application of the general monitoring prohibition in secondary EU legislation, namely Article 15(1) of the E-Commerce Directive ('ECD') and Article 17(8) of the Directive on Copyright in the Digital Single Market ('CDSMD'). With regard to the Digital Services Act (‘DSA’), this result of the analysis implies that any further manifestation of the general monitoring ban in the DSA would have to be construed and applied – in the light of applicable CJEU case law – as a safeguard against encroachments upon the aforementioned fundamental rights and freedoms. If the final text of the DSA does not contain a reiteration of the prohibition of general monitoring obligations known from Article 15(1) ECD and Article 17(8) CDSMD, the regulation of internet service provider liability, duties of care and injunctions would still have to avoid inroads into the aforementioned fundamental rights and freedoms and observe the principle of proportionality. The double specificity requirement plays a central role in this respect.
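
The 'double specificity' point can be made concrete with a small, purely illustrative sketch (not taken from the report): an obligation is general whenever the totality of platform content must be screened, even for one precisely defined work, and only becomes specific when both the protected subject matter and the potential infringer are identified in advance. The data model below is an assumption made for illustration.

```python
# Purely illustrative sketch of the 'double specificity' distinction; the
# data model is an assumption, not taken from the paper or from case law.
def general_monitoring(all_uploads, protected_work):
    # Screens the totality of platform content, even for one specific work:
    # on the authors' reading, a prohibited general monitoring obligation.
    return [u for u in all_uploads if protected_work in u["content"]]

def double_specific_check(all_uploads, protected_work, identified_infringer):
    # Limited to a specifically identified work AND a specifically
    # identified potential infringer: a permissible, specific duty.
    return [u for u in all_uploads
            if u["uploader"] == identified_infringer
            and protected_work in u["content"]]

uploads = [{"uploader": "alice", "content": "holiday video"},
           {"uploader": "bob", "content": "full copy of song_x"}]
print(general_monitoring(uploads, "song_x"))            # scans everyone
print(double_specific_check(uploads, "song_x", "bob"))  # scans only bob's uploads
```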

algorithmic enforcement, censorship, Content moderation, Copyright, defamation, Digital services act, filtering, Freedom of expression, general monitoring, hosting service, injunctive relief, intermediary liability, notice and stay down, notice and take down, safe harbour, trade mark, user-generated content

Bibtex

Report{Senftleben2020e, title = {The Odyssey of the Prohibition on General Monitoring Obligations on the Way to the Digital Services Act: Between Article 15 of the E-Commerce Directive and Article 17 of the Directive on Copyright in the Digital Single Market}, author = {Senftleben, M. and Angelopoulos, C.}, url = {https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3717022}, year = {2020}, date = {2020-10-29}, abstract = {EU law provides explicitly that intermediaries may not be obliged to monitor their service in a general manner in order to detect and prevent the illegal activity of their users. However, a misunderstanding of the difference between monitoring specific content and monitoring FOR specific content is a recurrent theme in the debate on intermediary liability and a central driver of the controversy surrounding it. Rightly understood, a prohibited general monitoring obligation arises whenever content – no matter how specifically it is defined – must be identified among the totality of the content on a platform. The moment platform content must be screened in its entirety, the monitoring obligation acquires an excessive, general nature. Against this background, a content moderation duty can only be deemed permissible if it is specific in respect of both the protected subject matter and potential infringers. This requirement of 'double specificity' is of particular importance because it prevents encroachments upon fundamental rights. The jurisprudence of the Court of Justice of the European Union has shed light on the anchorage of the general monitoring ban in primary EU law, in particular the right to the protection of personal data, the freedom of expression and information, the freedom to conduct a business, and the free movement of goods and services in the internal market. Due to their higher rank in the norm hierarchy, these legal guarantees constitute common ground for the application of the general monitoring prohibition in secondary EU legislation, namely Article 15(1) of the E-Commerce Directive ('ECD') and Article 17(8) of the Directive on Copyright in the Digital Single Market ('CDSMD'). With regard to the Digital Services Act (‘DSA’), this result of the analysis implies that any further manifestation of the general monitoring ban in the DSA would have to be construed and applied – in the light of applicable CJEU case law – as a safeguard against encroachments upon the aforementioned fundamental rights and freedoms. If the final text of the DSA does not contain a reiteration of the prohibition of general monitoring obligations known from Article 15(1) ECD and Article 17(8) CDSMD, the regulation of internet service provider liability, duties of care and injunctions would still have to avoid inroads into the aforementioned fundamental rights and freedoms and observe the principle of proportionality. The double specificity requirement plays a central role in this respect.}, keywords = {algorithmic enforcement, censorship, Content moderation, Copyright, defamation, Digital services act, filtering, Freedom of expression, general monitoring, hosting service, injunctive relief, intermediary liability, notice and stay down, notice and take down, safe harbour, trade mark, user-generated content}, }

Selected Aspects of Implementing Article 17 of the Directive on Copyright in the Digital Single Market into National Law – Comment of the European Copyright Society

Metzger, A., Senftleben, M., Derclaye, E., Dreier, T., Geiger, C., Griffiths, J., Hilty, R., Hugenholtz, P., Riis, T., Rognstad, O.A., Strowel, A.M., Synodinou, T. & Xalabarder, R.
2020

Abstract

The national implementation of Article 17 of the Directive on Copyright in the Digital Single Market (DSMD) poses particular challenges. Article 17 is one of the most complex – and most controversial – provisions of the new legislative package which EU Member States must transpose into national law by 7 June 2021. Seeking to contribute to the debate on implementation options, the European Copyright Society addresses several core aspects of Article 17 that may play an important role in the national implementation process. It deals with the concept of online content-sharing service providers (OCSSPs) before embarking on a discussion of the licensing and content moderation duties which OCSSPs must fulfil in accordance with Article 17(1) and (4). The analysis also focuses on the copyright limitations mentioned in Article 17(7) that support the creation and dissemination of transformative user-generated content (UGC). It also discusses the appropriate configuration of complaint and redress mechanisms set forth in Article 17(9) that seek to reduce the risk of unjustified content removals. Finally, the European Copyright Society addresses the possibility of implementing direct remuneration claims for authors and performers, and explores the private international law aspect of applicable law – an impact factor that is often overlooked in the debate.
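
As a purely illustrative aside, the complaint and redress mechanism of Article 17(9) discussed in the comment can be pictured as a simple status flow: a blocked upload, a user complaint that must be processed without undue delay, and a decision subject to human review. The statuses, fields and queue below are assumptions made for illustration, not a proposed implementation.

```python
# Minimal sketch of a complaint-and-redress flow of the kind required by
# Article 17(9) CDSM Directive; statuses, fields and the review step are
# assumptions for illustration, not a proposed implementation.
review_queue: list[dict] = []

def block_upload(upload_id: str) -> dict:
    return {"id": upload_id, "status": "blocked", "complaint": None}

def file_complaint(decision: dict, reason: str) -> None:
    decision["complaint"] = reason
    review_queue.append(decision)   # to be processed without undue delay

def human_review(decision: dict, blocking_was_justified: bool) -> None:
    # Art. 17(9): decisions on user complaints are subject to human review.
    decision["status"] = "blocked" if blocking_was_justified else "reinstated"

d = block_upload("upload-42")
file_complaint(d, "quotation/parody covered by Art. 17(7)")
human_review(d, blocking_was_justified=False)
print(d["status"])  # -> reinstated
```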

algorithmic enforcement, applicable law, collective copyright management, content hosting, Content moderation, copyright contract law, EU copyright law, filtering mechanisms, Freedom of expression, Licensing, notice-and-takedown, private international law, transformative use, user-generated content

Bibtex

Article{Metzger2020, title = {Selected Aspects of Implementing Article 17 of the Directive on Copyright in the Digital Single Market into National Law – Comment of the European Copyright Society}, author = {Metzger, A. and Senftleben, M. and Derclaye, E. and Dreier, T. and Geiger, C. and Griffiths, J. and Hilty, R. and Hugenholtz, P. and Riis, T. and Rognstad, O.A. and Strowel, A.M. and Synodinou, T. and Xalabarder, R.}, url = {https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3589323}, year = {2020}, date = {2020-05-07}, abstract = {The national implementation of Article 17 of the Directive on Copyright in the Digital Single Market (DSMD) poses particular challenges. Article 17 is one of the most complex – and most controversial – provisions of the new legislative package which EU Member States must transpose into national law by 7 June 2021. Seeking to contribute to the debate on implementation options, the European Copyright Society addresses several core aspects of Article 17 that may play an important role in the national implementation process. It deals with the concept of online content-sharing service providers (OCSSPs) before embarking on a discussion of the licensing and content moderation duties which OCSSPs must fulfil in accordance with Article 17(1) and (4). The analysis also focuses on the copyright limitations mentioned in Article 17(7) that support the creation and dissemination of transformative user-generated content (UGC). It also discusses the appropriate configuration of complaint and redress mechanisms set forth in Article 17(9) that seek to reduce the risk of unjustified content removals. Finally, the European Copyright Society addresses the possibility of implementing direct remuneration claims for authors and performers, and explores the private international law aspect of applicable law – an impact factor that is often overlooked in the debate.}, keywords = {algorithmic enforcement, applicable law, collective copyright management, content hosting, Content moderation, copyright contract law, EU copyright law, filtering mechanisms, Freedom of expression, Licensing, notice-and-takedown, private international law, transformative use, user-generated content}, }

Institutionalized Algorithmic Enforcement – The Pros and Cons of the EU Approach to UGC Platform Liability

Senftleben, M.
Florida International University Law Review, vol. 14, iss. 2, pp. 299-328, 2020

Abstract

Algorithmic copyright enforcement – the use of automated filtering tools to detect infringing content before it appears on the internet – has a deep impact on the freedom of users to upload and share information. Instead of presuming that user-generated content ("UGC") does not amount to infringement unless copyright owners take action and provide proof, the default position of automated filtering systems is that every upload is suspicious and that copyright owners are entitled to ex ante control over the sharing of information online. If platform providers voluntarily introduce algorithmic enforcement measures, this may be seen as a private decision following from the freedom of companies to run their business as they wish. If, however, copyright legislation institutionalizes algorithmic enforcement and imposes a legal obligation on platform providers to employ automated filtering tools, the law itself transforms copyright into a censorship and filtering instrument. Nonetheless, the new EU Directive on Copyright in the Digital Single Market (“DSM Directive”) follows this path and requires the employment of automated filtering tools to ensure that unauthorized protected content does not populate UGC platforms. The new EU rules on UGC licensing and screening will inevitably lead to the adoption of algorithmic enforcement measures in practice. Without automated content control, UGC platforms will be unable to escape liability for infringing user uploads. To provide a complete picture, however, it is important to also shed light on counterbalances which may distinguish this new, institutionalized form of algorithmic enforcement from known content filtering tools that have evolved as voluntary measures in the private sector. The DSM Directive underlines the necessity to safeguard user freedoms that support transformative, creative remixes and mash-ups of pre-existing content. This feature of the new legislation may offer important incentives to develop algorithmic tools that go beyond the mere identification of unauthorized takings from protected works. It has the potential to encourage content assessment mechanisms that factor the degree of transformative effort and user creativity into the equation. As a result, more balanced content filtering tools may emerge in the EU. Against this background, the analysis shows that the new EU legislation not only escalates the use of algorithmic enforcement measures that already commenced in the private sector years ago. If rightly implemented, it may also add an important nuance to existing content identification tools and alleviate the problems arising from reliance on automated filtering mechanisms.
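
The 'more balanced content filtering tools' envisaged in this abstract can be pictured with a minimal sketch: instead of blocking on any detected match, the decision also weighs an assumed transformativeness score reflecting the degree of creative user input. Both metrics and the threshold below are hypothetical and serve only to illustrate the idea.

```python
# A sketch, under assumed metrics and thresholds, of a 'more balanced'
# filter that weighs transformative user input instead of blocking on any
# detected match; nothing here reflects an existing content ID system.
def moderate(match_ratio: float, transformative_score: float) -> str:
    """match_ratio: share of the upload matching a protected work (0-1).
    transformative_score: estimated degree of creative user input (0-1).
    Both metrics and the 0.5 threshold are hypothetical."""
    if match_ratio == 0.0:
        return "allow"                               # no protected content detected
    if transformative_score >= 0.5:
        return "allow, flag for human review"        # likely remix, parody, quotation
    return "apply rightholder policy (block/monetize)"

print(moderate(match_ratio=0.8, transformative_score=0.7))  # creative remix stays up
print(moderate(match_ratio=0.9, transformative_score=0.1))  # near-verbatim copy
```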

censorship, copyright, EU, freedom of expression, liability, Platforms, user-generated content

Bibtex

Article{Senftleben2020, title = {Institutionalized Algorithmic Enforcement – The Pros and Cons of the EU Approach to UGC Platform Liability}, author = {Senftleben, M.}, url = {https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3565175 https://ecollections.law.fiu.edu/lawreview/vol14/iss2/11/}, doi = {https://doi.org/10.25148/lawrev.14.2.11}, year = {2020}, date = {2020-10-20}, journal = {Florida International University Law Review}, volume = {14}, number = {2}, pages = {299-328}, abstract = {Algorithmic copyright enforcement – the use of automated filtering tools to detect infringing content before it appears on the internet – has a deep impact on the freedom of users to upload and share information. Instead of presuming that user-generated content ("UGC") does not amount to infringement unless copyright owners take action and provide proof, the default position of automated filtering systems is that every upload is suspicious and that copyright owners are entitled to ex ante control over the sharing of information online. If platform providers voluntarily introduce algorithmic enforcement measures, this may be seen as a private decision following from the freedom of companies to run their business as they wish. If, however, copyright legislation institutionalizes algorithmic enforcement and imposes a legal obligation on platform providers to employ automated filtering tools, the law itself transforms copyright into a censorship and filtering instrument. Nonetheless, the new EU Directive on Copyright in the Digital Single Market (“DSM Directive”) follows this path and requires the employment of automated filtering tools to ensure that unauthorized protected content does not populate UGC platforms. The new EU rules on UGC licensing and screening will inevitably lead to the adoption of algorithmic enforcement measures in practice. Without automated content control, UGC platforms will be unable to escape liability for infringing user uploads. To provide a complete picture, however, it is important to also shed light on counterbalances which may distinguish this new, institutionalized form of algorithmic enforcement from known content filtering tools that have evolved as voluntary measures in the private sector. The DSM Directive underlines the necessity to safeguard user freedoms that support transformative, creative remixes and mash-ups of pre-existing content. This feature of the new legislation may offer important incentives to develop algorithmic tools that go beyond the mere identification of unauthorized takings from protected works. It has the potential to encourage content assessment mechanisms that factor the degree of transformative effort and user creativity into the equation. As a result, more balanced content filtering tools may emerge in the EU. Against this background, the analysis shows that the new EU legislation not only escalates the use of algorithmic enforcement measures that already commenced in the private sector years ago. If rightly implemented, it may also add an important nuance to existing content identification tools and alleviate the problems arising from reliance on automated filtering mechanisms.}, keywords = {censorship, copyright, EU, freedom of expression, liability, Platforms, user-generated content}, }