The right to trust your vote: Cybersecurity, human rights and electronic voting

van Daalen, O. & Hoekstra, N.
2024

cybersecurity, Electronic voting, Human rights

Bibtex

@report{nokey,
  title    = {The right to trust your vote: Cybersecurity, human rights and electronic voting},
  author   = {van Daalen, O. and Hoekstra, N.},
  url      = {https://www.ivir.nl/nl/publications/the-right-to-trust-your-vote-cybersecurity-human-rights-and-electronic-voting/vandaalenhoekstra2024a/},
  year     = {2024},
  date     = {2024-12-05},
  keywords = {cybersecurity, Electronic voting, Human rights},
}

From Encryption to Quantum Computing – The Governance of Information Security and Human Rights

van Daalen, O.
T.M.C. Asser Press, 2024, Series: Information Technology and Law (IT&Law) Series, vol. 38, ISBN: 978-94-6265-634-5

encryption, Human rights, Information security

Bibtex

@book{nokey,
  title     = {From Encryption to Quantum Computing – The Governance of Information Security and Human Rights},
  author    = {van Daalen, O.},
  url       = {https://link.springer.com/book/10.1007/978-94-6265-635-2},
  year      = {2024},
  date      = {2024-09-10},
  publisher = {T.M.C. Asser Press},
  series    = {Information Technology and Law (IT\&Law) Series},
  volume    = {38},
  isbn      = {978-94-6265-634-5},
  keywords  = {encryption, Human rights, Information security},
}

Copyright, Upcycling, and the Human Right to Environmental Protection

Izyumenko, E.
Kluwer Copyright Blog, 2024

Copyright, Human rights

Bibtex

@online{nokey,
  title    = {Copyright, Upcycling, and the Human Right to Environmental Protection},
  author   = {Izyumenko, E.},
  url      = {https://copyrightblog.kluweriplaw.com/2024/05/30/copyright-upcycling-and-the-human-right-to-environmental-protection/},
  year     = {2024},
  date     = {2024-05-30},
  journal  = {Kluwer Copyright Blog},
  keywords = {Copyright, Human rights},
}

Annotatie bij EHRM 9 maart 2023 (LB / Hongarije) [Case note on ECtHR 9 March 2023, L.B. v. Hungary]

Dommering, E.
Nederlandse Jurisprudentie, iss. 15, no. 144, pp. 3352-3354, 2024

Abstract

Publication of personal data on account of a tax debt. Protection of personal data. Importance of review in individual cases. Margin of appreciation. Violation of Article 8 ECHR. Grand Chamber.

Human rights, Privacy

Bibtex

@article{nokey,
  title    = {Annotatie bij EHRM 9 maart 2023 (LB / Hongarije)},
  author   = {Dommering, E.},
  url      = {https://www.ivir.nl/nl/publications/annotatie-bij-ehrm-9-maart-2023-lb-hongarije/annotatie_nj_2024_144/},
  year     = {2024},
  date     = {2024-05-28},
  journal  = {Nederlandse Jurisprudentie},
  issue    = {15},
  number   = {144},
  pages    = {3352--3354},
  note     = {Case note},
  abstract = {Publication of personal data on account of a tax debt. Protection of personal data. Importance of review in individual cases. Margin of appreciation. Violation of Article 8 ECHR. Grand Chamber.},
  keywords = {Human rights, Privacy},
}

How the EU Outsources the Task of Human Rights Protection to Platforms and Users: The Case of UGC Monetization

Senftleben, M., Quintais, J. & Meiring, A.
Berkeley Technology Law Journal, vol. 38, iss. 3, pp. 933-1010, 2024

Abstract

With the shift from the traditional safe harbor for hosting to statutory content filtering and licensing obligations, EU copyright law has substantially curtailed the freedom of users to upload and share their content creations. Seeking to avoid overbroad inroads into freedom of expression, EU law obliges online platforms and the creative industry to take into account human rights when coordinating their content filtering actions. Platforms must also establish complaint and redress procedures for users. The European Commission will initiate stakeholder dialogues to identify best practices. These “safety valves” in the legislative package, however, are mere fig leaves. Instead of safeguarding human rights, the EU legislator outsources human rights obligations to the platform industry. At the same time, the burden of policing content moderation systems is imposed on users who are unlikely to bring complaints in each individual case. The new legislative design in the EU will thus “conceal” human rights violations instead of bringing them to light. Nonetheless, the DSA rests on the same – highly problematic – approach. Against this background, the paper discusses the weakening – and potential loss – of fundamental freedoms as a result of the departure from the traditional notice-and-takedown approach. Adding a new element to the ongoing debate on content licensing and filtering, the analysis will devote particular attention to the fact that EU law, for the most part, has left untouched the private power of platforms to determine the “house rules” governing the most popular copyright-owner reaction to detected matches between protected works and content uploads: the (algorithmic) monetization of that content. Addressing the “legal vacuum” in the field of content monetization, the analysis explores outsourcing and concealment risks in this unregulated space. Focusing on large-scale platforms for user-generated content, such as YouTube, Instagram and TikTok, two normative problems come to the fore: (1) the fact that rightholders, when opting for monetization, de facto monetize not only their own rights but also the creative input of users; (2) the fact that user creativity remains unremunerated as long as the monetization option is only available to rightholders. As a result of this configuration, the monetization mechanism disregards users’ right to (intellectual) property and discriminates against user creativity. Against this background, we discuss whether the DSA provisions that seek to ensure transparency of content moderation actions and terms and conditions offer useful sources of information that could empower users. Moreover, we raise the question whether the detailed regulation of platform actions in the DSA may resolve the described human rights dilemmas to some extent.

Artificial intelligence, Content moderation, Copyright, derivative works, discrimination, Freedom of expression, Human rights, liability, proportionality, user-generated content

Bibtex

@article{nokey,
  title    = {How the EU Outsources the Task of Human Rights Protection to Platforms and Users: The Case of UGC Monetization},
  author   = {Senftleben, M. and Quintais, J. and Meiring, A.},
  url      = {https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4421150},
  year     = {2024},
  date     = {2024-01-23},
  journal  = {Berkeley Technology Law Journal},
  volume   = {38},
  issue    = {3},
  pages    = {933--1010},
  abstract = {With the shift from the traditional safe harbor for hosting to statutory content filtering and licensing obligations, EU copyright law has substantially curtailed the freedom of users to upload and share their content creations. Seeking to avoid overbroad inroads into freedom of expression, EU law obliges online platforms and the creative industry to take into account human rights when coordinating their content filtering actions. Platforms must also establish complaint and redress procedures for users. The European Commission will initiate stakeholder dialogues to identify best practices. These “safety valves” in the legislative package, however, are mere fig leaves. Instead of safeguarding human rights, the EU legislator outsources human rights obligations to the platform industry. At the same time, the burden of policing content moderation systems is imposed on users who are unlikely to bring complaints in each individual case. The new legislative design in the EU will thus “conceal” human rights violations instead of bringing them to light. Nonetheless, the DSA rests on the same – highly problematic – approach. Against this background, the paper discusses the weakening – and potential loss – of fundamental freedoms as a result of the departure from the traditional notice-and-takedown approach. Adding a new element to the ongoing debate on content licensing and filtering, the analysis will devote particular attention to the fact that EU law, for the most part, has left untouched the private power of platforms to determine the “house rules” governing the most popular copyright-owner reaction to detected matches between protected works and content uploads: the (algorithmic) monetization of that content. Addressing the “legal vacuum” in the field of content monetization, the analysis explores outsourcing and concealment risks in this unregulated space. Focusing on large-scale platforms for user-generated content, such as YouTube, Instagram and TikTok, two normative problems come to the fore: (1) the fact that rightholders, when opting for monetization, de facto monetize not only their own rights but also the creative input of users; (2) the fact that user creativity remains unremunerated as long as the monetization option is only available to rightholders. As a result of this configuration, the monetization mechanism disregards users’ right to (intellectual) property and discriminates against user creativity. Against this background, we discuss whether the DSA provisions that seek to ensure transparency of content moderation actions and terms and conditions offer useful sources of information that could empower users. Moreover, we raise the question whether the detailed regulation of platform actions in the DSA may resolve the described human rights dilemmas to some extent.},
  keywords = {Artificial intelligence, Content moderation, Copyright, derivative works, discrimination, Freedom of expression, Human rights, liability, proportionality, user-generated content},
}

The right to encryption: Privacy as preventing unlawful access

van Daalen, O.
Computer Law & Security Review, vol. 49, 2023

Abstract

Encryption technologies are a fundamental building block of modern digital infrastructure, but plans to curb these technologies continue to spring up. Even in the European Union, where their application is by now firmly embedded in legislation, lawmakers are again calling for measures which would impact these technologies. One of the most important arguments in this debate is human rights, most notably the rights to privacy and to freedom of expression. And although some authors have in the past explored how encryption technologies support human rights, this connection is not yet firmly grounded in an analysis of European human rights case law. This contribution aims to fill this gap, developing a framework for assessing restrictions of encryption technologies under the rights to privacy and freedom of expression as protected under the European Convention on Human Rights (the Convention) and the Charter of Fundamental Rights of the European Union (the Charter). In the first section, the relevant function of encryption technologies, restricting access to information (called confidentiality), is discussed. In the second section, an overview of some governmental policies and practices impacting these technologies is provided. This continues with a discussion of the case law on the rights to privacy, data protection and freedom of expression, arguing that these rights are not only about ensuring lawful access by governments to protected information, but also about preventing unlawful access by others. And because encryption technologies are an important means of reducing the risk of this unlawful access, it is then proposed that this risk is central to the assessment of governance measures in the field of encryption technologies. The article concludes by recommending that states perform an in-depth assessment of this risk when proposing new measures, and that courts, when reviewing them, also place the risk of unlawful access at the center of the analysis of interference and proportionality.

communications confidentiality, encryption, Freedom of expression, Human rights, Privacy, unlawful access

Bibtex

@article{nokey,
  title    = {The right to encryption: Privacy as preventing unlawful access},
  author   = {van Daalen, O.},
  url      = {https://www.sciencedirect.com/science/article/pii/S0267364923000146},
  doi      = {10.1016/j.clsr.2023.105804},
  year     = {2023},
  date     = {2023-05-23},
  journal  = {Computer Law \& Security Review},
  volume   = {49},
  abstract = {Encryption technologies are a fundamental building block of modern digital infrastructure, but plans to curb these technologies continue to spring up. Even in the European Union, where their application is by now firmly embedded in legislation, lawmakers are again calling for measures which would impact these technologies. One of the most important arguments in this debate is human rights, most notably the rights to privacy and to freedom of expression. And although some authors have in the past explored how encryption technologies support human rights, this connection is not yet firmly grounded in an analysis of European human rights case law. This contribution aims to fill this gap, developing a framework for assessing restrictions of encryption technologies under the rights to privacy and freedom of expression as protected under the European Convention on Human Rights (the Convention) and the Charter of Fundamental Rights of the European Union (the Charter). In the first section, the relevant function of encryption technologies, restricting access to information (called confidentiality), is discussed. In the second section, an overview of some governmental policies and practices impacting these technologies is provided. This continues with a discussion of the case law on the rights to privacy, data protection and freedom of expression, arguing that these rights are not only about ensuring lawful access by governments to protected information, but also about preventing unlawful access by others. And because encryption technologies are an important means of reducing the risk of this unlawful access, it is then proposed that this risk is central to the assessment of governance measures in the field of encryption technologies. The article concludes by recommending that states perform an in-depth assessment of this risk when proposing new measures, and that courts, when reviewing them, also place the risk of unlawful access at the center of the analysis of interference and proportionality.},
  keywords = {communications confidentiality, encryption, Freedom of expression, Human rights, Privacy, unlawful access},
}
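
As a purely illustrative aside (not part of the article), the confidentiality function described in the abstract above can be made concrete in a few lines of Python using the widely available cryptography package: data encrypted under a key is readable only by a holder of that key, which is the property the article's human rights analysis builds on.

# Illustrative sketch: symmetric encryption as confidentiality.
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()        # secret key held by the rightful party
cipher = Fernet(key)
token = cipher.encrypt(b"confidential message")  # ciphertext is opaque

# The key holder can recover the plaintext ...
assert cipher.decrypt(token) == b"confidential message"

# ... while decryption without the correct key fails,
# i.e. encryption prevents unlawful access by others.
try:
    Fernet(Fernet.generate_key()).decrypt(token)
except InvalidToken:
    print("access denied: wrong key")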

Outsourcing Human Rights Obligations and Concealing Human Rights Deficits: The Example of Monetizing User-Generated Content Under the CDSM Directive and the Digital Services Act

Senftleben, M., Quintais, J. & Meiring, A.

Abstract

With the shift from the traditional safe harbor for hosting to statutory content filtering and licensing obligations, EU copyright law has substantially curtailed the freedom of users to upload and share their content creations. Seeking to avoid overbroad inroads into freedom of expression, EU law obliges online platforms and the creative industry to take into account human rights when coordinating their content filtering actions. Platforms must also establish complaint and redress procedures for users. The European Commission will initiate stakeholder dialogues to identify best practices. These “safety valves” in the legislative package, however, are mere fig leaves. Instead of safeguarding human rights, the EU legislator outsources human rights obligations to the platform industry. At the same time, the burden of policing content moderation systems is imposed on users who are unlikely to bring complaints in each individual case. The new legislative design in the EU will thus “conceal” human rights violations instead of bringing them to light. Nonetheless, the DSA rests on the same – highly problematic – approach. Against this background, the paper discusses the weakening – and potential loss – of fundamental freedoms as a result of the departure from the traditional notice-and-takedown approach. Adding a new element to the ongoing debate on content licensing and filtering, the analysis will devote particular attention to the fact that EU law, for the most part, has left untouched the private power of platforms to determine the “house rules” governing the most popular copyright-owner reaction to detected matches between protected works and content uploads: the (algorithmic) monetization of that content. Addressing the “legal vacuum” in the field of content monetization, the analysis explores outsourcing and concealment risks in this unregulated space. Focusing on large-scale platforms for user-generated content, such as YouTube, Instagram and TikTok, two normative problems come to the fore: (1) the fact that rightholders, when opting for monetization, de facto monetize not only their own rights but also the creative input of users; (2) the fact that user creativity remains unremunerated as long as the monetization option is only available to rightholders. As a result of this configuration, the monetization mechanism disregards users’ right to (intellectual) property and discriminates against user creativity. Against this background, we discuss whether the DSA provisions that seek to ensure transparency of content moderation actions and terms and conditions offer useful sources of information that could empower users. Moreover, we raise the question whether the detailed regulation of platform actions in the DSA may resolve the described human rights dilemmas to some extent.

Artificial intelligence, Content moderation, Copyright, derivative works, discrimination, Freedom of expression, Human rights, liability, user-generated content

Bibtex

@online{nokey,
  title    = {Outsourcing Human Rights Obligations and Concealing Human Rights Deficits: The Example of Monetizing User-Generated Content Under the CDSM Directive and the Digital Services Act},
  author   = {Senftleben, M. and Quintais, J. and Meiring, A.},
  url      = {https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4421150},
  abstract = {With the shift from the traditional safe harbor for hosting to statutory content filtering and licensing obligations, EU copyright law has substantially curtailed the freedom of users to upload and share their content creations. Seeking to avoid overbroad inroads into freedom of expression, EU law obliges online platforms and the creative industry to take into account human rights when coordinating their content filtering actions. Platforms must also establish complaint and redress procedures for users. The European Commission will initiate stakeholder dialogues to identify best practices. These “safety valves” in the legislative package, however, are mere fig leaves. Instead of safeguarding human rights, the EU legislator outsources human rights obligations to the platform industry. At the same time, the burden of policing content moderation systems is imposed on users who are unlikely to bring complaints in each individual case. The new legislative design in the EU will thus “conceal” human rights violations instead of bringing them to light. Nonetheless, the DSA rests on the same – highly problematic – approach. Against this background, the paper discusses the weakening – and potential loss – of fundamental freedoms as a result of the departure from the traditional notice-and-takedown approach. Adding a new element to the ongoing debate on content licensing and filtering, the analysis will devote particular attention to the fact that EU law, for the most part, has left untouched the private power of platforms to determine the “house rules” governing the most popular copyright-owner reaction to detected matches between protected works and content uploads: the (algorithmic) monetization of that content. Addressing the “legal vacuum” in the field of content monetization, the analysis explores outsourcing and concealment risks in this unregulated space. Focusing on large-scale platforms for user-generated content, such as YouTube, Instagram and TikTok, two normative problems come to the fore: (1) the fact that rightholders, when opting for monetization, de facto monetize not only their own rights but also the creative input of users; (2) the fact that user creativity remains unremunerated as long as the monetization option is only available to rightholders. As a result of this configuration, the monetization mechanism disregards users’ right to (intellectual) property and discriminates against user creativity. Against this background, we discuss whether the DSA provisions that seek to ensure transparency of content moderation actions and terms and conditions offer useful sources of information that could empower users. Moreover, we raise the question whether the detailed regulation of platform actions in the DSA may resolve the described human rights dilemmas to some extent.},
  keywords = {Artificial intelligence, Content moderation, Copyright, derivative works, discrimination, Freedom of expression, Human rights, liability, user-generated content},
}

Export control of cybersurveillance items in the new dual-use regulation: The challenges of applying human rights logic to export control

van Daalen, O., van Hoboken, J. & Rucz, M.
Computer Law & Security Review, vol. 48, 2023

Abstract

In 2021, the Recast Dual-Use Regulation entered into force. The regulation includes a heavily debated new provision on the export control of so-called cybersurveillance items. This provision departs from the traditional logic of export control rules in multiple ways. Most importantly, it positions human rights considerations as an important factor in the export control of a flexible range of technologies. This article explores the operation, implications and challenges of this new human rights-oriented approach to export control of digital surveillance technologies. Taking the definition of cybersurveillance items as a starting point of the analysis, the article draws on surveillance-related case law of the European Court of Human Rights and the Court of Justice of the European Union to define the potential scope of application of the open-ended cybersurveillance concept of the Regulation. By exploring how this concept maps to technologies often connected with human rights infringements, such as facial recognition, location tracking and open-source intelligence, the article highlights the challenges of applying this new approach and underscores the need for its further development in practice.

cybersurveillance, Human rights, Regulation

Bibtex

@article{nokey,
  title    = {Export control of cybersurveillance items in the new dual-use regulation: The challenges of applying human rights logic to export control},
  author   = {van Daalen, O. and van Hoboken, J. and Rucz, M.},
  doi      = {10.1016/j.clsr.2022.105789},
  year     = {2023},
  date     = {2023-04-21},
  journal  = {Computer Law \& Security Review},
  volume   = {48},
  abstract = {In 2021, the Recast Dual-Use Regulation entered into force. The regulation includes a heavily debated new provision on the export control of so-called cybersurveillance items. This provision departs from the traditional logic of export control rules in multiple ways. Most importantly, it positions human rights considerations as an important factor in the export control of a flexible range of technologies. This article explores the operation, implications and challenges of this new human rights-oriented approach to export control of digital surveillance technologies. Taking the definition of cybersurveillance items as a starting point of the analysis, the article draws on surveillance-related case law of the European Court of Human Rights and the Court of Justice of the European Union to define the potential scope of application of the open-ended cybersurveillance concept of the Regulation. By exploring how this concept maps to technologies often connected with human rights infringements, such as facial recognition, location tracking and open-source intelligence, the article highlights the challenges of applying this new approach and underscores the need for its further development in practice.},
  keywords = {cybersurveillance, Human rights, Regulation},
}

Protection of Intellectual Property Rights per Protocol No. 1 of the Convention for the Protection of Human Rights and Fundamental Freedoms

Izyumenko, E.
GRUR International, vol. 72, iss. 3, pp. 323-324, 2023

Human rights, Intellectual property

Bibtex

@article{nokey,
  title    = {Protection of Intellectual Property Rights per Protocol No. 1 of the Convention for the Protection of Human Rights and Fundamental Freedoms},
  author   = {Izyumenko, E.},
  doi      = {10.1093/grurint/ikac144},
  year     = {2023},
  date     = {2023-02-02},
  journal  = {GRUR International},
  volume   = {72},
  issue    = {3},
  pages    = {323--324},
  note     = {Case note},
  keywords = {Human rights, Intellectual property},
}

Europe’s Human Rights Court rules for the first time on a breach of a copyright holder’s right to property in a private dispute

Izyumenko, E.
Journal of Intellectual Property Law & Practice, vol. 17, iss. 11, pp. 896–898, 2022

Abstract

The European Court of Human Rights has recently ruled that the domestic courts’ failure to justify the grounds for dismissing the applicant’s copyright infringement claim in a private-party dispute concerning the unauthorized online reproduction of the applicant’s book breached the latter’s human right to property. Notably, the Court was not satisfied with the fact that the national courts had not persuasively explained their conclusions regarding the applicability in the applicant’s case of digital exhaustion and of copyright exceptions for libraries and private copying.

Copyright, Human rights

Bibtex

@article{nokey,
  title    = {Europe’s Human Rights Court rules for the first time on a breach of a copyright holder’s right to property in a private dispute},
  author   = {Izyumenko, E.},
  doi      = {10.1093/jiplp/jpac093},
  year     = {2022},
  date     = {2022-10-17},
  journal  = {Journal of Intellectual Property Law \& Practice},
  volume   = {17},
  issue    = {11},
  pages    = {896--898},
  abstract = {The European Court of Human Rights has recently ruled that the domestic courts’ failure to justify the grounds for dismissing the applicant’s copyright infringement claim in a private-party dispute concerning the unauthorized online reproduction of the applicant’s book breached the latter’s human right to property. Notably, the Court was not satisfied with the fact that the national courts had not persuasively explained their conclusions regarding the applicability in the applicant’s case of digital exhaustion and of copyright exceptions for libraries and private copying.},
  keywords = {Copyright, Human rights},
}