Getting Data Subject Rights Right: A submission to the European Data Protection Board from international data rights academics, to inform regulatory guidance

Ausloos, J., Veale, M. & Mahieu, R.
JIPITEC, vol. 10, num: 3, 2019

Abstract

We are a group of academics active in research and practice around data rights. We believe that the European Data Protection Board (EDPB) guidance on data rights currently under development is an important point to resolve a variety of tensions and grey areas which, if left unaddressed, may significantly undermine the fundamental right to data protection. All of us were present at the recent stakeholder event on data rights in Brussels on 4 November 2019, and it is in the context and spirit of stakeholder engagement that we have created this document to explore and provide recommendations and examples in this area. This document is based on comprehensive empirical evidence as well as CJEU case law, EDPB (and, previously, Article 29 Working Party) guidance and extensive scientific research into the scope, rationale, effects and general modalities of data rights.

GDPR, gegevensbescherming, Privacy

Bibtex

@article{Ausloos2020, title = {Getting Data Subject Rights Right: A submission to the European Data Protection Board from international data rights academics, to inform regulatory guidance}, author = {Ausloos, J. and Veale, M. and Mahieu, R.}, url = {https://www.jipitec.eu/issues/jipitec-10-3-2019/5031}, year = {2019}, date = {2019-12-31}, journal = {JIPITEC}, volume = {10}, number = {3}, abstract = {We are a group of academics active in research and practice around data rights. We believe that the European Data Protection Board (EDPB) guidance on data rights currently under development is an important point to resolve a variety of tensions and grey areas which, if left unaddressed, may significantly undermine the fundamental right to data protection. All of us were present at the recent stakeholder event on data rights in Brussels on 4 November 2019, and it is in the context and spirit of stakeholder engagement that we have created this document to explore and provide recommendations and examples in this area. This document is based on comprehensive empirical evidence as well as CJEU case law, EDPB (and, previously, Article 29 Working Party) guidance and extensive scientific research into the scope, rationale, effects and general modalities of data rights.}, keywords = {GDPR, gegevensbescherming, Privacy}, }

Privacy Protection(ism): The Latest Wave of Trade Constraints on Regulatory Autonomy

Yakovleva, S.
University of Miami Law Review, vol. 74, num: 2, pp: 416-519, 2020

Abstract

Countries spend billions of dollars each year to strengthen their discursive power to shape international policy debates. They do so because in public policy conversations labels and narratives matter enormously. The “digital protectionism” label has been used in the last decade as a tool to gain the policy upper hand in digital trade policy debates about cross-border flows of personal and other data. Using the Foucauldian framework of discourse analysis, this Article brings a unique perspective on this topic. The Article makes two central arguments. First, the Article argues that the term “protectionism” is not endowed with an inherent meaning but is socially constructed by the power of discourse used in international negotiations, and in the interpretation and application of international trade policy and rules. In other words, there are as many definitions of “(digital) protectionism” as there are discourses. The U.S. and E.U. “digital trade” discourses illustrate this point. Using the same term, those trading partners advance utterly different discourses and agendas: an economic discourse with economic efficiency as the main benchmark (United States), and a more multidisciplinary discourse where both economic efficiency and protection of fundamental rights are equally important (European Union). Second, based on a detailed evaluation of the economic “digital trade” discourse, the Article contends that the coining of the term “digital protectionism” to refer to domestic information governance policies not yet fully covered by trade law disciplines is not a logical step to respond to objectively changing circumstances, but rather a product of that discourse, which is coming to dominate U.S.-led international trade negotiations. 
The Article demonstrates how this redefinition of “protectionism” has already resulted in the adoption of international trade rules in recent trade agreements further restricting domestic autonomy to protect the rights to privacy and the protection of personal data. The Article suggests that the distinction between privacy and personal data protection and protectionism is a moral question, not a question of economic efficiency. Therefore, when a policy conversation, such as the one on cross-border data flows, involves noneconomic spill-over effects to individual rights, such conversation should not be confined within the straightjacket of trade economics, but rather placed in a broader normative perspective. Finally, the Article argues that, in conducting recently restarted multilateral negotiations on electronic commerce at the World Trade Organization, countries should rethink the goals of international trade for the twenty-first century. Such goals should determine and define the discourse, not the other way around. The discussion should not be about what “protectionism” means but about how far domestic regimes are willing to let trade rules interfere in their autonomy to protect their societal, cultural, and political values.

frontpage, Privacy, protectionism, Regulation, trade

Bibtex

@article{Yakovleva2020, title = {Privacy Protection(ism): The Latest Wave of Trade Constraints on Regulatory Autonomy}, author = {Yakovleva, S.}, url = {https://repository.law.miami.edu/umlr/vol74/iss2/5/}, year = {2020}, date = {2020-02-27}, journal = {University of Miami Law Review}, volume = {74}, number = {2}, pages = {416-519}, abstract = {Countries spend billions of dollars each year to strengthen their discursive power to shape international policy debates. They do so because in public policy conversations labels and narratives matter enormously. The “digital protectionism” label has been used in the last decade as a tool to gain the policy upper hand in digital trade policy debates about cross-border flows of personal and other data. Using the Foucauldian framework of discourse analysis, this Article brings a unique perspective on this topic. The Article makes two central arguments. First, the Article argues that the term “protectionism” is not endowed with an inherent meaning but is socially constructed by the power of discourse used in international negotiations, and in the interpretation and application of international trade policy and rules. In other words, there are as many definitions of “(digital) protectionism” as there are discourses. The U.S. and E.U. “digital trade” discourses illustrate this point. Using the same term, those trading partners advance utterly different discourses and agendas: an economic discourse with economic efficiency as the main benchmark (United States), and a more multidisciplinary discourse where both economic efficiency and protection of fundamental rights are equally important (European Union).
Second, based on a detailed evaluation of the economic “digital trade” discourse, the Article contends that the coining of the term “digital protectionism” to refer to domestic information governance policies not yet fully covered by trade law disciplines is not a logical step to respond to objectively changing circumstances, but rather a product of that discourse, which is coming to dominate U.S.-led international trade negotiations. The Article demonstrates how this redefinition of “protectionism” has already resulted in the adoption of international trade rules in recent trade agreements further restricting domestic autonomy to protect the rights to privacy and the protection of personal data. The Article suggests that the distinction between privacy and personal data protection and protectionism is a moral question, not a question of economic efficiency. Therefore, when a policy conversation, such as the one on cross-border data flows, involves noneconomic spill-over effects to individual rights, such conversation should not be confined within the straightjacket of trade economics, but rather placed in a broader normative perspective. Finally, the Article argues that, in conducting recently restarted multilateral negotiations on electronic commerce at the World Trade Organization, countries should rethink the goals of international trade for the twenty-first century. Such goals should determine and define the discourse, not the other way around. The discussion should not be about what “protectionism” means but about how far domestic regimes are willing to let trade rules interfere in their autonomy to protect their societal, cultural, and political values.}, keywords = {frontpage, Privacy, protectionism, Regulation, trade}, }

The Privacy Disconnect

van Hoboken, J.
Chapter in: Human Rights in the Age of Platforms, ed. R.F. Jørgensen, Cambridge: The MIT Press, 2019, pp: 255-284, ISBN: 9780262039055

Privacy

Bibtex

@incollection{vanHoboken2020, title = {The Privacy Disconnect}, author = {van Hoboken, J.}, booktitle = {Human Rights in the Age of Platforms}, editor = {Jørgensen, R.F.}, publisher = {The MIT Press}, address = {Cambridge}, year = {2019}, pages = {255-284}, isbn = {9780262039055}, url = {https://mitpress.mit.edu/books/human-rights-age-platforms https://www.ivir.nl/publicaties/download/privacy_disconnect.pdf}, keywords = {Privacy}, }

Panel discussion at CPDP 2020: We need to talk about filters: algorithmic copyright enforcement vs data protection.

Quintais, J., Ducato, R., Mazgal, A., Zuiderveen Borgesius, F. & Hegladóttir, A.
2020

Abstract

The new Copyright in the Digital Single Market (DSM) Directive was published in May 2019. Its most controversial provision is Article 17 (ex 13), which creates a new liability regime for user-generated content platforms, like YouTube and Facebook. The new regime makes these platforms directly liable for their users’ uploads, without the possibility of benefiting from the hosting safe-harbour. This forces platforms to either license all or most of the content uploaded by users (which is near impossible) or to adopt preventive measures like filters. The likely outcome is that covered platforms will engage in general monitoring of the content uploaded by their users. This panel will discuss the issues raised by Article 17 DSM Directive and the model of algorithmic enforcement it incentivizes, with a focus on the freedom of expression and data protection risks it entails.
• Article 17 of the Copyright in the Digital Single Market Directive creates a new liability regime for user-generated content platforms.
• Does this provision introduce de facto the controversial upload filtering systems and, as a result, general monitoring of information in content-sharing platforms?
• Is Article 17 essentially in conflict with the GDPR and, in particular, the principle of minimisation and the right not to be subject to automated decision-making processes? What are the potential consequences of this provision on users’ freedom of expression?
• If Article 17 can negatively affect data protection and freedom of expression, what are the possible legal and extra-legal responses to neutralise the risk?

Copyright, Data protection, frontpage, Privacy

Bibtex

@misc{Quintais2020, title = {Panel discussion at CPDP 2020: We need to talk about filters: algorithmic copyright enforcement vs data protection.}, author = {Quintais, J. and Ducato, R. and Mazgal, A. and Zuiderveen Borgesius, F. and Hegladóttir, A.}, url = {https://www.youtube.com/watch?v=SstHA1ALZoI}, year = {2020}, date = {2020-02-06}, abstract = {The new Copyright in the Digital Single Market (DSM) Directive was published in May 2019. Its most controversial provision is Article 17 (ex 13), which creates a new liability regime for user-generated content platforms, like YouTube and Facebook. The new regime makes these platforms directly liable for their users’ uploads, without the possibility of benefiting from the hosting safe-harbour. This forces platforms to either license all or most of the content uploaded by users (which is near impossible) or to adopt preventive measures like filters. The likely outcome is that covered platforms will engage in general monitoring of the content uploaded by their users. This panel will discuss the issues raised by Article 17 DSM Directive and the model of algorithmic enforcement it incentivizes, with a focus on the freedom of expression and data protection risks it entails. • Article 17 of the Copyright in the Digital Single Market Directive creates a new liability regime for user-generated content platforms. • Does this provision introduce de facto the controversial upload filtering systems and, as a result, general monitoring of information in content-sharing platforms? • Is Article 17 essentially in conflict with the GDPR and, in particular, the principle of minimisation and the right not to be subject to automated decision-making processes? What are the potential consequences of this provision on users’ freedom of expression? • If Article 17 can negatively affect data protection and freedom of expression, what are the possible legal and extra-legal responses to neutralise the risk?}, keywords = {Copyright, Data protection, frontpage, Privacy}, }

The regulation of online political micro-targeting in Europe

Dobber, T., Fahy, R. & Zuiderveen Borgesius, F.
Internet Policy Review, vol. 8, num: 4, 2020

Abstract

In this paper, we examine how online political micro-targeting is regulated in Europe. While there are no specific rules on such micro-targeting, there are general rules that apply. We focus on three fields of law: data protection law, freedom of expression, and sector-specific rules for political advertising; for the latter we examine four countries. We argue that the rules in the General Data Protection Regulation (GDPR) are necessary, but not sufficient. We show that political advertising, including online political micro-targeting, is protected by the right to freedom of expression. That right is not absolute, however. From a European human rights perspective, it is possible for lawmakers to limit the possibilities for political advertising. Indeed, some countries ban TV advertising for political parties during elections.

Advertising, Data protection law, elections, europe, frontpage, Micro-targeting, Politics, Privacy, Regulering, Vrijheid van meningsuiting

Bibtex

@article{Dobber2020, title = {The regulation of online political micro-targeting in Europe}, author = {Dobber, T. and Fahy, R. and Zuiderveen Borgesius, F.}, url = {https://policyreview.info/articles/analysis/regulation-online-political-micro-targeting-europe}, doi = {10.14763/2019.4.1440}, year = {2020}, date = {2020-01-16}, journal = {Internet Policy Review}, volume = {8}, number = {4}, abstract = {In this paper, we examine how online political micro-targeting is regulated in Europe. While there are no specific rules on such micro-targeting, there are general rules that apply. We focus on three fields of law: data protection law, freedom of expression, and sector-specific rules for political advertising; for the latter we examine four countries. We argue that the rules in the General Data Protection Regulation (GDPR) are necessary, but not sufficient. We show that political advertising, including online political micro-targeting, is protected by the right to freedom of expression. That right is not absolute, however. From a European human rights perspective, it is possible for lawmakers to limit the possibilities for political advertising. Indeed, some countries ban TV advertising for political parties during elections.}, keywords = {Advertising, Data protection law, elections, europe, frontpage, Micro-targeting, Politics, Privacy, Regulering, Vrijheid van meningsuiting}, }

Access and Reuse of Machine-Generated Data for Scientific Research

Giannopoulou, A.
Erasmus Law Review, num: 2, pp: 155-165, 2019

Abstract

Data-driven innovation holds the potential to transform current business and knowledge discovery models. For this reason, data sharing has become one of the central points of interest for the European Commission towards the creation of a Digital Single Market. The value of automatically generated data, which are collected by Internet-connected objects (IoT), is increasing: from smart houses to wearables, machine-generated data hold significant potential for growth, learning, and problem solving. Facilitating researchers’ access to these types of data implies not only the articulation of existing legal obstacles and of proposed legal solutions but also the understanding of the incentives that motivate the sharing of the data in question. What are the legal tools that researchers can use to gain access and reuse rights in the context of their research?

frontpage, GDPR, Internet of Things, machine-generated data, Personal data, Privacy, scientific research

Bibtex

@article{Giannopoulou2019bb, title = {Access and Reuse of Machine-Generated Data for Scientific Research}, author = {Giannopoulou, A.}, url = {https://www.ivir.nl/publicaties/download/Erasmus_Law_Review_2019.pdf}, doi = {10.5553/ELR.000136}, year = {2019}, date = {2019-12-20}, journal = {Erasmus Law Review}, number = {2}, pages = {155-165}, abstract = {Data-driven innovation holds the potential to transform current business and knowledge discovery models. For this reason, data sharing has become one of the central points of interest for the European Commission towards the creation of a Digital Single Market. The value of automatically generated data, which are collected by Internet-connected objects (IoT), is increasing: from smart houses to wearables, machine-generated data hold significant potential for growth, learning, and problem solving. Facilitating researchers’ access to these types of data implies not only the articulation of existing legal obstacles and of proposed legal solutions but also the understanding of the incentives that motivate the sharing of the data in question. What are the legal tools that researchers can use to gain access and reuse rights in the context of their research?}, keywords = {frontpage, GDPR, Internet of Things, machine-generated data, Personal data, Privacy, scientific research}, }

European Regulation of Smartphone Ecosystems

Fahy, R. & van Hoboken, J.
European Data Protection Law Review (EDPL), vol. 5, num: 4, pp: 476-491, 2019

Abstract

For the first time, two pieces of EU legislation will specifically target smartphone ecosystems in relation to smartphone and mobile software (eg, iOS and Android) privacy, and use and monetisation of data. And yet, both pieces of legislation approach data use and data monetisation from radically contrasting perspectives. The first is the proposed ePrivacy Regulation, which seeks to provide enhanced protection against user data monitoring and tracking in smartphones, and safeguard privacy in electronic communications. On the other hand, the recently enacted Platform-to-Business Regulation 2019, seeks to bring fairness to platform-business user relations (including app stores and app developers), and is crucially built upon the premise that the ability to access and use data, including personal data, can enable important value creation in the online platform economy. This article discusses how these two Regulations will apply to smartphone ecosystems, especially relating to user and device privacy. The article analyses the potential tension points between the two sets of rules, which result from the underlying policy objectives of safeguarding privacy in electronic communications and the functioning of the digital economy in the emerging era of platform governance. The article concludes with a discussion on how to address these issues, at the intersection of privacy and competition in the digital platform economy.

frontpage, governance, Platforms, Privacy, Regulering, smartphones

Bibtex

@article{Fahy2019eb, title = {European Regulation of Smartphone Ecosystems}, author = {Fahy, R. and van Hoboken, J.}, url = {https://edpl.lexxion.eu/article/EDPL/2019/4/6}, doi = {10.21552/edpl/2019/4/6}, year = {2019}, date = {2019-12-13}, journal = {European Data Protection Law Review (EDPL)}, volume = {5}, number = {4}, pages = {476-491}, abstract = {For the first time, two pieces of EU legislation will specifically target smartphone ecosystems in relation to smartphone and mobile software (eg, iOS and Android) privacy, and use and monetisation of data. And yet, both pieces of legislation approach data use and data monetisation from radically contrasting perspectives. The first is the proposed ePrivacy Regulation, which seeks to provide enhanced protection against user data monitoring and tracking in smartphones, and safeguard privacy in electronic communications. On the other hand, the recently enacted Platform-to-Business Regulation 2019, seeks to bring fairness to platform-business user relations (including app stores and app developers), and is crucially built upon the premise that the ability to access and use data, including personal data, can enable important value creation in the online platform economy. This article discusses how these two Regulations will apply to smartphone ecosystems, especially relating to user and device privacy. The article analyses the potential tension points between the two sets of rules, which result from the underlying policy objectives of safeguarding privacy in electronic communications and the functioning of the digital economy in the emerging era of platform governance. The article concludes with a discussion on how to address these issues, at the intersection of privacy and competition in the digital platform economy.}, keywords = {frontpage, governance, Platforms, Privacy, Regulering, smartphones}, }

Fundamental rights review of EU data collection instruments and programmes

Fondazione Giacomo Brodolini & Irion, K.
2019

Abstract

This report is the result of a Pilot Project requested by the European Parliament, managed by the Commission and carried out by a group of independent experts. The scope of the project was to establish and support an independent experts’ group to carry out a fundamental rights review of existing EU legislation and instruments in the Area of Freedom, Security and Justice (AFSJ) that involve the collection, retention, storage or transfer of personal data. One outcome of the project is a database of AFSJ legislation and instruments with individual fundamental rights assessments (at http://brodolini.mbs.it/). The final report concludes that fundamental rights safeguards need to be more consistently considered and applied in the AFSJ. The conclusions highlight five broad issues for further consideration: ambiguous definitions and open terms; law enforcement access to migration databases; the expansion of centralised databases; data retention periods; and information rights and duties.

Area of Freedom, EU databases, EU law, frontpage, Fundamental rights, Personal data, Privacy, Security and Justice

Bibtex

@online{Brodolini2019, title = {Fundamental rights review of EU data collection instruments and programmes}, author = {Fondazione Giacomo Brodolini and Irion, K.}, url = {http://www.fondazionebrodolini.it/sites/default/files/final_report_0.pdf}, year = {2019}, date = {2019-12-04}, abstract = {This report is the result of a Pilot Project requested by the European Parliament, managed by the Commission and carried out by a group of independent experts. The scope of the project was to establish and support an independent experts’ group to carry out a fundamental rights review of existing EU legislation and instruments in the Area of Freedom, Security and Justice (AFSJ) that involve the collection, retention, storage or transfer of personal data. One outcome of the project is a database of AFSJ legislation and instruments with individual fundamental rights assessments (at http://brodolini.mbs.it/). The final report concludes that fundamental rights safeguards need to be more consistently considered and applied in the AFSJ. The conclusions highlight five broad issues for further consideration: ambiguous definitions and open terms; law enforcement access to migration databases; the expansion of centralised databases; data retention periods; and information rights and duties.}, keywords = {Area of Freedom, EU databases, EU law, frontpage, Fundamental rights, Personal data, Privacy, Security and Justice}, }

Justitie toegang geven tot versleutelde chatberichten is geen goed idee [Giving law enforcement access to encrypted chat messages is not a good idea]

van Daalen, O.
Trouw, 2019

frontpage, Privacy

Bibtex

@article{vanDaalen2019b, title = {Justitie toegang geven tot versleutelde chatberichten is geen goed idee}, author = {van Daalen, O.}, url = {https://www.trouw.nl/opinie/justitie-toegang-geven-tot-versleutelde-chatberichten-is-geen-goed-idee~bd398447/}, year = {2019}, date = {2019-11-08}, journal = {Trouw}, keywords = {frontpage, Privacy}, }