Personalised pricing: The demise of the fixed price?

Abstract

An online seller or platform is technically able to offer every consumer a different price for the same product, based on information it has about the customers. Such online price discrimination exacerbates concerns regarding the fairness and morality of price discrimination, and the possible need for regulation. In this chapter, we discuss the underlying basis of price discrimination in economic theory, and its popular perception. Our surveys show that consumers are critical and suspicious of online price discrimination. A majority consider it unacceptable and unfair, and are in favour of a ban. When stores apply online price discrimination, most consumers think they should be informed about it. We argue that the General Data Protection Regulation (GDPR) applies to the most controversial forms of online price discrimination, and not only requires companies to disclose their use of price discrimination, but also requires companies to ask customers for their prior consent. Industry practice, however, does not show any adoption of these two principles.

algorithms, frontpage, GDPR, gegevensbescherming, Personalisation, Price discrimination, Privacy

Bibtex

@article{Poort2021, title = {Personalised pricing: The demise of the fixed price?}, author = {Poort, J. and Zuiderveen Borgesius, F.}, url = {https://www.ivir.nl/publicaties/download/The-Demise-of-the-Fixed-Price.pdf}, year = {2021}, date = {2021-03-04}, abstract = {An online seller or platform is technically able to offer every consumer a different price for the same product, based on information it has about the customers. Such online price discrimination exacerbates concerns regarding the fairness and morality of price discrimination, and the possible need for regulation. In this chapter, we discuss the underlying basis of price discrimination in economic theory, and its popular perception. Our surveys show that consumers are critical and suspicious of online price discrimination. A majority consider it unacceptable and unfair, and are in favour of a ban. When stores apply online price discrimination, most consumers think they should be informed about it. We argue that the General Data Protection Regulation (GDPR) applies to the most controversial forms of online price discrimination, and not only requires companies to disclose their use of price discrimination, but also requires companies to ask customers for their prior consent. Industry practice, however, does not show any adoption of these two principles.}, keywords = {algorithms, frontpage, GDPR, gegevensbescherming, Personalisation, Price discrimination, Privacy}, }

AI Regulation in the European Union and Trade Law: How can accountability of AI and a high level of consumer protection prevail over a trade law discipline on source code?, study commissioned by the Vzbv, Amsterdam: Institute for Information Law, 2021

2021

Abstract

The Federation of German Consumer Organisations (Verbraucherzentrale Bundesverband – vzbv) has commissioned this study from the Institute for Information Law (IViR) at the University of Amsterdam, in order to shed light on the cross-border supply of AI technology and its impact on EU consumer rights. In the current negotiations on electronic commerce at the World Trade Organisation (WTO), the EU supports the introduction – in the legal text – of a clause which prohibits the participating countries from introducing – in their national laws – measures that require access to, or transfer of, the source code of software, with some exceptions. This is a cause for concern for experts and rights advocates, as such a clause – if not carefully conditioned – can prevent future EU regulation of AI that may be harmful to consumers. This study concludes that the source code clause within trade law indeed restricts the EU’s right to regulate in the field of AI governance in several important ways.

accountability, application programming interfaces, Artificial intelligence, auditability, Electronic commerce, EU consumer protection, frontpage, GATS, source code, transparency, WTO law

Bibtex

@misc{Irion2021, title = {AI Regulation in the European Union and Trade Law: How can accountability of AI and a high level of consumer protection prevail over a trade law discipline on source code?, study commissioned by the Vzbv, Amsterdam: Institute for Information Law, 2021}, author = {Irion, K.}, url = {https://www.ivir.nl/irion_study_ai_and_trade_21-01-26-2/}, year = {2021}, date = {2021-01-26}, abstract = {The Federation of German Consumer Organisations (Verbraucherzentrale Bundesverband – vzbv) has commissioned this study from the Institute for Information Law (IViR) at the University of Amsterdam, in order to shed light on the cross-border supply of AI technology and its impact on EU consumer rights. In the current negotiations on electronic commerce at the World Trade Organisation (WTO), the EU supports the introduction – in the legal text – of a clause which prohibits the participating countries from introducing – in their national laws – measures that require access to, or transfer of, the source code of software, with some exceptions. This is a cause for concern for experts and rights advocates, as such a clause – if not carefully conditioned – can prevent future EU regulation of AI that may be harmful to consumers. This study concludes that the source code clause within trade law indeed restricts the EU’s right to regulate in the field of AI governance in several important ways.}, keywords = {accountability, application programming interfaces, Artificial intelligence, auditability, Electronic commerce, EU consumer protection, frontpage, GATS, source code, transparency, WTO law}, }

Getting under your skin(s): A legal-ethical exploration of Fortnite’s transformation into a content delivery platform and its manipulative potential

Interactive Entertainment Law Review, vol. 4, num: 1, 2021

Abstract

This paper investigates the ethical and legal implications of increasingly manipulative practices in the gaming industry by looking at one of the currently most popular and profitable video games in the world. Fortnite has morphed from an online game into a quasi-social network and an important cultural reference point in the lifeworld of many (young) people. The game is also emblematic of the freemium business model, with strong incentives to design the game in a manner which maximises microtransactions. This article suggests that to properly understand Fortnite’s practices – which we predict will become more widely adopted in the video game industry in the near future – we need an additional perspective. Fortnite is not only designed for hyper-engagement; its search for continued growth and sustained relevance is driving its transformation from being a mere video game into a content delivery platform. This means that third parties can offer non game-related services to players within Fortnite’s immersive game experience. In this paper, we draw on an ethical theory of manipulation (which defines manipulation as an ethically problematic influence on a person’s behaviour) to explore whether the gaming experience offered by Fortnite harbours manipulative potential. To legally address the manipulative potential of commercial video game practices such as the ones found in Fortnite, we turn to European data protection and consumer protection law. More specifically, we explore how the European Union’s General Data Protection Regulation and Unfair Commercial Practices Directive can provide regulators with tools to address Fortnite’s manipulative potential and to make Fortnite (more) forthright.

Consumer law, Data protection law, Fortnite, manipulation, Platforms, video games

Bibtex

@article{SaxAusloos2021, title = {Getting under your skin(s): A legal-ethical exploration of Fortnite’s transformation into a content delivery platform and its manipulative potential}, author = {Sax, M. and Ausloos, J.}, url = {https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3764489}, year = {2021}, date = {2021-03-01}, journal = {Interactive Entertainment Law Review}, volume = {4}, number = {1}, abstract = {This paper investigates the ethical and legal implications of increasingly manipulative practices in the gaming industry by looking at one of the currently most popular and profitable video games in the world. Fortnite has morphed from an online game into a quasi-social network and an important cultural reference point in the lifeworld of many (young) people. The game is also emblematic of the freemium business model, with strong incentives to design the game in a manner which maximises microtransactions. This article suggests that to properly understand Fortnite’s practices – which we predict will become more widely adopted in the video game industry in the near future – we need an additional perspective. Fortnite is not only designed for hyper-engagement; its search for continued growth and sustained relevance is driving its transformation from being a mere video game into a content delivery platform. This means that third parties can offer non game-related services to players within Fortnite’s immersive game experience. In this paper, we draw on an ethical theory of manipulation (which defines manipulation as an ethically problematic influence on a person’s behaviour) to explore whether the gaming experience offered by Fortnite harbours manipulative potential. To legally address the manipulative potential of commercial video game practices such as the ones found in Fortnite, we turn to European data protection and consumer protection law. More specifically, we explore how the European Union’s General Data Protection Regulation and Unfair Commercial Practices Directive can provide regulators with tools to address Fortnite’s manipulative potential and to make Fortnite (more) forthright.}, keywords = {Consumer law, Data protection law, Fortnite, manipulation, Platforms, video games}, }

Optimization of what? For-profit health apps as manipulative digital environments

Ethics and Information Technology, vol. 23, num: 3, pp: 345-361, 2021

Abstract

Mobile health applications (‘health apps’) that promise the user to help her with some aspect of her health are very popular: for-profit apps such as MyFitnessPal, Fitbit, or Headspace have tens of millions of users each. For-profit health apps are designed and run as optimization systems. One would expect that these health apps aim to optimize the health of the user, but in reality they aim to optimize user engagement and, in effect, conversion. This is problematic, I argue, because digital health environments that aim to optimize user engagement risk being manipulative. To develop this argument, I first provide a brief analysis of the underlying business models and the resulting designs of the digital environments provided by popular for-profit health apps. In a second step, I present a concept of manipulation that can help analyze digital environments such as health apps. In the last part of the article, I use my concept of manipulation to analyze the manipulative potential of for-profit health apps. Although for-profit health apps can certainly empower their users, the conditions for empowerment also largely overlap with the conditions for manipulation. As a result, we should be cautious when embracing the empowerment discourse surrounding health apps. An additional aim of this article is to contribute to the rapidly growing literature on digital choice architectures and the ethics of influencing behavior through such choice architectures. I take health apps to be a paradigmatic example of digital choice architectures that give rise to ethical questions, so my analysis of the manipulative potential of health apps can also inform the larger literature on digital choice architectures.

autonomy, choice architectures, health apps, manipulation, mhealth

Bibtex

@article{Sax2021, title = {Optimization of what? For-profit health apps as manipulative digital environments}, author = {Sax, M.}, url = {https://link.springer.com/article/10.1007/s10676-020-09576-6}, doi = {https://doi.org/10.1007/s10676-020-09576-6}, year = {2021}, date = {2021-01-03}, journal = {Ethics and Information Technology}, volume = {23}, number = {3}, pages = {345-361}, abstract = {Mobile health applications (‘health apps’) that promise the user to help her with some aspect of her health are very popular: for-profit apps such as MyFitnessPal, Fitbit, or Headspace have tens of millions of users each. For-profit health apps are designed and run as optimization systems. One would expect that these health apps aim to optimize the health of the user, but in reality they aim to optimize user engagement and, in effect, conversion. This is problematic, I argue, because digital health environments that aim to optimize user engagement risk being manipulative. To develop this argument, I first provide a brief analysis of the underlying business models and the resulting designs of the digital environments provided by popular for-profit health apps. In a second step, I present a concept of manipulation that can help analyze digital environments such as health apps. In the last part of the article, I use my concept of manipulation to analyze the manipulative potential of for-profit health apps. Although for-profit health apps can certainly empower their users, the conditions for empowerment also largely overlap with the conditions for manipulation. As a result, we should be cautious when embracing the empowerment discourse surrounding health apps. An additional aim of this article is to contribute to the rapidly growing literature on digital choice architectures and the ethics of influencing behavior through such choice architectures. I take health apps to be a paradigmatic example of digital choice architectures that give rise to ethical questions, so my analysis of the manipulative potential of health apps can also inform the larger literature on digital choice architectures.}, keywords = {autonomy, choice architectures, health apps, manipulation, mhealth}, }

Artikel 18-23 DSM-richtlijn: Exploitatiecontracten

AMI, vol. 2020, num: 6, pp: 187-192, 2020

Abstract

The DSM Directive of 17 April 2019 contains six provisions in the field of copyright contract law. Articles 18 to 23 concern not only the ‘fair remuneration of authors and performers in exploitation contracts’, as the heading of Title IV, Chapter 3 of the Directive promises, but also transparency, dispute resolution and the right to revoke granted rights. Although most of these subjects have already found a place in Chapter 1a of the current Dutch Copyright Act (Auteurswet), the Directive requires legislative amendment on a number of points. This applies in particular to the transparency obligation, which does not appear in the current Act. This contribution, part of a series of AMI articles on the DSM Directive, discusses the copyright contract law provisions of the Directive and their implementation.

Auteursrecht, DSM-richtlijn, exploitatiecontracten, frontpage

Bibtex

@article{Hugenholtz2020h, title = {Artikel 18-23 DSM-richtlijn: Exploitatiecontracten}, author = {Hugenholtz, P.}, url = {https://www.ivir.nl/publicaties/download/AMI_2020_6.pdf}, year = {2020}, date = {2020-12-18}, journal = {AMI}, volume = {2020}, number = {6}, pages = {187-192}, abstract = {De DSM-richtlijn van 17 april 2019 bevat een zestal bepalingen op het gebied van het auteurscontractenrecht. Artikelen 18 tot en met 23 hebben niet alleen betrekking op de ‘billijke vergoeding van auteurs en uitvoerende kunstenaars in exploitatiecontracten’, zoals het opschrift van titel IV, hoofdstuk 3 van de richtlijn belooft, maar ook op transparantie, geschillenbeslechting en het recht op herroeping van verleende rechten. Hoewel de meeste van deze onderwerpen reeds een plaats hebben gevonden in hoofdstuk 1a van de huidige Auteurswet, noopt de richtlijn op een aantal punten tot wetswijziging. Dat geldt in het bijzonder voor de transparantieplicht, die in de huidige wet niet voorkomt. In deze bijdrage, onderdeel van een reeks van AMI-artikelen over de DSM-richtlijn, worden de auteurscontractenrechtelijke bepalingen van de richtlijn en de omzetting ervan besproken.}, keywords = {Auteursrecht, DSM-richtlijn, exploitatiecontracten, frontpage}, }

Trends and Developments in Artificial Intelligence: Challenges to Copyright

Kluwer Copyright Blog, 2020

Artificial intelligence, Auteursrecht, frontpage

Bibtex

@article{Hugenholtz2020g, title = {Trends and Developments in Artificial Intelligence: Challenges to Copyright}, author = {Hugenholtz, P. and Quintais, J. and Gervais, D.J.}, url = {http://copyrightblog.kluweriplaw.com/2020/12/16/trends-and-developments-in-artificial-intelligence-challenges-to-copyright/}, year = {2020}, date = {2020-12-17}, journal = {Kluwer Copyright Blog}, keywords = {Artificial intelligence, Auteursrecht, frontpage}, }

Decentralised Data Processing: Personal Data Stores and the GDPR

Janssen, H., Cobbe, J., Norval, C. & Singh, J.
International Data Privacy Law, vol. 10, num: 4, pp: 356-384, 2021

Abstract

When it comes to online services, users have limited control over how their personal data is processed. This is partly due to the nature of the business models of those services, where data is typically stored and aggregated in data centres. This has recently led to the development of technologies aiming at leveraging user control over the processing of their personal data. Personal Data Stores (“PDSs”) represent a class of these technologies; PDSs provide users with a device, enabling them to capture, aggregate and manage their personal data. The device provides tools for users to control and monitor access, sharing and computation over data on their device. The motivation for PDSs is described as (i) to assist users with their confidentiality and privacy concerns, and/or (ii) to provide opportunities for users to transact with or otherwise monetise their data. While PDSs potentially might enable some degree of user empowerment, they raise interesting considerations and uncertainties in relation to the responsibilities under the General Data Protection Regulation (GDPR). More specifically, the designations of responsibilities among key parties involved in PDS ecosystems are unclear. Further, the technical architecture of PDSs appears to restrict certain lawful grounds for processing, while technical means to identify certain category data, as proposed by some, may remain theoretical. We explore the considerations, uncertainties, and limitations of PDSs with respect to some key obligations under the GDPR. As PDS technologies continue to develop and proliferate, potentially providing an alternative to centralised approaches to data processing, we identify issues which require consideration by regulators, PDS platform providers and technologists.
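To make the access-control pattern sketched in the abstract concrete, the following is a minimal, hypothetical Python illustration, not taken from the paper and with all names invented for the example: personal data stays in a user-held store, and a third-party service can read a data category only after the user has explicitly granted it, and only until that grant is revoked.

from dataclasses import dataclass, field
from typing import Dict, Set

@dataclass
class PersonalDataStore:
    # Data lives with the user; access is mediated by user-managed grants.
    records: Dict[str, dict] = field(default_factory=dict)     # category -> data
    grants: Dict[str, Set[str]] = field(default_factory=dict)  # requester -> granted categories

    def store(self, category: str, data: dict) -> None:
        self.records[category] = data

    def grant(self, requester: str, category: str) -> None:
        self.grants.setdefault(requester, set()).add(category)

    def revoke(self, requester: str, category: str) -> None:
        self.grants.get(requester, set()).discard(category)

    def read(self, requester: str, category: str) -> dict:
        if category not in self.grants.get(requester, set()):
            raise PermissionError(f"{requester} has no grant for '{category}'")
        return self.records[category]

# Illustrative use: a fitness service may read step counts only after an explicit grant.
pds = PersonalDataStore()
pds.store("steps", {"2024-01-01": 8042})
pds.grant("fitness-service", "steps")
print(pds.read("fitness-service", "steps"))  # allowed while the grant stands
pds.revoke("fitness-service", "steps")
# A further pds.read("fitness-service", "steps") would now raise PermissionError.

This is only a sketch of the user-controlled gatekeeping idea; the paper's GDPR analysis concerns how such architectures allocate controller and processor responsibilities, not any particular implementation.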

GDPR, Privacy

Bibtex

@article{Janssen2021, title = {Decentralised Data Processing: Personal Data Stores and the GDPR}, author = {Janssen, H. and Cobbe, J. and Norval, C. and Singh, J.}, url = {https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3570895 https://www.ivir.nl/publicaties/download/IDPL-2021-4.pdf}, doi = {https://doi.org/10.1093/idpl/ipaa016}, year = {2021}, date = {2021-01-04}, journal = {International Data Privacy Law}, volume = {10}, number = {4}, pages = {356-384}, abstract = {When it comes to online services, users have limited control over how their personal data is processed. This is partly due to the nature of the business models of those services, where data is typically stored and aggregated in data centres. This has recently led to the development of technologies aiming at leveraging user control over the processing of their personal data. Personal Data Stores (“PDSs”) represent a class of these technologies; PDSs provide users with a device, enabling them to capture, aggregate and manage their personal data. The device provides tools for users to control and monitor access, sharing and computation over data on their device. The motivation for PDSs is described as (i) to assist users with their confidentiality and privacy concerns, and/or (ii) to provide opportunities for users to transact with or otherwise monetise their data. While PDSs potentially might enable some degree of user empowerment, they raise interesting considerations and uncertainties in relation to the responsibilities under the General Data Protection Regulation (GDPR). More specifically, the designations of responsibilities among key parties involved in PDS ecosystems are unclear. Further, the technical architecture of PDSs appears to restrict certain lawful grounds for processing, while technical means to identify certain category data, as proposed by some, may remain theoretical. We explore the considerations, uncertainties, and limitations of PDSs with respect to some key obligations under the GDPR. As PDS technologies continue to develop and proliferate, potentially providing an alternative to centralised approaches to data processing, we identify issues which require consideration by regulators, PDS platform providers and technologists.}, keywords = {GDPR, Privacy}, }

Centering the Law in the Digital State

Cobbe, J., Seng Ah Lee, M., Singh, J. & Janssen, H.
Computer, vol. 53, num: 10, pp: 47-58, 2020

Abstract

Driven by the promise of increased efficiencies and cost-savings, the public sector has shown much interest in automated decision-making (ADM) technologies. However, the rule of law and fundamental principles of good government are being lost along the way.

automated decision making

Bibtex

@article{Cobbe2020, title = {Centering the Law in the Digital State}, author = {Cobbe, J. and Seng Ah Lee, M. and Singh, J. and Janssen, H.}, doi = {https://doi.org/10.1109/MC.2020.3006623}, year = {2020}, date = {2020-09-25}, journal = {Computer}, volume = {53}, number = {10}, pages = {47-58}, abstract = {Driven by the promise of increased efficiencies and cost-savings, the public sector has shown much interest in automated decision-making (ADM) technologies. However, the rule of law and fundamental principles of good government are being lost along the way.}, keywords = {automated decision making}, }