The Right to an Explanation in Practice: Insights from Case Law for the GDPR and the AI Act

Law, Innovation, and Technology (forthcoming), 2024

Abstract

[This is a pre-publication draft paper, forthcoming in Law, Innovation, and Technology 17.2, October 2025. The final version is subject to further revisions.] The right to an explanation under the GDPR has been much discussed in legal-doctrinal scholarship. This paper expands upon this academic discourse by providing insights into the questions that the application of the right to an explanation has raised in legal practice. By looking at cases brought before various judicial bodies and data protection authorities across the European Union, we discuss questions regarding the scope, content, and balancing exercise of the right to an explanation. We argue, moreover, that these questions also raise important interpretative issues regarding the right to an explanation under the AI Act. As with the GDPR, the AI Act's right to an explanation leaves many legal questions unanswered. The insights from the case law already established under the GDPR can therefore help clarify how the AI Act's right to an explanation should be understood in practice.

AI Act, case law, GDPR, Privacy

Bibtex

@Article{nokey, title = {The Right to an Explanation in Practice: Insights from Case Law for the GDPR and the AI Act}, author = {Metikoš, L. and Ausloos, J.}, url = {https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4996173}, year = {2024}, date = {2024-10-24}, journal = {Law, Innovation, and Technology (forthcoming)}, abstract = {[This is a pre-publication draft paper, forthcoming in Law, Innovation, and Technology 17.2, October 2025. The final version is subject to further revisions.] The right to an explanation under the GDPR has been much discussed in legal-doctrinal scholarship. This paper expands upon this academic discourse by providing insights into the questions that the application of the right to an explanation has raised in legal practice. By looking at cases brought before various judicial bodies and data protection authorities across the European Union, we discuss questions regarding the scope, content, and balancing exercise of the right to an explanation. We argue, moreover, that these questions also raise important interpretative issues regarding the right to an explanation under the AI Act. As with the GDPR, the AI Act's right to an explanation leaves many legal questions unanswered. The insights from the case law already established under the GDPR can therefore help clarify how the AI Act's right to an explanation should be understood in practice.}, keywords = {AI Act, case law, GDPR, Privacy}, }

Expert perspectives on GDPR compliance in the context of smart homes and vulnerable persons

Information & Communications Technology Law, 2023

Abstract

This article introduces information gathered through 21 semi-structured interviews conducted with UK, EU and international professionals in the field of General Data Protection Regulation (GDPR) compliance and technology design, with a focus on the smart home context and vulnerable people using smart products. Those discussions offered various insights and perspectives on how the two communities (lawyers and technologists) view intricate practical data protection challenges in this specific setting. The variety of interviewees made it possible to compare different approaches to data protection compliance topics. The interviews provided answers to the following questions: when organisations develop and/or deploy smart devices that use personal data, do they take into consideration the needs of vulnerable groups of people to comply with the GDPR? What are the underlying issues linked to the practical data protection law challenges faced by organisations working on smart devices used by vulnerable persons? How do experts perceive data protection law-related problems in this context?

Data protection, GDPR, Internet of Things, smart devices

Bibtex

@Article{nokey, title = {Expert perspectives on GDPR compliance in the context of smart homes and vulnerable persons}, author = {Piasecki, S.}, doi = {https://doi.org/10.1080/13600834.2023.2231326}, year = {2023}, date = {2023-07-07}, journal = {Information & Communications Technology Law}, abstract = {This article introduces information gathered through 21 semi-structured interviews conducted with UK, EU and international professionals in the field of General Data Protection Regulation (GDPR) compliance and technology design, with a focus on the smart home context and vulnerable people using smart products. Those discussions offered various insights and perspectives on how the two communities (lawyers and technologists) view intricate practical data protection challenges in this specific setting. The variety of interviewees made it possible to compare different approaches to data protection compliance topics. The interviews provided answers to the following questions: when organisations develop and/or deploy smart devices that use personal data, do they take into consideration the needs of vulnerable groups of people to comply with the GDPR? What are the underlying issues linked to the practical data protection law challenges faced by organisations working on smart devices used by vulnerable persons? How do experts perceive data protection law-related problems in this context?}, keywords = {Data protection, GDPR, Internet of Things, smart devices}, }

SLAPPed by the GDPR: protecting public interest journalism in the face of GDPR-based strategic litigation against public participation

Journal of Media Law, vol. 14, num: 2, pp: 378-405, 2022

Abstract

Strategic litigation against public participation is a threat to public interest journalism. Although typically a defamation claim underpins a SLAPP, the GDPR may serve as an alternative basis. This paper explores how public interest journalism is protected, and could be better protected, from abusive GDPR proceedings. The GDPR addresses the tension between data protection and freedom of expression by providing for a journalistic exemption. However, narrow national implementations of this provision leave the GDPR open for abuse. By analysing GDPR proceedings against newspaper Forbes Hungary, the paper illustrates how the GDPR can be instrumentalised as a SLAPP strategy. As European anti-SLAPP initiatives are fine-tuned, abusive GDPR proceedings need to be recognised as emerging forms of SLAPPs, requiring more attention to inadequate engagement with European freedom of expression standards in national implementations of the GDPR, data protection authorities’ role in facilitating SLAPPs, and the chilling effects of GDPR sanctions.

Data protection, Freedom of expression, GDPR, journalistic exemption, SLAPPs

Bibtex

@Article{nokey, title = {SLAPPed by the GDPR: protecting public interest journalism in the face of GDPR-based strategic litigation against public participation}, author = {Rucz, M.}, doi = {https://doi.org/10.1080/17577632.2022.2129614}, year = {2022}, date = {2022-10-10}, journal = {Journal of Media Law}, volume = {14}, issue = {2}, pages = {378-405}, abstract = {Strategic litigation against public participation is a threat to public interest journalism. Although typically a defamation claim underpins a SLAPP, the GDPR may serve as an alternative basis. This paper explores how public interest journalism is protected, and could be better protected, from abusive GDPR proceedings. The GDPR addresses the tension between data protection and freedom of expression by providing for a journalistic exemption. However, narrow national implementations of this provision leave the GDPR open for abuse. By analysing GDPR proceedings against newspaper Forbes Hungary, the paper illustrates how the GDPR can be instrumentalised as a SLAPP strategy. As European anti-SLAPP initiatives are fine-tuned, abusive GDPR proceedings need to be recognised as emerging forms of SLAPPs, requiring more attention to inadequate engagement with European freedom of expression standards in national implementations of the GDPR, data protection authorities’ role in facilitating SLAPPs, and the chilling effects of GDPR sanctions.}, keywords = {Data protection, Freedom of expression, GDPR, journalistic exemption, SLAPPs}, }

The Right to Lodge a Data Protection Complaint: Ok, But Then What? An empirical study of current practices under the GDPR

European Data Protection Scholars Network
2022

Abstract

This study examines current Data Protection Authorities' (DPA) practices related to their obligation to facilitate the submission of complaints, paying special attention to the connection between this obligation and the right to an effective judicial remedy against DPAs. It combines legal analysis and the observation of DPA websites, together with insights obtained from the online public register of decisions adopted under the 'one-stop-shop' mechanism. This study was commissioned by Access Now.

Data Protection Authorities, GDPR, remedy, right to an effective remedy

Bibtex

@Misc{Network2022, title = {The Right to Lodge a Data Protection Complaint: Ok, But Then What? An empirical study of current practices under the GDPR}, author = {European Data Protection Scholars Network}, url = {https://www.ivir.nl/gdpr-complaint-study-2/}, year = {2022}, date = {2022-07-12}, abstract = {This study examines current Data Protection Authorities' (DPA) practices related to their obligation to facilitate the submission of complaints, paying special attention to the connection between this obligation and the right to an effective judicial remedy against DPAs. It combines legal analysis and the observation of DPA websites, together with insights obtained from the online public register of decisions adopted under the 'one-stop-shop' mechanism. This study was commissioned by Access Now.}, keywords = {Data Protection Authorities, GDPR, remedy, right to an effective remedy}, }

A Matter of (Joint) control? Virtual assistants and the general data protection regulation

Computer Law & Security Review, vol. 45, 2022

Abstract

This article provides an overview and critical examination of the rules for determining who qualifies as controller or joint controller under the General Data Protection Regulation. Using Google Assistant – an artificial intelligence-driven virtual assistant – as a case study, we argue that these rules are overreaching and difficult to apply in the present-day information society and Internet of Things environments. First, as a consequence of recent developments in case law and supervisory guidance, these rules lead to a complex and ambiguous test to determine (joint) control. Second, due to advances in technological applications and business models, it is increasingly challenging to apply such rules to contemporary processing operations. In particular, as illustrated by the Google Assistant, individuals will likely be qualified as joint controllers, together with Google and also third-party developers, for at least the collection and possible transmission of other individuals’ personal data via the virtual assistant. Third, we identify follow-on issues relating to the apportionment of responsibilities between joint controllers and the effective and complete protection of data subjects. We conclude by questioning whether the framework for determining who qualifies as controller or joint controller is future-proof and normatively desirable.

GDPR, Privacy, Right to data protection

Bibtex

@Article{nokey, title = {A Matter of (Joint) control? Virtual assistants and the general data protection regulation}, author = {Mil, J. van and Quintais, J.}, doi = {https://doi.org/10.1016/j.clsr.2022.105689}, year = {2022}, date = {2022-06-16}, journal = {Computer Law & Security Review}, volume = {45}, abstract = {This article provides an overview and critical examination of the rules for determining who qualifies as controller or joint controller under the General Data Protection Regulation. Using Google Assistant – an artificial intelligence-driven virtual assistant – as a case study, we argue that these rules are overreaching and difficult to apply in the present-day information society and Internet of Things environments. First, as a consequence of recent developments in case law and supervisory guidance, these rules lead to a complex and ambiguous test to determine (joint) control. Second, due to advances in technological applications and business models, it is increasingly challenging to apply such rules to contemporary processing operations. In particular, as illustrated by the Google Assistant, individuals will likely be qualified as joint controllers, together with Google and also third-party developers, for at least the collection and possible transmission of other individuals’ personal data via the virtual assistant. Third, we identify follow-on issues relating to the apportionment of responsibilities between joint controllers and the effective and complete protection of data subjects. We conclude by questioning whether the framework for determining who qualifies as controller or joint controller is future-proof and normatively desirable.}, keywords = {GDPR, Privacy, Right to data protection}, }

The General Data Protection Regulation through the lens of digital sovereignty

2022

Abstract

This short contribution will present and discuss the European Union’s (EU) General Data Protection Regulation (GDPR) through the lens of ‘digital sovereignty’. When high-ranking representatives of EU institutions endorsed digital sovereignty, this was interpreted as a signpost for a new-found assertiveness in EU digital policy. However, digital sovereignty is conceptually fuzzy and is used to animate a wide spectrum of geopolitical, normative, and industrial ambitions. In the context of the GDPR it makes sense to operationalize digital sovereignty as the ability of rules to assert authority in a global and interdependent digital ecosystem. Conceived this way, I will reflect on how the GDPR wields transnational capacity by design in the form of safeguards against inbound and outbound circumvention.

Digital sovereignty, GDPR, transfer of personal data, transnational capacity

Bibtex

@Article{Irion2022, title = {The General Data Protection Regulation through the lens of digital sovereignty}, author = {Irion, K.}, url = {https://www.ivir.nl/irion-gdpr-and-digital-sovereignty-11mar22/}, year = {2022}, date = {2022-03-28}, abstract = {This short contribution will present and discuss the European Union’s (EU) General Data Protection Regulation (GDPR) through the lens of ‘digital sovereignty’. When high-ranking representatives of EU institutions endorsed digital sovereignty, this was interpreted as a signpost for a new-found assertiveness in EU digital policy. However, digital sovereignty is conceptually fuzzy and is used to animate a wide spectrum of geopolitical, normative, and industrial ambitions. In the context of the GDPR it makes sense to operationalize digital sovereignty as the ability of rules to assert authority in a global and interdependent digital ecosystem. Conceived this way, I will reflect on how the GDPR wields transnational capacity by design in the form of safeguards against inbound and outbound circumvention.}, keywords = {Digital sovereignty, GDPR, transfer of personal data, transnational capacity}, }

Personal data ordering in context: the interaction of meso-level data governance regimes with macro frameworks

Internet Policy Review, vol. 10, num: 3, 2021

Abstract

The technological infrastructures enabling the collection, processing, and trading of data have fuelled a rapid innovation of data governance models. We differentiate between macro, meso, and micro level models, which correspond to major political blocs; societal-, industry-, or community-level systems; and individual approaches, respectively. We focus on meso-level models, which coalesce around: (1) organisations prioritising their own interests over interests of other stakeholders; (2) organisations offering technological and legal tools aiming to empower individuals; (3) community-based data intermediaries fostering collective rights and interests. In this article we assess these meso-level models, and discuss their interaction with the macro-level legal frameworks that have evolved in the US, the EU, and China. The legal landscape has largely remained inconsistent and fragmented, with enforcement struggling to keep up with the latest developments. We argue, first, that the success of meso-logics is largely defined by global economic competition, and, second, that these meso-logics may potentially put the EU’s macro-level framework with its mixed internal market and fundamental rights-oriented model under pressure. We conclude that, given the relative absence of a strong macro-level framework and an intensive competition of governance models at the meso level, it may be challenging to avoid compromises to the European macro framework.

Data governance, Data intermediaries, Data ordering, Data sovereignty, GDPR

Bibtex

@Article{Bodó2021b, title = {Personal data ordering in context: the interaction of meso-level data governance regimes with macro frameworks}, author = {Bodó, B. and Irion, K. and Janssen, H. and Giannopoulou, A.}, url = {https://policyreview.info/articles/analysis/personal-data-ordering-context-interaction-meso-level-data-governance-regimes}, doi = {https://doi.org/10.14763/2021.3.1581}, year = {2021}, date = {2021-10-11}, journal = {Internet Policy Review}, volume = {10}, number = {3}, abstract = {The technological infrastructures enabling the collection, processing, and trading of data have fuelled a rapid innovation of data governance models. We differentiate between macro, meso, and micro level models, which correspond to major political blocs; societal-, industry-, or community-level systems; and individual approaches, respectively. We focus on meso-level models, which coalesce around: (1) organisations prioritising their own interests over interests of other stakeholders; (2) organisations offering technological and legal tools aiming to empower individuals; (3) community-based data intermediaries fostering collective rights and interests. In this article we assess these meso-level models, and discuss their interaction with the macro-level legal frameworks that have evolved in the US, the EU, and China. The legal landscape has largely remained inconsistent and fragmented, with enforcement struggling to keep up with the latest developments. We argue, first, that the success of meso-logics is largely defined by global economic competition, and, second, that these meso-logics may potentially put the EU’s macro-level framework with its mixed internal market and fundamental rights-oriented model under pressure. We conclude that, given the relative absence of a strong macro-level framework and an intensive competition of governance models at the meso level, it may be challenging to avoid compromises to the European macro framework.}, keywords = {Data governance, Data intermediaries, Data ordering, Data sovereignty, GDPR}, }

Personalised pricing: The demise of the fixed price?

Abstract

An online seller or platform is technically able to offer every consumer a different price for the same product, based on information it has about the customers. Such online price discrimination exacerbates concerns regarding the fairness and morality of price discrimination, and the possible need for regulation. In this chapter, we discuss the underlying basis of price discrimination in economic theory, and its popular perception. Our surveys show that consumers are critical and suspicious of online price discrimination. A majority consider it unacceptable and unfair, and are in favour of a ban. When stores apply online price discrimination, most consumers think they should be informed about it. We argue that the General Data Protection Regulation (GDPR) applies to the most controversial forms of online price discrimination, and not only requires companies to disclose their use of price discrimination, but also requires companies to ask customers for their prior consent. Industry practice, however, does not show any adoption of these two principles.

algorithms, GDPR, data protection, Personalisation, Price discrimination, Privacy

Bibtex

@Article{Poort2021, title = {Personalised pricing: The demise of the fixed price?}, author = {Poort, J. and Zuiderveen Borgesius, F.}, url = {https://www.ivir.nl/publicaties/download/The-Demise-of-the-Fixed-Price.pdf}, year = {2021}, date = {2021-03-04}, abstract = {An online seller or platform is technically able to offer every consumer a different price for the same product, based on information it has about the customers. Such online price discrimination exacerbates concerns regarding the fairness and morality of price discrimination, and the possible need for regulation. In this chapter, we discuss the underlying basis of price discrimination in economic theory, and its popular perception. Our surveys show that consumers are critical and suspicious of online price discrimination. A majority consider it unacceptable and unfair, and are in favour of a ban. When stores apply online price discrimination, most consumers think they should be informed about it. We argue that the General Data Protection Regulation (GDPR) applies to the most controversial forms of online price discrimination, and not only requires companies to disclose their use of price discrimination, but also requires companies to ask customers for their prior consent. Industry practice, however, does not show any adoption of these two principles.}, keywords = {algorithms, GDPR, data protection, Personalisation, Price discrimination, Privacy}, }

Decentralised Data Processing: Personal Data Stores and the GDPR

Janssen, H., Cobbe, J., Norval, C. & Singh, J.
International Data Privacy Law, vol. 10, num: 4, pp: 356-384, 2021

Abstract

When it comes to online services, users have limited control over how their personal data is processed. This is partly due to the nature of the business models of those services, where data is typically stored and aggregated in data centres. This has recently led to the development of technologies aiming to strengthen user control over the processing of personal data. Personal Data Stores (“PDSs”) represent a class of these technologies; PDSs provide users with a device, enabling them to capture, aggregate and manage their personal data. The device provides tools for users to control and monitor access, sharing and computation over data on their device. The motivations for PDSs are described as (i) assisting users with their confidentiality and privacy concerns, and/or (ii) providing opportunities for users to transact with or otherwise monetise their data. While PDSs might enable some degree of user empowerment, they raise interesting considerations and uncertainties in relation to the responsibilities under the General Data Protection Regulation (GDPR). More specifically, the designations of responsibilities among key parties involved in PDS ecosystems are unclear. Further, the technical architecture of PDSs appears to restrict certain lawful grounds for processing, while technical means to identify certain categories of data, as proposed by some, may remain theoretical. We explore the considerations, uncertainties, and limitations of PDSs with respect to some key obligations under the GDPR. As PDS technologies continue to develop and proliferate, potentially providing an alternative to centralised approaches to data processing, we identify issues which require consideration by regulators, PDS platform providers and technologists.

GDPR, Privacy

Bibtex

@Article{Janssen2021, title = {Decentralised Data Processing: Personal Data Stores and the GDPR}, author = {Janssen, H. and Cobbe, J. and Norval, C. and Singh, J.}, url = {https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3570895 https://www.ivir.nl/publicaties/download/IDPL-2021-4.pdf}, doi = {https://doi.org/10.1093/idpl/ipaa016}, year = {2021}, date = {2021-01-04}, journal = {International Data Privacy Law}, volume = {10}, number = {4}, pages = {356-384}, abstract = {When it comes to online services, users have limited control over how their personal data is processed. This is partly due to the nature of the business models of those services, where data is typically stored and aggregated in data centres. This has recently led to the development of technologies aiming to strengthen user control over the processing of personal data. Personal Data Stores (“PDSs”) represent a class of these technologies; PDSs provide users with a device, enabling them to capture, aggregate and manage their personal data. The device provides tools for users to control and monitor access, sharing and computation over data on their device. The motivations for PDSs are described as (i) assisting users with their confidentiality and privacy concerns, and/or (ii) providing opportunities for users to transact with or otherwise monetise their data. While PDSs might enable some degree of user empowerment, they raise interesting considerations and uncertainties in relation to the responsibilities under the General Data Protection Regulation (GDPR). More specifically, the designations of responsibilities among key parties involved in PDS ecosystems are unclear. Further, the technical architecture of PDSs appears to restrict certain lawful grounds for processing, while technical means to identify certain categories of data, as proposed by some, may remain theoretical. We explore the considerations, uncertainties, and limitations of PDSs with respect to some key obligations under the GDPR. As PDS technologies continue to develop and proliferate, potentially providing an alternative to centralised approaches to data processing, we identify issues which require consideration by regulators, PDS platform providers and technologists.}, keywords = {GDPR, Privacy}, }

Data Protection or Data Frustration? Individual perceptions and attitudes towards the GDPR

Strycharz, J., Ausloos, J. & Helberger, N.
European Data Protection Law Review, vol. 6, num: 3, pp: 407-421, 2020

Abstract

Strengthening individual rights, enhancing control over one’s data and raising awareness were among the main aims the European Commission set for the General Data Protection Regulation (GDPR). In order to assess whether these aims have been met, research into individual perceptions, awareness, and understanding of the Regulation is necessary. This study thus examines individual reactions to the GDPR in order to provide insights into user agency in relation to the Regulation. More specifically, it discusses empirical data (survey with N = 1288) on individual knowledge of, reactions to, and rights exercised under the GDPR in the Netherlands. The results show high awareness of the GDPR and knowledge of individual rights. At the same time, the Dutch show substantial reactance to the Regulation and doubt the effectiveness of their individual rights. These findings point to several issues obstructing the GDPR’s effectiveness, and constitute useful signposts for policy-makers and enforcement agencies to prioritise their strategies in achieving the original aims of the Regulation.

GDPR, data protection, Privacy

Bibtex

@Article{Strycharz2020, title = {Data Protection or Data Frustration? Individual perceptions and attitudes towards the GDPR}, author = {Strycharz, J. and Ausloos, J. and Helberger, N.}, url = {https://www.ivir.nl/publicaties/download/EDPLR_2020_3.pdf}, doi = {https://doi.org/10.21552/edpl/2020/3/10}, year = {2020}, date = {2020-10-13}, journal = {European Data Protection Law Review}, volume = {6}, number = {3}, pages = {407-421}, abstract = {Strengthening individual rights, enhancing control over one’s data and raising awareness were among the main aims the European Commission set for the General Data Protection Regulation (GDPR). In order to assess whether these aims have been met, research into individual perceptions, awareness, and understanding of the Regulation is necessary. This study thus examines individual reactions to the GDPR in order to provide insights into user agency in relation to the Regulation. More specifically, it discusses empirical data (survey with N = 1288) on individual knowledge of, reactions to, and rights exercised under the GDPR in the Netherlands. The results show high awareness of the GDPR and knowledge of individual rights. At the same time, the Dutch show substantial reactance to the Regulation and doubt the effectiveness of their individual rights. These findings point to several issues obstructing the GDPR’s effectiveness, and constitute useful signposts for policy-makers and enforcement agencies to prioritise their strategies in achieving the original aims of the Regulation.}, keywords = {GDPR, data protection, Privacy}, }