Towards a Normative Perspective on Journalistic AI: Embracing the Messy Reality of Normative Ideals

Helberger, N., Drunen, M. van, Möller, J., Vrijenhoek, S. & Eskens, S.
Digital Journalism, vol. 10, iss: 10, pp: 1605-1626, 2022

Abstract

Few would disagree that AI systems and applications need to be “responsible,” but what is “responsible” and how to answer that question? Answering that question requires a normative perspective on the role of journalistic AI and the values it shall serve. Such a perspective needs to be grounded in a broader normative framework and a thorough understanding of the dynamics and complexities of journalistic AI at the level of people, newsrooms and media markets. This special issue aims to develop such a normative perspective on the use of AI-driven tools in journalism and the role of digital journalism studies in advancing that perspective. The contributions in this special issue combine conceptual, organisational and empirical angles to study the challenges involved in actively using AI to promote editorial values, the powers at play, the role of economic and regulatory conditions, and ways of bridging academic ideals and the messy reality of the real world. This editorial brings the different contributions into conversation, situates them in the broader digital journalism studies scholarship and identifies seven key takeaways.

Artificial intelligence, governance, Journalism, Media law, normative perspective, professional values, Regulation

Bibtex

@Article{nokey,
  title = {Towards a Normative Perspective on Journalistic AI: Embracing the Messy Reality of Normative Ideals},
  author = {Helberger, N. and Drunen, M. van and Möller, J. and Vrijenhoek, S. and Eskens, S.},
  url = {https://www.ivir.nl/nl/publications/towards-a-normative-perspective-on-journalisticai-embracing-the-messy-reality-of-normativeideals/digital_journalism_2022_10/},
  doi = {10.1080/21670811.2022.2152195},
  year = {2022},
  date = {2022-12-22},
  journal = {Digital Journalism},
  volume = {10},
  issue = {10},
  pages = {1605--1626},
  abstract = {Few would disagree that AI systems and applications need to be “responsible,” but what is “responsible” and how to answer that question? Answering that question requires a normative perspective on the role of journalistic AI and the values it shall serve. Such a perspective needs to be grounded in a broader normative framework and a thorough understanding of the dynamics and complexities of journalistic AI at the level of people, newsrooms and media markets. This special issue aims to develop such a normative perspective on the use of AI-driven tools in journalism and the role of digital journalism studies in advancing that perspective. The contributions in this special issue combine conceptual, organisational and empirical angles to study the challenges involved in actively using AI to promote editorial values, the powers at play, the role of economic and regulatory conditions, and ways of bridging academic ideals and the messy reality of the real world. This editorial brings the different contributions into conversation, situates them in the broader digital journalism studies scholarship and identifies seven key takeaways.},
  keywords = {Artificial intelligence, governance, Journalism, Media law, normative perspective, professional values, Regulation},
}

Toward a Critique of Algorithmic Violence

Bellanova, R., Irion, K., Lindskov Jacobsen, K., Ragazzi, F., Saugmann, R. & Suchman, L.
International Political Sociology, vol. 15, num: 1, pp: 121–150, 2021

Abstract

Questions about how algorithms contribute to (in)security are under discussion across international political sociology. Building upon and adding to these debates, our collective discussion foregrounds questions about algorithmic violence. We argue that it is important to examine how algorithmic systems feed (into) specific forms of violence, and how they justify violent actions or redefine what forms of violence are deemed legitimate. Bringing together different disciplinary and conceptual vantage points, this collective discussion opens a conversation about algorithmic violence focusing both on its specific instances and on the challenges that arise in conceptualizing and studying it. Overall, the discussion converges on three areas of concern—the violence undergirding the creation and feeding of data infrastructures; the translation processes at play in the use of computer/machine vision across diverse security practices; and the institutional governing of algorithmic violence, especially its organization, limitation, and legitimation.

affordances, algorithmic violence, Artificial intelligence, cloud computing, frontpage, governance, harm, interdisciplinary, machine learning

Bibtex

@Article{Bellanova2021,
  title = {Toward a Critique of Algorithmic Violence},
  author = {Bellanova, R. and Irion, K. and Lindskov Jacobsen, K. and Ragazzi, F. and Saugmann, R. and Suchman, L.},
  doi = {10.1093/ips/olab003},
  year = {2021},
  date = {2021-03-29},
  journal = {International Political Sociology},
  volume = {15},
  number = {1},
  pages = {121--150},
  abstract = {Questions about how algorithms contribute to (in)security are under discussion across international political sociology. Building upon and adding to these debates, our collective discussion foregrounds questions about algorithmic violence. We argue that it is important to examine how algorithmic systems feed (into) specific forms of violence, and how they justify violent actions or redefine what forms of violence are deemed legitimate. Bringing together different disciplinary and conceptual vantage points, this collective discussion opens a conversation about algorithmic violence focusing both on its specific instances and on the challenges that arise in conceptualizing and studying it. Overall, the discussion converges on three areas of concern—the violence undergirding the creation and feeding of data infrastructures; the translation processes at play in the use of computer/machine vision across diverse security practices; and the institutional governing of algorithmic violence, especially its organization, limitation, and legitimation.},
  keywords = {affordances, algorithmic violence, Artificial intelligence, cloud computing, frontpage, governance, harm, interdisciplinary, machine learning},
}

Operationalizing Research Access in Platform Governance: What to learn from other industries?

Abstract

A new study published by AlgorithmWatch, in cooperation with the European Policy Centre and the University of Amsterdam’s Institute for Information Law, shows that the GDPR needn’t stand in the way of meaningful research access to platform data; looks to health and environmental sectors for best practices in privacy-respecting data sharing frameworks.

Facebook, frontpage, governance, Platforms, research access

Bibtex

@Report{Ausloos2020b,
  title = {Operationalizing Research Access in Platform Governance: What to learn from other industries?},
  author = {Ausloos, J. and Leerssen, P. and Thije, P. ten},
  url = {https://www.ivir.nl/publicaties/download/GoverningPlatforms_IViR_study_June2020-AlgorithmWatch-2020-06-24.pdf},
  year = {2020},
  date = {2020-06-25},
  abstract = {A new study published by AlgorithmWatch, in cooperation with the European Policy Centre and the University of Amsterdam’s Institute for Information Law, shows that the GDPR needn’t stand in the way of meaningful research access to platform data; looks to health and environmental sectors for best practices in privacy-respecting data sharing frameworks.},
  keywords = {Facebook, frontpage, governance, Platforms, research access},
}

European Regulation of Smartphone Ecosystems

Fahy, R. & van Hoboken, J.
European Data Protection Law Review (EDPL), vol. 5, num: 4, pp: 476-491, 2019

Abstract

For the first time, two pieces of EU legislation will specifically target smartphone ecosystems in relation to smartphone and mobile software (eg, iOS and Android) privacy, and use and monetisation of data. And yet, both pieces of legislation approach data use and data monetisation from radically contrasting perspectives. The first is the proposed ePrivacy Regulation, which seeks to provide enhanced protection against user data monitoring and tracking in smartphones, and safeguard privacy in electronic communications. On the other hand, the recently enacted Platform-to-Business Regulation 2019, seeks to bring fairness to platform-business user relations (including app stores and app developers), and is crucially built upon the premise that the ability to access and use data, including personal data, can enable important value creation in the online platform economy. This article discusses how these two Regulations will apply to smartphone ecosystems, especially relating to user and device privacy. The article analyses the potential tension points between the two sets of rules, which result from the underlying policy objectives of safeguarding privacy in electronic communications and the functioning of the digital economy in the emerging era of platform governance. The article concludes with a discussion on how to address these issues, at the intersection of privacy and competition in the digital platform economy.

frontpage, governance, Platforms, Privacy, Regulering, smartphones

Bibtex

@Article{Fahy2019eb,
  title = {European Regulation of Smartphone Ecosystems},
  author = {Fahy, R. and van Hoboken, J.},
  url = {https://edpl.lexxion.eu/article/EDPL/2019/4/6},
  doi = {10.21552/edpl/2019/4/6},
  year = {2019},
  date = {2019-12-13},
  journal = {European Data Protection Law Review (EDPL)},
  volume = {5},
  number = {4},
  pages = {476--491},
  abstract = {For the first time, two pieces of EU legislation will specifically target smartphone ecosystems in relation to smartphone and mobile software (eg, iOS and Android) privacy, and use and monetisation of data. And yet, both pieces of legislation approach data use and data monetisation from radically contrasting perspectives. The first is the proposed ePrivacy Regulation, which seeks to provide enhanced protection against user data monitoring and tracking in smartphones, and safeguard privacy in electronic communications. On the other hand, the recently enacted Platform-to-Business Regulation 2019, seeks to bring fairness to platform-business user relations (including app stores and app developers), and is crucially built upon the premise that the ability to access and use data, including personal data, can enable important value creation in the online platform economy. This article discusses how these two Regulations will apply to smartphone ecosystems, especially relating to user and device privacy. The article analyses the potential tension points between the two sets of rules, which result from the underlying policy objectives of safeguarding privacy in electronic communications and the functioning of the digital economy in the emerging era of platform governance. The article concludes with a discussion on how to address these issues, at the intersection of privacy and competition in the digital platform economy.},
  keywords = {frontpage, governance, Platforms, Privacy, Regulering, smartphones},
}