Opinie: Internetproletariërs aller landen verenigt u!

Mediaforum, iss. 3, p. 85, 2023

Facebook, Internet, moderators

Bibtex

@article{dommering2023opinie,
  title    = {Opinie: Internetproletariërs aller landen verenigt u!},
  author   = {Dommering, E.},
  url      = {https://www.ivir.nl/nl/publications/opinie-internetproletariers-aller-landen-verenigt-u/opinie_dommering_mediaforum-2023-3/},
  year     = {2023},
  date     = {2023-06-30},
  journal  = {Mediaforum},
  number   = {3},
  pages    = {85},
  keywords = {Facebook, Internet, moderators},
}

Personal Data Stores and the GDPR’s lawful grounds for processing personal data

Janssen, H., Cobbe, J., Norval, C. & Singh, J.
2019

Abstract

Personal Data Stores (‘PDSs’) entail users having a (physical or virtual) device within which they themselves can, in theory, capture, aggregate, and control the access to and the transfer of personal data. Their aim is to empower users in relation to their personal data, strengthening their opportunities for data protection, privacy, and/or to facilitate trade and monetisation. As PDS technologies develop, it is important to consider their role in relation to issues of data protection. The General Data Protection Regulation requires that the processing of user data be predicated on one of its defined lawful bases, whereby the Regulation does not favour any one basis over another. We explore how PDS architectures relate to these lawful bases, and observe that they tend to favour the bases that require direct user involvement. This paper considers issues that the envisaged architectural choices surrounding the lawful grounds may entail.

Data protection, decentralisation, lawful grounds for processing, personal data stores, Privacy, Transparency

Bibtex

@inproceedings{janssen2019pds,
  title    = {Personal Data Stores and the GDPR’s lawful grounds for processing personal data},
  author   = {Janssen, H. and Cobbe, J. and Norval, C. and Singh, J.},
  doi      = {10.5281/zenodo.3234902},
  year     = {2019},
  date     = {2019-05-29},
  abstract = {Personal Data Stores (‘PDSs’) entail users having a (physical or virtual) device within which they themselves can, in theory, capture, aggregate, and control the access to and the transfer of personal data. Their aim is to empower users in relation to their personal data, strengthening their opportunities for data protection, privacy, and/or to facilitate trade and monetisation. As PDS technologies develop, it is important to consider their role in relation to issues of data protection. The General Data Protection Regulation requires that the processing of user data be predicated on one of its defined lawful bases, whereby the Regulation does not favour any one basis over another. We explore how PDS architectures relate to these lawful bases, and observe that they tend to favour the bases that require direct user involvement. This paper considers issues that the envisaged architectural choices surrounding the lawful grounds may entail.},
  keywords = {Data protection, decentralisation, lawful grounds for processing, personal data stores, Privacy, Transparency},
}

Data protection and tech startups: The need for attention, support, and scrutiny

Norval, C., Janssen, H., Cobbe, J. & Singh, J.
Policy & Internet, vol. 13, iss. 2, pp. 278-299, 2021

Abstract

Though discussions of data protection have focused on the larger, more established organisations, startups also warrant attention. This is particularly so for tech startups, who are often innovating at the ‘cutting-edge’—pushing the boundaries of technologies that typically lack established data protection best-practices. Initial decisions taken by startups could well have long-term impacts, and their actions may inform (for better or for worse) how particular technologies and the applications they support are implemented, deployed, and perceived for years to come. Ensuring that the innovations and practices of tech startups are sound, appropriate and acceptable should therefore be a high priority. This paper explores the attitudes and preparedness of tech startups to issues of data protection. We interviewed a series of UK-based emerging tech startups as the EU's General Data Protection Regulation (GDPR) came into effect, which revealed areas in which there is a disconnect between the approaches of the startups and the nature and requirements of the GDPR. We discuss the misconceptions and associated risks facing innovative tech startups and offer a number of considerations for the firms and supervisory authorities alike. In light of our discussions, and given what is at stake, we argue that more needs to be done to help ensure that emerging technologies and the practices of the companies that operate them better align with the regulatory obligations. We conclude that tech startups warrant increased attention, support, and scrutiny to raise the standard of data protection for the benefit of us all.

Bibtex

@article{norval2021startups,
  title    = {Data protection and tech startups: The need for attention, support, and scrutiny},
  author   = {Norval, C. and Janssen, H. and Cobbe, J. and Singh, J.},
  doi      = {10.1002/poi3.255},
  year     = {2021},
  date     = {2021-05-07},
  journal  = {Policy \& Internet},
  volume   = {13},
  number   = {2},
  pages    = {278-299},
  abstract = {Though discussions of data protection have focused on the larger, more established organisations, startups also warrant attention. This is particularly so for tech startups, who are often innovating at the ‘cutting-edge’—pushing the boundaries of technologies that typically lack established data protection best-practices. Initial decisions taken by startups could well have long-term impacts, and their actions may inform (for better or for worse) how particular technologies and the applications they support are implemented, deployed, and perceived for years to come. Ensuring that the innovations and practices of tech startups are sound, appropriate and acceptable should therefore be a high priority. This paper explores the attitudes and preparedness of tech startups to issues of data protection. We interviewed a series of UK-based emerging tech startups as the EU's General Data Protection Regulation (GDPR) came into effect, which revealed areas in which there is a disconnect between the approaches of the startups and the nature and requirements of the GDPR. We discuss the misconceptions and associated risks facing innovative tech startups and offer a number of considerations for the firms and supervisory authorities alike. In light of our discussions, and given what is at stake, we argue that more needs to be done to help ensure that emerging technologies and the practices of the companies that operate them better align with the regulatory obligations. We conclude that tech startups warrant increased attention, support, and scrutiny to raise the standard of data protection for the benefit of us all.},
}

De toekomst van de digitale rechtsstaat. Pleidooi voor het gebruik van een mensenrechten impact assessment voor de publieke sector

(L)aw Matters: Blogs and Essays in Honour of prof. dr. Aalt Willem Heringa, Boekenmaker, 2022, pp. 198-204

Bibtex

@incollection{janssen2022toekomst,
  title     = {De toekomst van de digitale rechtsstaat. Pleidooi voor het gebruik van een mensenrechten impact assessment voor de publieke sector},
  author    = {Janssen, H.},
  booktitle = {(L)aw Matters: Blogs and Essays in Honour of prof. dr. Aalt Willem Heringa},
  publisher = {Boekenmaker},
  url       = {https://www.globalacademicpress.com/ebooks/sascha_hardt/},
  year      = {2022},
  date      = {2022-03-25},
  volume    = {1},
  pages     = {198-204},
}

Practical fundamental rights impact assessments

Janssen, H., Seng Ah Lee, M. & Singh, J.
International Journal of Law and Information Technology, vol. 30, iss. 2, pp. 200-232, 2022

Abstract

The European Union’s General Data Protection Regulation tasks organizations with performing a Data Protection Impact Assessment (DPIA) to consider the fundamental rights risks of their artificial intelligence (AI) system. However, assessing risks can be challenging, as fundamental rights are often considered abstract in nature. So far, guidance regarding DPIAs has largely focussed on data protection, leaving broader fundamental rights aspects less elaborated. This is problematic because potential negative societal consequences of AI systems may remain unaddressed and damage public trust in organizations using AI. Towards this, we introduce a practical, four-phased framework, assisting organizations with performing fundamental rights impact assessments. This involves organizations (i) defining the system’s purposes and tasks, and the responsibilities of parties involved in the AI system; (ii) assessing the risks regarding the system’s development; (iii) justifying why the risks of potential infringements on rights are proportionate; and (iv) adopting organizational and/or technical measures mitigating the risks identified. We further indicate how regulators might support these processes with practical guidance.

Bibtex

@article{janssen2022fria,
  title    = {Practical fundamental rights impact assessments},
  author   = {Janssen, H. and Seng Ah Lee, M. and Singh, J.},
  doi      = {10.1093/ijlit/eaac018},
  year     = {2022},
  date     = {2022-11-21},
  journal  = {International Journal of Law and Information Technology},
  volume   = {30},
  number   = {2},
  pages    = {200-232},
  abstract = {The European Union’s General Data Protection Regulation tasks organizations with performing a Data Protection Impact Assessment (DPIA) to consider the fundamental rights risks of their artificial intelligence (AI) system. However, assessing risks can be challenging, as fundamental rights are often considered abstract in nature. So far, guidance regarding DPIAs has largely focussed on data protection, leaving broader fundamental rights aspects less elaborated. This is problematic because potential negative societal consequences of AI systems may remain unaddressed and damage public trust in organizations using AI. Towards this, we introduce a practical, four-phased framework, assisting organizations with performing fundamental rights impact assessments. This involves organizations (i) defining the system’s purposes and tasks, and the responsibilities of parties involved in the AI system; (ii) assessing the risks regarding the system’s development; (iii) justifying why the risks of potential infringements on rights are proportionate; and (iv) adopting organizational and/or technical measures mitigating the risks identified. We further indicate how regulators might support these processes with practical guidance.},
}

Intermediating data rights exercises: the role of legal mandates

Giannopoulou, A., Ausloos, J., Delacroix, S. & Janssen, H.
International Data Privacy Law, vol. 12, iss. 4, pp. 316-331, 2022

Abstract

Data subject rights constitute critical tools for empowerment in the digitized society. There is a growing trend of relying on third parties to facilitate or coordinate the collective exercises of data rights, on behalf of one or more data subjects. This contribution refers to these parties as ‘Data Rights Intermediaries’ (DRIs), ie where an ‘intermediating’ party facilitates or enables the collective exercise of data rights. The exercise of data rights by these DRIs on behalf of the data subjects can only be effectuated with the help of mandates. Data rights mandates are not expressly framed in the GDPR, and their delineation can be ambiguous. It is important to highlight that data rights are mandatable, and this without affecting their inalienability in light of their fundamental rights’ nature. This article argues that contract law and fiduciary duties both have longstanding traditions and robust norms in many jurisdictions, all of which can be explored towards shaping the appropriate environment to regulate data rights mandates in particular. The article concludes that the key to unlocking the full potential of data rights mandates can already be found in existing civil law constructs, whose diversity reveals the need for solidifying the responsibility and accountability of mandated DRIs. The continued adherence to fundamental contract law principles will have to be complemented by a robust framework of institutional safeguards. The need for such safeguards stems from the vulnerable position of data subjects, both vis-à-vis DRIs as well as data controllers.

Bibtex

@article{giannopoulou2022mandates,
  title    = {Intermediating data rights exercises: the role of legal mandates},
  author   = {Giannopoulou, A. and Ausloos, J. and Delacroix, S. and Janssen, H.},
  doi      = {10.1093/idpl/ipac017},
  year     = {2022},
  date     = {2022-11-15},
  journal  = {International Data Privacy Law},
  volume   = {12},
  number   = {4},
  pages    = {316-331},
  abstract = {Data subject rights constitute critical tools for empowerment in the digitized society. There is a growing trend of relying on third parties to facilitate or coordinate the collective exercises of data rights, on behalf of one or more data subjects. This contribution refers to these parties as ‘Data Rights Intermediaries’ (DRIs), ie where an ‘intermediating’ party facilitates or enables the collective exercise of data rights. The exercise of data rights by these DRIs on behalf of the data subjects can only be effectuated with the help of mandates. Data rights mandates are not expressly framed in the GDPR, and their delineation can be ambiguous. It is important to highlight that data rights are mandatable, and this without affecting their inalienability in light of their fundamental rights’ nature. This article argues that contract law and fiduciary duties both have longstanding traditions and robust norms in many jurisdictions, all of which can be explored towards shaping the appropriate environment to regulate data rights mandates in particular. The article concludes that the key to unlocking the full potential of data rights mandates can already be found in existing civil law constructs, whose diversity reveals the need for solidifying the responsibility and accountability of mandated DRIs. The continued adherence to fundamental contract law principles will have to be complemented by a robust framework of institutional safeguards. The need for such safeguards stems from the vulnerable position of data subjects, both vis-à-vis DRIs as well as data controllers.},
}

Beyond financial regulation of crypto-asset wallet software: In search of secondary liability

Computer Law & Security Review, vol. 49, no. 105829, 2023

Abstract

Since Bitcoin, the blockchain space has evolved considerably. One crucial piece of software for interacting with blockchains and holding the private-public key pairs to distinct crypto-assets and securities is the wallet. Wallet software can be offered by liable third parties (‘custodians’) who hold certain rights over assets and transactions. As parties subject to financial regulation, they must uphold Anti-Money Laundering and Combating the Financing of Terrorism (AML/CFT) standards by undertaking Know-Your-Customer (KYC) checks on users of their services. By contrast, wallet software can also be issued without the involvement of a liable third party. As no KYC is performed and users have full ‘freedom to act’, such ‘non-custodial’ wallet software is popular in criminal undertakings. It is, however, required to interact with peer-to-peer applications and organisations running on blockchains, whose benefits are not the subject of this paper. To date, financial regulation fails to adequately address such wallet software because it presumes the existence of a registered, liable entity offering said software. As illustrated in the case of Tornado Cash, financial regulation fails to trace chains of secondary liability. Alas, the considered solution is a systematic surveillance of all transactions. Against this backdrop, this paper sets forth an alternative approach rooted in copyright law. Concepts that pertain to secondary liability prove of value to develop a flexible, principles-based approach to the regulation of non-custodial wallet software that accounts for both infringing and non-infringing uses.

blockchain, Crypto-assets, decentralised finance, non-custodial wallet, Regulation, secondary liability

Bibtex

@article{barbereau2023wallets,
  title    = {Beyond financial regulation of crypto-asset wallet software: In search of secondary liability},
  author   = {Barbereau, T. and Bodó, B.},
  url      = {https://www.sciencedirect.com/science/article/pii/S0267364923000390},
  doi      = {10.1016/j.clsr.2023.105829},
  year     = {2023},
  date     = {2023-06-22},
  journal  = {Computer Law \& Security Review},
  volume   = {49},
  number   = {105829},
  abstract = {Since Bitcoin, the blockchain space has evolved considerably. One crucial piece of software for interacting with blockchains and holding the private-public key pairs to distinct crypto-assets and securities is the wallet. Wallet software can be offered by liable third parties (‘custodians’) who hold certain rights over assets and transactions. As parties subject to financial regulation, they must uphold Anti-Money Laundering and Combating the Financing of Terrorism (AML/CFT) standards by undertaking Know-Your-Customer (KYC) checks on users of their services. By contrast, wallet software can also be issued without the involvement of a liable third party. As no KYC is performed and users have full ‘freedom to act’, such ‘non-custodial’ wallet software is popular in criminal undertakings. It is, however, required to interact with peer-to-peer applications and organisations running on blockchains, whose benefits are not the subject of this paper. To date, financial regulation fails to adequately address such wallet software because it presumes the existence of a registered, liable entity offering said software. As illustrated in the case of Tornado Cash, financial regulation fails to trace chains of secondary liability. Alas, the considered solution is a systematic surveillance of all transactions. Against this backdrop, this paper sets forth an alternative approach rooted in copyright law. Concepts that pertain to secondary liability prove of value to develop a flexible, principles-based approach to the regulation of non-custodial wallet software that accounts for both infringing and non-infringing uses.},
  keywords = {blockchain, Crypto-assets, decentralised finance, non-custodial wallet, Regulation, secondary liability},
}

Leg het me nog één keer uit: het recht op een uitleg na Uber en Ola. Annotatie bij Hof Amsterdam, 4 april 2023

Privacy & Informatie, iss. 3, pp. 114-116, 2023

Bibtex

@article{metikos2023casenote,
  title   = {Leg het me nog één keer uit: het recht op een uitleg na Uber en Ola. Annotatie bij Hof Amsterdam, 4 april 2023},
  author  = {Metikoš, L.},
  url     = {https://www.ivir.nl/nl/pi_2023/},
  year    = {2023},
  date    = {2023-06-15},
  journal = {Privacy \& Informatie},
  number  = {3},
  pages   = {114-116},
}

Dealing with opinion power and media concentration in the platform era

LSE Blog, 2023

media concentration, Media law, Platforms

Bibtex

@online{seipp2023opinion,
  title        = {Dealing with opinion power and media concentration in the platform era},
  author       = {Seipp, T.},
  url          = {https://blogs.lse.ac.uk/medialse/2023/05/15/dealing-with-opinion-power-and-media-concentration-in-the-platform-era/},
  year         = {2023},
  date         = {2023-05-15},
  organization = {LSE Blog},
  keywords     = {media concentration, Media law, Platforms},
}

Freedom of Expression, the Media and Journalists: Case-law of the European Court of Human Rights

McGonagle, T. & Voorhoof, D.
European Audiovisual Observatory, Strasbourg, 2023, 8th ed., ISBN: 9789287184351

Abstract

This e-book provides valuable insights into the European Court of Human Rights’ extensive case-law on freedom of expression and media and journalistic freedoms. The first seven editions of the e-book (2013, 2015, 2016, 2017, 2020, 2021 and 2022) have proved hugely successful. The new eighth edition summarises over 378 judgments or decisions by the Court and provides hyperlinks to the full text of each of the summarised judgments or decisions (via HUDOC, the Court's online case-law database).

Freedom of expression, Journalism, Media law

Bibtex

@book{mcgonagle2023freedom,
  title     = {Freedom of Expression, the Media and Journalists: Case-law of the European Court of Human Rights},
  author    = {McGonagle, T. and Voorhoof, D.},
  publisher = {European Audiovisual Observatory},
  address   = {Strasbourg},
  edition   = {8},
  isbn      = {9789287184351},
  url       = {https://rm.coe.int/iris-themes-vol-iii-8th-edition-april-2023-/1680ab1d11},
  year      = {2023},
  date      = {2023-04-24},
  abstract  = {This e-book provides valuable insights into the European Court of Human Rights’ extensive case-law on freedom of expression and media and journalistic freedoms. The first seven editions of the e-book (2013, 2015, 2016, 2017, 2020, 2021 and 2022) have proved hugely successful. The new eighth edition summarises over 378 judgments or decisions by the Court and provides hyperlinks to the full text of each of the summarised judgments or decisions (via HUDOC, the Court's online case-law database).},
  keywords  = {Freedom of expression, Journalism, Media law},
}