In this new study, Prof. Martin Senftleben and Dr. Christina Angelopoulos provide guidelines for content moderation obligations under the Digital Services Act:
The Odyssey of the Prohibition on General Monitoring Obligations on the Way to the Digital Services Act: Between Article 15 of the E-Commerce Directive and Article 17 of the Directive on Copyright in the Digital Single Market
M.R.F. Senftleben & C. Angelopoulos, Amsterdam: Institute for Information Law/Cambridge: Centre for Intellectual Property and Information Law, October 22, 2020.
In the context of the current debate on the further harmonisation of the responsibilities of online platforms and information service providers in the Digital Services Act, the study seeks to clarify the correct interpretation of the scope of the prohibition of general monitoring obligations in the E-Commerce Directive and the Directive on Copyright in the Digital Single Market. On that basis, it identifies guidelines for the potential inclusion and further development of the general monitoring ban in the Digital Services Act.
The study was prepared by Prof. Dr. Martin Senftleben (Institute for Information Law (IViR), University of Amsterdam) and Dr. Christina Angelopoulos (Centre for Intellectual Property and Information Law (CIPIL), University of Cambridge).
Funding for this project was provided by Copyright for Creativity (C4C). The authors carried out the study in complete academic independence.
Abstract:
EU law provides explicitly that intermediaries may not be obliged to monitor their service in a general manner in order to detect and prevent the illegal activity of their users. However, a misunderstanding of the difference between monitoring specific content and monitoring FOR specific content is a recurrent theme in the debate on intermediary liability and a central driver of the controversy surrounding it. Rightly understood, a prohibited general monitoring obligation arises whenever content – no matter how specifically it is defined – must be identified among the totality of the content on a platform. The moment platform content must be screened in its entirety, the monitoring obligation acquires an excessive, general nature. Against this background, a content moderation duty can only be deemed permissible if it is specific in respect of both the protected subject matter and potential infringers.
This requirement of ‘double specificity’ is of particular importance because it prevents encroachments upon fundamental rights. The jurisprudence of the Court of Justice of the European Union has shed light on the anchorage of the general monitoring ban in primary EU law, in particular the right to the protection of personal data, the freedom of expression and information, the freedom to conduct a business, and the free movement of goods and services in the internal market. Due to their higher rank in the norm hierarchy, these legal guarantees constitute common ground for the application of the general monitoring prohibition in secondary EU legislation, namely Article 15(1) of the E-Commerce Directive (‘ECD’) and Article 17(8) of the Directive on Copyright in the Digital Single Market (‘CDSMD’).
With regard to the Digital Services Act (‘DSA’), this analysis implies that any further manifestation of the general monitoring ban in the DSA would have to be construed and applied – in the light of applicable CJEU case law – as a safeguard against encroachments upon the aforementioned fundamental rights and freedoms. Even if the final text of the DSA does not reiterate the prohibition of general monitoring obligations known from Article 15(1) ECD and Article 17(8) CDSMD, the regulation of internet service provider liability, duties of care and injunctions would still have to avoid inroads into those fundamental rights and freedoms and observe the principle of proportionality. The double specificity requirement plays a central role in this respect.