The proposal to extend the Audiovisual Media Services Directive (AVMSD) continues along its legislative path. We are now entering the trilogue negotiations phase, and, after having read the unrelated [at least at first glance] G7 Taormina Statement on the fight against terrorism and violent extremism, I am re-reading the text of the Proposal for a Directive of the European Parliament and of the Council amending Directive 2010/13/EU on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services in view of changing market realities (First reading) – General approach, 2016/0151 (COD), dated 5 May 2017.
Just in case EU law-makers are starting to feel defeatist [as well], I thought it might be useful to recall a couple of well-known principles, which could help clarify the debate [who knows?].
- Requirements for better regulation
The Better Regulation strategy of the European Commission (EC), as expressed in its Communication of 14 September 2016, rests on three pillars: transparency, an evidence-based approach, and citizen and stakeholder involvement. The EC suggests that the upshot of such a strategy can only be “well-targeted, evidence-based and simply written” legislation, which is meant to be easily implemented and thereby to “achieve its goals on the ground.” Besides, the EC insists that better regulation is a “job for all institutions,” including the European Parliament and the Council.
Does the proposal to extend the AVMSD meet this standard? Well, it is legitimate to have some doubts, as explained below [it is true to say that a few months ago I was more optimistic].
- Requirements for restricting freedom of expression
As a matter of principle, legislation that aims to restrict fundamental freedoms such as freedom of expression, which is protected by both the European Convention on Human Rights (Article 10) and the European Charter of Fundamental Rights (Article 11), should be subject to strict scrutiny. Such legislation must be assessed on the basis of a three-pronged test: restrictions on freedom of expression must be “prescribed by law,” pursue one or more legitimate aims in the light of paragraph 2 of Article 10, and be “necessary in a democratic society,” as per European Court of Human Rights (ECtHR) case law.
The proposal to extend the AVMSD, and its Article 28a in particular, prima facie restricts freedom of expression. This is because it aims to restrict, even if not completely prohibit, the transmission of content alleged to be harmful to minors, otherwise described as “content which may impair the physical, mental or moral development of minors,” which is not necessarily illegal content. In addition, it also aims to restrict, and in this case prohibit as well, content that is (only) alleged to “incite to violence or hatred” or “to publicly provoke the commitment of terrorist offences”. Content that is merely alleged to be unlawful is not necessarily illegal, as mistakes can be made by notifiers or users, as well as by video-sharing platforms themselves (assuming they do not have to react automatically upon notifications). There is no requirement in the proposal that video-sharing platforms should wait for a judicial decision to be delivered before acting, contrary to the Opinion of the Civil Liberties, Justice and Home Affairs Committee of the European Parliament (amendment 16 to Recital 30a).
In addition, the more I re-read the proposal to extend the AVMSD, the more ambiguous it appears. Clarity is essential for meeting the “prescribed by law” prong.
The fundamental ambiguity lies in the fact that Member States are required under Article 28a to ensure that video-sharing platforms take appropriate measures to protect users, including children, from certain types of harmful and/or illegal content.
A list of measures is provided in paragraph 2, which is not meant to be exhaustive, but which contains a subparagraph 2(a) “including and applying, in the terms and conditions of the video-sharing platform […] services, […] the requirements not to incite to violence or hatred as referred to in point (b) of paragraph 1 and not to publicly provoke the commitment of terrorist offences as referred to in point (ba) of paragraph 1, in accordance with Article 6, as well as the […] concept of content which may impair the physical, mental or moral development of minors, in accordance with Article[…] 12(1) […];”
While the requirements relating to incitement to violence or hatred, public provocation to commit terrorist offences and content harmful to minors should be included within the terms and conditions of video-sharing platforms and applied, the proposal is silent on how the “application”, or better the “enforcement”, of these terms and conditions shall be undertaken.
The way to comply with the E-commerce Directive and its Article 15, as interpreted by the Court of Justice of the European Union (CJEU) in Sabam/Netlog, would be to say that online platforms need to put in place notice-and-take-down procedures so that they are able to identify illegal content (and not simply allegedly illegal content) and then react upon notifications. In other words, systematic ex ante filtering systems should not be an implementation option as per CJEU case law, in line with the Parliament’s negotiation mandate.
If what underlies the proposal for an extended AVMSD is an obligation to put in place a notice-and-take-down procedure, is there really a need to include it in the proposal at all, as it is already part of the EU acquis, at least for illegal content?
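To make the distinction concrete, here is a minimal, purely illustrative sketch in Python contrasting the reactive notice-and-take-down model just described with the ex ante filtering model discussed below. Every function name and the trivial data model are invented for this example; neither the AVMSD proposal nor the E-commerce Directive prescribes any particular implementation.

```python
# Purely illustrative sketch; all names are invented and nothing here is
# prescribed by the AVMSD proposal or the E-commerce Directive.

def human_review_confirms(notification: dict) -> bool:
    """Stub standing in for a human assessment of a specific notification."""
    return notification.get("substantiated", False)

def automated_classifier_flags(content: str) -> bool:
    """Stub standing in for a context-blind automated classifier."""
    return "forbidden-keyword" in content

def notice_and_take_down(library: dict, notification: dict) -> None:
    # Reactive model: the platform acts only once a specific item has been
    # notified, which is compatible with Article 15 E-commerce Directive.
    if human_review_confirms(notification):
        library.pop(notification["video_id"], None)

def publish_with_ex_ante_filter(library: dict, video_id: str, content: str) -> None:
    # Proactive model: every upload is screened before publication -- the
    # kind of general, systematic filtering the CJEU ruled out in Sabam/Netlog.
    if automated_classifier_flags(content):
        return  # blocked before anyone can see it
    library[video_id] = content
```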
Furthermore, Article 28a(2) expressly states: “The measures shall be practicable and proportionate, taking into account the size of the video-sharing platform service and the nature of the service that is provided.” This seems to suggest that, given the sheer volume of user-generated content uploaded on video-sharing platforms, automated content restriction systems can, if not de facto must, be put in place.
This raises the question of whether what the European Commission has in mind is in fact the imposition of an ex ante filtering obligation. If one examines other proposals in the field, such as the proposal for a new Copyright Directive in the Digital Single Market, such an interpretation seems more than realistic. A clear reference to Articles 14 and 15 of the E-commerce Directive would therefore have been welcome in Article 28a, as references to the E-commerce Directive are also found in other proposals that introduce such restrictions.
Article 28a is all the more ambiguous given that it also caters for the possibility for Member States to impose on video-sharing platforms “measures that are more detailed or stricter than the measures referred to in […] paragraph 2.” It is actually hard to see what a measure stricter than a notice-and-take-down procedure could be, if not an ex ante filtering obligation. The only other option would probably be the suspension of user accounts; however, such a suspension does not really help with the sheer volume of user-generated content present on video-sharing platforms.
Article 28a therefore lacks a minimum level of clarity.
This lack of clarity is highly problematic because Article 28a would then restrict freedom of expression in another way: by implicitly referring to automated means for private enforcement purposes, which necessarily entail a significant level of false positives, since the technology is not able to assess context, and which therefore necessarily lead to a chilling effect. The CJEU has already stated twice that freedom of expression is undermined when a filtering system does not distinguish adequately between unlawful and lawful content, with the result that its introduction could lead to the restriction of lawful communications.
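To see why false positives matter so much here, consider a back-of-the-envelope base-rate calculation with entirely invented numbers (the real error rates of any given filter, and the real prevalence of unlawful uploads, are unknown): even a filter that is right 95% of the time will wrongly block far more lawful uploads than the unlawful ones it correctly catches, simply because unlawful content is rare.

```python
# Illustrative base-rate arithmetic; all figures are assumptions made up
# for this example, not measurements of any actual filtering system.
base_rate = 0.001           # assume 1 in 1,000 uploads is actually unlawful
sensitivity = 0.95          # assume the filter catches 95% of unlawful uploads
false_positive_rate = 0.05  # assume it wrongly flags 5% of lawful uploads

uploads = 1_000_000
true_positives = uploads * base_rate * sensitivity                  # 950
false_positives = uploads * (1 - base_rate) * false_positive_rate   # 49,950

share_wrongly_blocked = false_positives / (true_positives + false_positives)
print(f"{share_wrongly_blocked:.0%} of blocked uploads are lawful")  # ~98%
```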
Finally, for a restriction to be necessary in a democratic society, “the authorities are required, (…), to choose the means that cause the least possible prejudice to the rights in question [i.e. freedom of expression],” as per ECtHR case law.
The Council of Europe Committee of Ministers has specifically dealt with Internet filters in its Recommendation of 2008 and stated, among other things, that Member States should:
- refrain from filtering Internet content in electronic communications networks operated by public actors for reasons other than those laid down in Article 10, paragraph 2, of the European Convention on Human Rights, as interpreted by the European Court of Human Rights;
- guarantee that nationwide general blocking or filtering measures are only introduced by the state if the conditions of Article 10, paragraph 2, of the European Convention on Human Rights are fulfilled. Such action by the state should only be taken if the filtering concerns specific and clearly identifiable content, a competent national authority has taken a decision on its illegality and the decision can be reviewed by an independent and impartial tribunal or regulatory body, in accordance with the requirements of Article 6 of the European Convention on Human Rights;
Notably, the interests of minors are expressly taken into account in the Recommendation through e.g. the facilitation and promotion of “initiatives that assist parents and educators in the selection and use of developmental-age appropriate filters for children and young people,” the development of “common labels for filters to assist parents and educators in making informed choices when acquiring filters and to certify that they meet certain quality requirements,” as well as the possible implementation of filters in certain places, such as schools or libraries.
Article 28a simply requires that “Member States shall ensure that complaint and redress mechanisms are available for the settlement of disputes between users and video-sharing platform providers relating to the application of paragraphs 1 and 2.”
It is therefore questionable whether the restrictions set forth in Article 28a are necessary in a democratic society, as the provision seems to suggest that private regulation (regulation at the initiative of video-sharing platforms) is sufficient to meet its requirements. Under Article 28a, Member States are only encouraged to use co-regulation.
A slightly modified version of this post is available here, with some recommendations at the end.
Sophie Stalla-Bourdillon