audiovisual media services / content regulation / Copyright / hate speech

Video-sharing platform services and media services: what is the link between the new AVMSD and the old ECD?


The regulation of digital content seems to be at a turning point. Based on the premise that the Internet of 2016, or should I say 2017, has nothing to do with the Internet of the 2000s, the European Commission (EC) decided to act and released two key proposals in May and September of this year: a proposal for amending the audiovisual media services Directive (AVMSD), and a proposal for a new copyright Directive.

These texts are obviously political compromises and, as such, are officially meant to strike a balance between different types of (competing) interests. That said, this regulatory ‘package’ is perhaps one of the most politically sensitive items on the table of the EU law-makers at the moment.

The purpose of this post is not to undertake a comprehensive assessment of these two pieces of legislation (see my previous posts on the subject, such as the one published here in June). Rather, it focuses upon one hotly debated issue: the fate of the status given to certain categories of intermediary providers by the E-commerce Directive 2000/31/EC (ECD), which is, as some say, “a relic of the 2000s, and therefore clearly outdated.”

The proposal for amending the AVMSD has on occasion been described as an instrument going in the same direction as the proposed copyright Directive. How is this explained?

In the following way: as a great number of platforms that store user-generated content (UGC) can no longer be deemed neutral and passive actors, it is time to impose upon them a certain number of duties, even if they do not exactly obey the same logic as that followed by traditional mass media. And because citizens, including kids, consume more Internet content [note that ‘Internet content’ does not actually mean much, given the variety of content available through the Internet, which is only a network of networks] than they consume traditional broadcasting content, the same rules as those applying to traditional broadcasting media should be exported to regulate Internet media (so to speak).

The logic may sound somewhat circular, but one thing is certain: a handful of Internet ‘platforms’ (“which cover a wide-ranging set of activities including online advertising platforms, marketplaces, search engines, social media and creative content outlets, application distribution platforms, communications services, payment systems, and platforms for the collaborative economy”, as the EC put it in its May Communication) have at their disposal huge quantities of UGC and generate stellar revenues.

Two key questions have thus arisen, which really should not be confused with one another:

  1. Should these platforms take the initiative to regulate UGC? And,
  2. Should the value generated by these platforms be partially redistributed to content creators, or at least repurposed to contribute in part to the production of EU works? [Psst: the strategy followed by the new Copyright Directive might not be using the most appropriate means to achieve this goal!]

As regards online platforms, the new AVMSD does not address question #2, as its Article 13 (which would oblige EU Member States, for example, to ensure that providers of AVM services under their jurisdiction secure at least a 20% share of European works in their catalogue and ensure the prominence of these works) only targets on-demand audiovisual media services.

The new AVMSD only answers question #1, by setting out two principles of UGC regulatory responsibility which at first glance could seem quite far-reaching. However, several commentators seem to be of the view that, because these principles concern content harmful to minors and hate speech respectively, there can only be consensus.

Well, to be brief… it is still important to examine the nature of the means by which video-sharing platform providers should implement these principles.

Under Article 28a of the AVMSD [note that in the version of the proposed legislation that is before the European Parliament for its review, Article 28a has disappeared and has been merged with the general principles applicable to all types of services… wow!], video-sharing platform providers shall take appropriate measures to:

(a) protect minors from content which may impair their physical, mental or moral development;

(b) protect all citizens from content containing incitement to violence or hatred directed against a group of persons or a member of such a group defined by reference to sex, race, colour, religion, descent or national or ethnic origin.

What do these principles really mean in practice? Article 28a paragraph 2 goes into more detail:

What constitutes an appropriate measure for the purposes of paragraph 1 shall be determined in light of the nature of the content in question, the harm it may cause, the characteristics of the category of persons to be protected as well as the rights and legitimate interests at stake, including those of the video-sharing platform providers and the users having created and/or uploaded the content as well as the public interest.

The paragraph goes on to list the expected measures; compliance with these principles appears to rely on a five-stage process:

  1. First, video-sharing platforms should make sure they relay the legal definitions relating to hate speech (i.e. “incitement to violence or hatred”) and to content harmful to minors (“content which may impair the physical, mental or moral development of minors”) in their terms of use – so that, arguably [or, in an ideal world], when users read these terms of use they can better understand what is and is not allowed. To be more precise, however, video-sharing platforms do not have a duty as such to relay only the legal definitions.
  2. Video-sharing platforms should put in place age verification systems to distinguish minors from adults.
  3. Users should be able to flag hate speech and content harmful to minors; they should also be able to rate content.
  4. Video-sharing platforms should take down hate speech and restrict access to content harmful to minors.
  5. Video-sharing platforms should offer parental control solutions for content harmful to minors.
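
To make the reactive character of these measures more concrete, here is a minimal sketch, in Python, of how the flagging, review and action steps (stages 3 to 5 above) might be modelled. Everything in it is hypothetical and purely illustrative (the Flag and review_flag names, the two content categories, the possible outcomes); nothing of the sort is prescribed by the proposal. The point is simply that action is triggered by user reports, not by a sweep of the whole catalogue.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Category(Enum):
    HATE_SPEECH = auto()        # "incitement to violence or hatred"
    HARMFUL_TO_MINORS = auto()  # content which may impair the development of minors


class Action(Enum):
    TAKE_DOWN = auto()          # stage 4: remove hate speech
    RESTRICT_ACCESS = auto()    # stages 4-5: age-gate / parental controls
    NO_ACTION = auto()


@dataclass
class Flag:
    """A single user report (stage 3): who flagged which video, and why."""
    video_id: str
    category: Category
    reporter_id: str


def review_flag(flag: Flag, confirmed: bool) -> Action:
    """Decide what to do with one flagged item after (human) review.

    Only reported items are reviewed: the platform does not scan its whole
    catalogue, which is what keeps this model on the reactive side of the
    Article 14/15 ECD line discussed below.
    """
    if not confirmed:
        return Action.NO_ACTION
    if flag.category is Category.HATE_SPEECH:
        return Action.TAKE_DOWN
    return Action.RESTRICT_ACCESS


if __name__ == "__main__":
    flag = Flag(video_id="v123", category=Category.HARMFUL_TO_MINORS, reporter_id="u42")
    print(review_flag(flag, confirmed=True))  # Action.RESTRICT_ACCESS
```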

Sticking to the letter of the text, and in particular to the measures expressly identified in paragraph 2 of Article 28a, there is an argument that Article 28a is consistent with Articles 14 and 15 of the ECD (regarding the hosting exemption from liability and the exemption from any general obligation to monitor, respectively). This holds as long as the focus is on paragraph 2 and not paragraph 1 of Article 28a; in other words, as long as video-sharing platforms can never have imposed upon them, on the basis of paragraph 1, a general obligation to monitor their systems.

Assuming the foregoing holds true, the ‘only’ things that Article 28a would add (and this is actually quite important) would be:

  • Transparency obligations to the benefit of users (i.e. the establishment and operation of “systems through which providers of video-sharing platforms explain to users of video-sharing platforms what effect has been given to the reporting and flagging referred to in point” – para. 2(f)). [To be honest, this is more than ‘quite’ important, as I have previously argued here that intermediaries should only be allowed to benefit from the exemption from liability when they are transparent about the way they regulate UGC.]
  • Complaint and redress mechanisms to the benefit of users (“Member States shall ensure that complaint and redress mechanisms are available for the settlement of disputes between users and video-sharing platform providers relating to the application of the appropriate measures referred to in paragraphs 1 and 2” – para. 6).

Note that not all video-sharing platforms are targeted by Article 28a. For the sake of clarity I will reproduce the whole definition, which in turn raises at least two questions:

‘video-sharing platform service’ means a service, as defined by Articles 56 and 57 of the Treaty on the Functioning of the European Union, which meets the following requirements:

(i) the service consists of the storage of a large amount of programmes or user-generated videos, for which the video-sharing platform provider does not have editorial responsibility;

(ii) the organisation of the stored content is determined by the provider of the service including by automatic means or algorithms, in particular by hosting, displaying, tagging and sequencing;

(iii) the principal purpose of the service or a dissociable section thereof is devoted to providing programmes and user-generated videos to the general public, in order to inform, entertain or educate;

(iv) the service is made available by electronic communications networks within the meaning of point (a) of Article 2 of Directive 2002/21/EC.

In other words, only those platforms that store large amounts of programmes or UGC are caught by the AVMSD. This carve-out can be seen as an attempt to target only the biggest platforms operating for profit. However, by focusing on large amounts of UGC, it is actually not that obvious that only the biggest for-profit platforms, that is, platforms with “big financial pockets”, will be affected.

Interestingly, while a video-sharing platform service consists of the storage of a large amount of either programmes or UGC (and in this sense platforms offering only UGC also seem to be targeted), the definition does not stop there. The principal purpose of the service, or a dissociable section thereof, must be devoted to “providing programmes and user-generated videos to the general public”. [What does this “and” really mean? Does it mean that in the end only platforms offering both programmes (i.e. audiovisual media services) and UGC will be targeted?] By introducing the notion of “principal purpose”, the EC may have intended to exclude a certain number of platforms that happen to also host videos on their systems… such as social media, I am told. So who is Article 28a for? Essentially YouTube and its competitors?
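
Purely as an illustration of this definitional puzzle, the four criteria can be modelled as a checklist. The sketch below is hypothetical (the function and parameter names are mine, not the directive's); the require_both_programmes_and_ugc toggle simply switches between the two possible readings of the “and” in criterion (iii), and shows that a UGC-only platform is caught under one reading but not the other.

```python
def is_video_sharing_platform_service(
    stores_many_programmes: bool,         # criterion (i): "programmes or user-generated videos"
    stores_many_ugc_videos: bool,
    no_editorial_responsibility: bool,
    provider_organises_content: bool,     # criterion (ii)
    offers_programmes_to_public: bool,    # criterion (iii): the contested "and" sits here
    offers_ugc_videos_to_public: bool,
    over_electronic_comms_network: bool,  # criterion (iv)
    require_both_programmes_and_ugc: bool = False,
) -> bool:
    criterion_i = (stores_many_programmes or stores_many_ugc_videos) and no_editorial_responsibility
    criterion_ii = provider_organises_content
    if require_both_programmes_and_ugc:
        # Strict reading of "providing programmes and user-generated videos"
        criterion_iii = offers_programmes_to_public and offers_ugc_videos_to_public
    else:
        # Looser reading: either kind of content will do
        criterion_iii = offers_programmes_to_public or offers_ugc_videos_to_public
    criterion_iv = over_electronic_comms_network
    return criterion_i and criterion_ii and criterion_iii and criterion_iv


# A hypothetical UGC-only video platform:
ugc_only = dict(
    stores_many_programmes=False,
    stores_many_ugc_videos=True,
    no_editorial_responsibility=True,
    provider_organises_content=True,
    offers_programmes_to_public=False,
    offers_ugc_videos_to_public=True,
    over_electronic_comms_network=True,
)
print(is_video_sharing_platform_service(**ugc_only))                                        # True
print(is_video_sharing_platform_service(**ugc_only, require_both_programmes_and_ugc=True))  # False
```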

Leaving definitional issues aside for a moment, the next question that I hear you whispering is: “why can’t we ask the video-sharing platforms to ‘simply’ get rid of hate speech and harmful content?!” After all, and this is a view that could seem convincing at first glance, these large platforms are not neutral players and they know how to monetise content on the basis of sophisticated algorithms. They should therefore know how to get rid of unlawful content such as hate speech or content harmful to minors.

Unfortunately, the matter is slightly more complicated from a technical viewpoint. Requiring online platforms to get rid of hate speech and to put harmful content behind fences essentially asks them to monitor all the content stored on their systems in a systematic manner, which would amount to imposing a general monitoring obligation upon video-sharing platforms. Yet, as we have seen, imposing a general monitoring obligation of this type contravenes Article 15 of the ECD.

But why not create an exception to Article 15 of the ECD then?

Well, because imposing a general monitoring obligation of this type is also very likely to interfere with fundamental rights, in particular Article 11 (freedom of expression and information) and Articles 7 and 8 (respect for private life and protection of personal data) of the EU Charter of Fundamental Rights. In what ways? For a number of reasons:

  • Identifying hate speech or harmful content requires assessing context. Yet there is no way these online platforms can manually check all the content on their systems. They would have to resort to content recognition technologies. And content recognition technologies are not able to assess context (see the short sketch after this list)! [This is true even in the field of copyright, by the way!]
  • What is more, being in a position to assess context is not always enough. One also needs legal expertise to be able to identify hate speech. And actually, even some of the most eminent judges have on occasion been slightly confused when attempting to make such expert determinations. Look at the Delfi case (commented upon here), for example. Was it really a case about hate speech? Or was it about defamation? And is there really consensus as to what amounts to hate speech, even if one insists that the requirement of incitement to violence is essential?
  • Furthermore, the systematic monitoring of online content posted by users would also violate Article 8 of the EU Charter (protection of personal data).
  • It is true that a prima facie violation of a fundamental right is never the end of the story. Prima facie violations can be justified. However, in order to justify such violations, a proportionality test would need to be applied. Yet it is quite intriguing that when it comes to child pornography, which on the scale of illegal content is clearly more problematic than hate speech, Article 25 of Directive 2011/92/EU on combating the sexual abuse and sexual exploitation of children and child pornography does not go as far as imposing the systematic and preventive filtering of online platforms. It allows for the possibility of blocking access to web pages containing or disseminating child pornography as long as certain safeguards are in place (“These measures must be set by transparent procedures and provide adequate safeguards, in particular to ensure that the restriction is limited to what is necessary and proportionate, and that users are informed of the reason for the restriction. Those safeguards shall also include the possibility of judicial redress”). Recital 47 of that Directive also speaks of “Mechanisms [that] may also be put in place to block access from the Union’s territory to Internet pages identified as containing or disseminating child pornography.” In other words, content blocking orders can only follow procedures involving a contextual assessment of content.
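
As promised above, here is a deliberately over-simplified sketch of why matching technology alone cannot assess context. A plain SHA-256 digest stands in for a real content-recognition fingerprint, and the clip and captions are invented; the point is only that the matcher sees bytes, not framing, so the same footage is flagged whether it is uploaded as part of a report condemning an incident or with a caption praising it.

```python
import hashlib


def fingerprint(video_bytes: bytes) -> str:
    """Stand-in for a content-recognition fingerprint: here, just a SHA-256 digest."""
    return hashlib.sha256(video_bytes).hexdigest()


# The same footage, uploaded in two very different contexts (invented examples).
clip = b"...raw bytes of a clip showing a violent incident..."

upload_as_news = {"video": clip, "caption": "Report condemning the incident"}
upload_as_incitement = {"video": clip, "caption": "Caption praising the incident and calling for more"}

# The clip was flagged once, in the second context, and its fingerprint blocklisted.
blocklist = {fingerprint(clip)}

for upload in (upload_as_news, upload_as_incitement):
    matched = fingerprint(upload["video"]) in blocklist
    # The matcher only sees the bytes: both uploads match, even though only one
    # of them could plausibly amount to incitement to violence or hatred.
    print(upload["caption"], "->", "MATCH" if matched else "no match")
```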

So what should we do with video-sharing platforms?

There are at least three routes worth investigating.

Article 13 of the new AVMSD, which is not about video-sharing platforms, has a very interesting paragraph:

Member States shall waive the requirements laid down in paragraphs 1 and 2 for providers with a low turnover or low audience or if they are small and micro enterprises. Member States may also waive such requirements in cases where they would be impracticable or unjustified by reason of the nature or theme of the on-demand audiovisual media services.

Would it be possible to adapt this paragraph for video-sharing platforms and ‘invite’ them to contribute financially to:

  1. The creation of ad hoc, fast and expedient judicial institutions competent to assess the lawfulness of content and issue take-down orders;
  2. The development of programmes to enhance users’ digital literacy, in particular that of kids;
  3. The creation of high-quality content and services for kids?

Dear Members of the European Parliament, please keep cool when debating hot topics!

Sophie Stalla-Bourdillon
