
What better timing than less than two weeks after the release of the (poorly drafted) proposed Digital Omnibus to issue a (ground-shaking) judgment on the interplay between the old e‑Commerce Directive (ECD) and the General Data Protection Regulation (GDPR)?
Here’s a freshly delivered ruling from the Court of Justice of the European Union (CJEU) that is sure to get people talking, writing and perhaps reading. [For those who would rather not read, there is a video by the President of the CJEU himself here: https://www.youtube.com/watch?v=c46t0k5Erko]
In a nutshell, the case is about Russmedia Digital (Russmedia), the owner of the website https://www.publi24.ro/, “an online marketplace on which advertisements for, inter alia, the sale of goods or the provision of services in Romania can be published either free of charge or for a fee.” [A platform similar to Gumtree or Leboncoin].
An unidentified third party had published on that website an advertisement presenting the applicant as offering sexual services. The ad contained photographs of the applicant, used without her consent, as well as her telephone number.
Replicas of the ad were then placed on other advertising websites, where it appeared with a note referring to its original source. When the applicant reached out to Russmedia, the company took the ad down from its site in under an hour. However, the ad continued to appear on other websites that had reposted it.
The applicant filed a lawsuit against Russmedia on various grounds: violation of image right, rights to honour, reputation, privacy, and data protection. The Court of First Instance of Cluj-Napoca (Romania) ordered Russmedia to pay damages for the non-material harm caused.
Russmedia appealed against that judgment to the Specialised Court of Cluj, which upheld the appeal on the ground that Russmedia merely provided a hosting service for that ad, and therefore could benefit from the liability exemption provided for in Article 14(1)(b) of Law No 365/2002, transposing the ECD.
Remember the old Article 14 ECD? Paragraph 1 reads as follows:
Where an information society service is provided that consists of the storage of information provided by a recipient of the service, Member States shall ensure that the service provider is not liable for the information stored at the request of a recipient of the service, on condition that:
(a) the provider does not have actual knowledge of illegal activity or information and, as regards claims for damages, is not aware of facts or circumstances from which the illegal activity or information is apparent; or
(b) the provider, upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the information.
The applicant subsequently appealed, arguing that Russmedia was not a hosting provider within the meaning of Law No 365/2002, as it played “a management role, intervening in terms of content in order to ensure good information management.” Moreover, the applicant also argued that Law No 365/2002 did not prevent the application of the GDPR to Russmedia. The Court of Appeal of Cluj, whose decision is final, decided to refer a set of 4 questions to the CJEU. Why? Because given the nature of the content at stake, which it viewed as being “manifestly unlawful and deeply harmful to the data subject,” it considered that the CJEU had not yet had the opportunity to examine such a situation. [Those who studied intermediary liability prior to the adoption of the Digital Services Act (DSA) will remember that there was already a relatively rich case law on the scope and effects of Articles 12–15 ECD].
The 4 questions referred by the Cluj Court of Appeal touch upon various important EU law concepts, such as the definitions of hosting provider, controller and joint controller, and the content and effect of controllers’ obligations under several provisions of the GDPR, including Articles 5, 6, 7, 24 and 25 GDPR, as well as the effects of Articles 12–15 ECD.
The CJEU states that answering these 4 questions essentially requires tackling 2 questions:
- whether “the operator of an online marketplace, (…) which allows its users to place advertisements anonymously on its online marketplace free of charge or for a fee, has failed to fulfil its obligations under the GDPR” considering that the advertisement placed contained special categories of personal data (special category data);
- whether Articles 12 to 15 ECD are applicable to such an operator.
Departing from its Advocate General (AG)’s opinion on several key points, the CJEU held that:
- The operator of an online marketplace is a controller of the personal data contained in an ad published on that online marketplace (para. 65);
- the operator of an online marketplace is required, before publishing ads, to identify those that contain special category data and to verify either that the user advertiser is the data subject or that he/she can demonstrate an appropriate legal basis to publish such data under Article 9 GDPR (para. 106);
- the operator of an online marketplace is required to implement measures to prevent ads containing sensitive data from being copied and unlawfully published on other websites (para. 126);
- the operator of an online marketplace, as controller of the personal data contained in ads published on its platform, cannot avail itself of Articles 12–15 ECD with respect to infringements of Article 5(2), Articles 24 to 26 and Article 32 GDPR (para. 136).
Is the CJEU putting the e‑Commerce Directive [and, by the same token, the DSA] on the chopping block? That’s the question circulating on social media. Let’s break down the key steps in the CJEU’s reasoning to see if we can make sense of what the Court is trying to do [assuming… the Court knows what it is doing].
Many broad EU concepts
The CJEU easily qualifies photographs and telephone numbers as personal data within the meaning of Article 4(1) GDPR, restating [one more time] that the EU concept of personal data is a broad one. [Remember what’s at stake with the Digital Omnibus, as explained here.]
The CJEU also confirms that the EU concept of special categories of personal data is a broad one, covering both “inherently sensitive data” and “data revealing information of that nature indirectly, following an intellectual operation involving deduction or cross-referencing.” [Remember, the leaked Digital Omnibus touching upon the GDPR had tried to eliminate the second prong.]
To continue with the list of broad EU data protection concepts, there is also that of processing, as the CJEU notes that the “operation of loading personal data on a webpage constitutes processing, within the meaning of Article 4(2) of the GDPR.”
Unsurprisingly, the EU concept of “controller” (and thereby “joint controller”) is also a broad one. The CJEU, referring to Nacionalinis, writes at para. 58 that:
“any natural or legal person who exerts influence over the processing of such data, for his or her own purposes, and who participates, as a result, in the determination of the purposes and means of that processing, may be regarded as a controller in respect of such processing”.
For the concept of joint controllership, paragraph 61, still referring to Nacionalinis, adds that:
“Such joint responsibility does not necessarily require the existence of common decisions as to the determination of the purposes and means of the processing of the personal data concerned. Indeed, the Court has held that participation in the determination of the purposes and means of processing can take different forms, since such participation can result from a common decision taken together by two or more entities or from converging decisions which complement each other in such a manner that they each have a tangible impact on the determination of the purposes and means of the processing.”
One radical interpretation
In the case at hand, the user advertiser is considered to have “principally determined the purposes and means of the processing of those data.” (para. 65). The user advertiser is thus a controller for the personal data uploaded on the platform.
The CJEU also considers the platform operator to be a controller, and therefore a joint controller together with the user advertiser.
There are, at least, three important considerations in this case.
First, the content is of a manifestly unlawful nature (see para. 32 of the Advocate General’s Opinion, which presents this as a unique aspect of the case). Because the CJEU refers to special category data, it is very tempting to generalise the CJEU’s ruling further.
Second, the platform user is an advertiser. Again, because the CJEU refers to data controllers to derive requirements, it is very tempting to generalise the CJEU’s ruling further.
Third, the platform operator processes the personal data at stake for “commercial or advertising purposes which go beyond the mere provision of a service which he or she provides to the user advertiser.” (para. 66). Therefore, and again this is important, the platform operator “publishes the personal data contained in the advertisements not on behalf of the user advertisers, or not solely on their behalf, but processes and can exploit those data for its own advertising and commercial purposes.” (para. 67).
Therefore, the CJEU finds that the platform operator exerts influence upon both the processing purposes and processing means. (para. 68 and 70).
More specifically:
- The operator of the online platform holds significant architectural power, which in turn gives rise to structural control over information and amplification power, such that it exerts “a decisive influence on the overall dissemination of the data at stake” (para. 72);
- The online operator also exerts significant legal power over data accessibility, as it “reserves in particular the right to distribute, transmit, publish, remove or reproduce the information contained in the advertisements, including the personal data contained therein.” (para. 73).
These considerations seem to give rise to the view that the platform operator cannot, so to speak, have its cake and eat it too. But is the CJEU’s reasoning a convincing way to sort things out?
Back in the day, I argued that to solve the interplay between the ECD and the GDPR it is crucial to distinguish between third-party conduct and a platform’s own conduct, and to ensure that platforms do not benefit from exemptions for processing activities that are, in fact, their own, e.g., processing of third-party content for their own advertising purposes. [See here and here].
There are, however, more and less effective ways to distinguish between these types of conduct [and therefore more and less convincing ways to sort things out….]
In Russmedia, the CJEU rejects two approaches:
- Segmenting processing activities by purpose and allocating roles for each segmented purpose. This would have made it possible to distinguish between the platform operator acting as a controller for the re-use of the data at issue, and the platform operator acting as a processor for the primary use. Fashion ID might have helped. (“By contrast, in the light of that information, it seems, at the outset, impossible that Fashion ID determines the purposes and means of subsequent operations involving the processing of personal data carried out by Facebook Ireland after their transmission to the latter, meaning that Fashion ID cannot be considered to be a controller in respect of those operations within the meaning of Article 2(d).”). This is what the Advocate General had tried to do in his Opinion (see para. 118 of the Opinion).
- Considering that making the content available through the platform by structuring it constitutes the platform’s own processing activity and transforming the platform provider’s data-protection obligations into ex post due-diligence obligations, namely, takedown obligations. Google Spain could have been a source of inspiration. (“[T]he operator of a search engine is obliged to remove from the list of results displayed following a search made on the basis of a person’s name links to web pages, published by third parties and containing information relating to that person, also in a case where that name or information is not erased beforehand or simultaneously from those web pages, and even, as the case may be, when its publication in itself on those pages is lawful.”).
What the CJEU effectively does in Russmedia is designate the platform operator as a joint controller for the primary use and impose ex ante due-diligence obligations on it. This implies that the platform operator must systematically screen content for special category data and, where matches are found, identify the principal controller (which in turn means imposing identity-verification obligations) in order to verify whether the principal controller can establish a valid legal basis to process the data.
Let’s pause…
How [on earth] did the CJEU manage to impose, on the ground of data protection law, both what seems to look like content-filtering and identity-verification obligations? And could these obligations be extended to all types of user-generated content and all users?
The problem with the CJEU’s reasoning is that although it seems to be primarily concerned with “advertisers” [again, go and check President Lenaerts’s video], the way it constructs its reasoning makes it extremely far-reaching [Perhaps, when you see red… you just bull your way into wobbles!].
Slightly too radical maybe?
One way to read the CJEU’s judgment is to say that if a platform user qualifies as a controller, the platform must systematically screen content to detect special category data and verify both the user’s identity and the soundness of the claims made in relation to lawful basis.
Although the CJEU formulates the obligation in relation to special category data, the obligation to verify the user’s identity would seem to apply even where the data is not sensitive, as the CJEU grounds its reasoning in Articles 13, 14 and 26 GDPR. And because the CJEU refers to these Articles, it is not clear why such an obligation would apply only when the user is an advertiser.
Such an extensive reading is problematic for several reasons. First, it would undermine any effort to adopt a privacy-preserving approach to identity verification. Second, it raises serious concerns regarding the protection of fundamental rights, beginning with freedom of expression, which explains why we had Article 15 ECD and now have Article 8 DSA, prohibiting general monitoring obligations. [Even if the scope of the CJEU judgment is limited to advertisers, it would still impact the effects of Article 15 ECD and Article 8 DSA.]
Third, the characterisation of special category data is context-dependent and potentially much broader than the making available of an ad offering sexual services. How would platforms operationalise such a requirement?
Even grounding an obligation to identify platform users in Articles 24 and 25 GDPR, which is what the AG tried to do (see paras. 133-134) by distinguishing between uploaded content data and account data, is, I would argue, problematic. Data protection law is not an online safety law, although it must be reconciled with online safety laws [For an example of such a reconciliation, have a look at what the European Commission tries to achieve with its guidelines on Article 28(1) DSA, and some comments here]. Reconciliation could actually imply simply mooting identity verification in a variety of contexts. Yet, on the implications of Articles 24 and 25, the CJEU seems to follow its AG:
“As the Advocate General observed in point 134 of his Opinion, such measures must in particular make it possible to limit the risk of unlawful processing of the personal data of data subjects and to combat unfair use of such an online marketplace, by limiting the feeling of impunity and thus encouraging user advertisers to comply with the requirements of the GDPR when they publish advertisements containing personal data.” (para. 104).
How could this be reconciled with the principle of data minimisation and the acknowledgement in Article 11(1) GDPR that, if the processing purpose does not require the identification of the data subject, “the controller shall not be obliged to maintain, acquire or process additional information in order to identify the data subject for the sole purpose of complying with this Regulation”?
Is there a way to make sense of this judgment beyond maybe… seeing it as a crude reaction, both to a serious type of harm that must be addressed effectively and to exploitative data practices by online platforms?
Could the answer be that the CJEU reserves for itself the ability to decide whether to segment or merge data flows, and that platforms which fail to clearly distinguish roles, by claiming all rights over uploaded content, will systematically be treated as controllers of the data uploaded and made available by users?
In other words, the CJEU would be trying to stop basic exploitative practices and not really to impose identity-verification obligations or even general monitoring obligations across the platform ecosystem. [humm…].
When would online platforms then act as processors? Would that change the analysis significantly? Processors are subject to Article 32 GDPR as well. What would they be asked to do on this basis? Prompt takedown?
Importantly, what the CJEU does not explain here is that preventing further processing of personal data makes sense only when it is not possible to ground the further processing of personal data upon valid legal bases within the meaning of both Articles 6 and 9 GDPR where relevant. Informed consent is only one legal basis. What is more, not all personal data is confidential.
Granted, platforms usually perform a variety of roles vis-à-vis personal data provided by users. They may act as processors for the primary use but then become controllers for secondary uses, e.g., AI training or behavioural advertising… But following the CJEU in Fashion ID, platform users should not be controllers in these cases…
Could the answer be found in Article 82(3) GDPR?
Article 82(3) GDPR provides that “[a] controller or processor shall be exempt from liability under paragraph 2 if it proves that it is not in any way responsible for the event giving rise to the damage.”
What this means, as the AG explains, is that liability under the GDPR remains fault-based: “Indeed, the controller’s liability under Article 82 of that regulation is subject to fault on the part of the controller, which is presupposed unless it proves that it is not in any way responsible for the event giving rise to the damage, within the meaning of paragraph 3.” (See also para. 94 ZQ)
Could an online platform argue that, when a platform user engages in illegal activity, it cannot be responsible for the event giving rise to the damage, i.e., the illegal uploading and making available of the content?
In Nacionalinis, the CJEU, interpreting Article 83 GDPR but referring to Article 28(10) GDPR, states at para. 85:
“[T]he responsibility and liability of the controller for the conduct of a processor cannot extend to situations where the processor has processed personal data for its own purposes or where that processor has processed such data in a manner incompatible with the framework of, or detailed arrangements for, the processing as determined by the controller, or in such a manner that it cannot reasonably be considered that that controller consented to such processing. In accordance with Article 28(10) of the GDPR, the processor must, in such a situation, be considered to be a controller in respect of such processing.”
How far could one then go with a reasoning by analogy?
Humm…. Humm…