CJEU Case C-18/18: the stay-down duties of Facebook according to the Opinion of Advocate General M. Szpunar

Lawyer Alessandro La Rosa

On 4 June 2019 Advocate General (AG) Szpunar delivered his Opinion on the request for a preliminary ruling in Case C-18/18, in which the Austrian Supreme Court referred the following questions to the CJEU:

  1. Does Article 15(1) of Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’) generally preclude any of the obligations listed below of a host provider which has not expeditiously removed illegal information, specifically not just this illegal information within the meaning of Article 14(1)(a) of the Directive, but also other identically worded items of information:

     a.a. worldwide?
     a.b. in the relevant Member State?
     a.c. of the relevant user worldwide?
     a.d. of the relevant user in the relevant Member State?

  2. In so far as Question 1 is answered in the negative: Does this also apply in each case for information with an equivalent meaning?
  3. Does this also apply for information with an equivalent meaning as soon as the operator has become aware of this circumstance?

The facts

The complaint was brought by Ms Eva Glawischnig-Piesczek, a member of the Nationalrat (National Council, Austria), chair of the parliamentary party die Grünen (‘the Greens’) and federal spokesperson of that party. On 3 April 2016 a user shared on Facebook an article from the Austrian online news magazine oe24.at entitled ‘Greens: Minimum income for refugees should stay’ and, in connection with that article, published disparaging comments about the applicant, accusing her of being a ‘lousy traitor of the people’, a ‘corrupt oaf’ and a member of a ‘fascist party’. The content placed online by that user could be viewed by any other Facebook user.

AG Opinion

The AG recalls that the “service provider’s conduct must be limited to that of an ‘intermediary service provider’ within the meaning intended by the legislator in the context of Section 4 of that directive” (para. 29) and that, according to Recital 42 of Directive 2000/31/EC (the E-Commerce Directive, “ECD”), the limitation of liability applies only where the service provider’s activity is merely technical, automatic and passive, which implies that it has neither knowledge of nor control over the data it stores and that its role must therefore be neutral (para. 29).

Turning to Article 15(1) ECD, the AG clarifies that this provision prevents an intermediary service provider from being compelled to monitor all of its users’ data in order to prevent future infringements (which would run counter to what the CJEU established in SABAM, Case C-360/10). That said, the provision does not cover monitoring obligations applicable in specific cases, such as those permitted by Article 14(3) ECD, on the basis of which a provider may be ordered to prevent an infringement, which “logically implies a certain form of monitoring in the future” (para. 41), and by Article 18 ECD, which requires Member States to ensure that court actions available under national law concerning information society services’ activities allow for the rapid adoption of measures to prevent any further impairment of the interests involved.

The AG continues by adding that “a host provider may be ordered, in the context of an injunction, to remove illegal information which has not yet been disseminated at the time when that injunction is adopted, without the dissemination of that information being brought, again and separately from the original removal request, to its knowledge” (para. 44).

However, in line with the prohibition of general monitoring obligations laid down in the ECD, such an order must relate to infringements of the same kind, by the same recipient and of the same rights (as already established in Case C-324/09, L’Oréal v eBay). It follows that active monitoring obligations may be imposed on a social network operator, provided that they relate to a specific infringement, that the monitoring period is specified and that clear indications are given as to the nature of the infringements, their author and their subject matter.

Therefore, an injunction may require the provider to search for and identify, among the information shared by the users of the platform, information identical to that which the court has found illegal, since reproductions of the same content by other users can usually be detected with automated technological tools, without the provider having to filter, actively and non-automatically, all of the information present on the platform. The injunction may also require the provider to search for and identify information equivalent to the illegal content, but only among the information shared by the same user who shared the illegal content.
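To make the distinction concrete, here is a minimal, purely illustrative sketch (in Python) of how identical re-uploads can be detected automatically, by comparing normalised fingerprints of new posts against a blocklist of content already found illegal; nothing here is taken from the Opinion or from Facebook’s actual tooling, and all names and strings are hypothetical:

```python
import hashlib

def fingerprint(text: str) -> str:
    """Stable fingerprint of a post: lower-case, collapse whitespace, then hash."""
    normalised = " ".join(text.lower().split())
    return hashlib.sha256(normalised.encode("utf-8")).hexdigest()

# Hypothetical blocklist: fingerprints of the comment(s) a court found illegal.
blocked_fingerprints = {fingerprint("example of the comment found illegal")}

def is_identical_reupload(new_post: str) -> bool:
    """Flag a post only if, after normalisation, it exactly matches blocked content."""
    return fingerprint(new_post) in blocked_fingerprints

# Identical wording is caught despite differences in case and spacing...
print(is_identical_reupload("Example of THE comment   found illegal"))  # True
# ...but merely 'equivalent' wording expressing the same meaning is not.
print(is_identical_reupload("a reworded, merely equivalent comment"))   # False
```

Note that such fingerprint matching catches only identical wording: “equivalent” content conveying the same meaning in different words slips through, which is precisely why the AG confines the search for equivalent information to the posts of the user who shared the original illegal content.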

This is because, in the present case, the illegal material consists of defamatory remarks rather than intellectual property infringements. As the AG states, “it is important not to lose sight of the factual context in which the relevant case-law was developed, namely the context of infringements of intellectual property law. Generally, such infringements consist in the dissemination of the same content as that protected or, at least, a content resembling the protected content, any changes in that content, which are sometimes difficult to make, requiring specific intervention. On the other hand, it is unusual for a defamatory act to use the precise terms of an act of the same type. That is, in part, the result of the personalised nature of the way in which ideas are expressed. […] For that reason, in connection with defamation, a mere reference to acts of the same nature could not play the same role as in connection with infringements of intellectual property law.” (paras 69-70).

Therefore, a court injunction ordering the removal of such equivalent information must be clear and predictable and must strike a balance between the different fundamental rights involved, in accordance with the principle of proportionality.

It follows that Article 15(1) ECD does not preclude an order requiring a host provider to remove information equivalent to the information identified as illegal, where the removal obligation does not entail general monitoring but derives from awareness resulting from a notification made by the person harmed by the illegal content, by third parties or by another source. In such a case, no violation of Article 15(1) ECD arises.

The AG’s Opinion is in line with the European Commission’s recently published Communications on these matters. In Communication 555/2017 of 28 September 2017 the Commission clarified that “Online platforms may become aware of the existence of illegal content in a number of different ways, through different channels. Such channels for notifications include (i) court orders or administrative decisions; (ii) notices from competent authorities (e.g. law enforcement bodies), specialised ‘trusted flaggers’, intellectual property rights holders or ordinary users, or (iii) through the platforms’ own investigations or knowledge”, and that such operators, “In addition to legal obligations derived from EU and national law”, are subject to a “duty of care” and “should ensure a safe online environment for users, hostile to criminal and other illegal exploitation, and which deters as well as prevents criminal and other infringing activities online”. The Communication also clarifies that, “in light of their central role and capabilities and their associated responsibilities”, platforms should “adopt effective proactive measures to detect and remove illegal content online and not only limit themselves to reacting to notices which they receive”.

In relation to the admissibility of dynamic injunctions, the European Commission clarified in Communication 708/2017 of 29 November 2017 that “some Member States provide for the possibility of issuing forward-looking, catalogue-wide and dynamic injunctions”. Such injunctions require intermediaries to prevent further infringements of rights held by a rightholder, or of rights forming part of a catalogue or repertoire of a licence holder, following the established infringement of a sample of those rights.

At national level, the Italian Supreme Court, in Case No 7708/19 (RTI v Yahoo!) of 19 March 2019, recently overturned the judgment of the Milan Court of Appeal (No 29/2015 of 7 January 2015), which had held that there could be no obligation to prevent the publication of the same illegal content that had already infringed others’ rights.

In conclusion, specific removal obligations for illicit information that is identical or equivalent are confirmed, in relation to copyright law, by the explicit stay-down obligations laid down in Article 17(4)(c) of the Directive on Copyright in the Digital Single Market, published in the Official Journal on 17 May 2019, which reads: “If no authorisation is granted, online content-sharing service providers shall be liable for unauthorised acts of communication to the public, including making available to the public, of copyright-protected works and other subject matter, unless the service providers demonstrate that they have […] acted expeditiously, upon receiving a sufficiently substantiated notice from the rightholders, to disable access to, or to remove from their websites, the notified works or other subject matter, and made best efforts to prevent their future uploads”.