(Thumb)nail in the coffin for the prohibition on monitoring?

Article 15(1) of the E-Commerce Directive (2000/31/EC) has long been a useful weapon in the armoury of social media platforms and search engines by prohibiting a “general monitoring obligation”. This, they argue, means that they can be required to remove only specific unlawful content identified by the complainant or a court, and no more. The problem with this is that unlawful content can very easily be spread far and wide, leaving the complainant to play whac-a-mole, identifying every repetition and variation of that content.

In today’s judgment in Eva Glawischnig-Piesczek v Facebook Ireland Limited (Case C-18/18), the CJEU has given important guidance on the removal of content which is identically worded to the original unlawful content or which is “equivalent” to it.

In this case the Austrian courts had found that statements about a Green Party politician included in a Facebook post (alongside a thumbnail of an article in an online news magazine) were intended to damage her reputation, to insult her and to defame her. The question on which the Austrian Supreme Court sought the CJEU’s assistance was whether the prohibition of a general monitoring obligation precluded ordering Facebook to remove not just the offending post, but also other identically worded information (and to what geographical extent) and “information with an equivalent meaning”.

Identically worded information

The CJEU observed that whilst Article 15(1) prohibits Member States from imposing on host providers a “general” obligation to monitor information which they transmit or store, or a “general” obligation actively to seek facts or circumstances indicating illegal activity, it does not concern monitoring obligations “in a specific case”. This was such a case: the court had found the particular piece of information to be illegal, and there was a genuine risk that information which was held to be illegal would subsequently be reproduced and shared by another user of that network (at §37):

“In those circumstances, in order to ensure that the host provider at issue prevents any further impairment of the interests involved, it is legitimate for the court having jurisdiction to be able to require that host provider to block access to the information stored, the content of which is identical to the content previously declared to be illegal, or to remove that information, irrespective of who requested the storage of that information. In particular, in view of the identical content of the information concerned, the injunction granted for that purpose cannot be regarded as imposing on the host provider an obligation to monitor generally the information which it stores, or a general obligation actively to seek facts or circumstances indicating illegal activity, as provided for in Article 15(1) of Directive 2000/31.”

Facebook could be required to remove all identically worded information irrespective of the identity of the person posting the material and without it being brought to Facebook’s attention by the complainant or court.

Information with an equivalent meaning

The CJEU considered that the referring court’s reference to “information with an equivalent meaning” was to “a message the content of which remains essentially unchanged and therefore diverges very little from the content which gave rise to the finding of illegality”.

The CJEU found that this information could also be ordered to be removed (at §41):

“in order for an injunction which is intended to bring an end to an illegal act and to prevent it being repeated, in addition to any further impairment of the interests involved, to be capable of achieving those objectives effectively, that injunction must be able to extend to information, the content of which, whilst essentially conveying the same message, is worded slightly differently, because of the words used or their combination, compared with the information whose content was declared to be illegal. Otherwise, as the referring court made clear, the effects of such an injunction could easily be circumvented by the storing of messages which are scarcely different from those which were previously declared to be illegal, which could result in the person concerned having to initiate multiple proceedings in order to bring an end to the conduct of which he is a victim.”

So it seems that the game of whac-a-mole may be coming to an end, but only where the information is identical or where it is “essentially conveying the same message” albeit “worded slightly differently”.

A word of caution, though: the CJEU also warned that this must not impose “an excessive obligation on the host provider”. In particular, “Differences in the wording of that equivalent content, compared with the content which was declared to be illegal, must not, in any event, be such as to require the host provider concerned to carry out an independent assessment of that content”. In the present case it was considered that the use of automated search tools and technologies would not impose too great a burden on Facebook.
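For the technically curious, the kind of automated matching the Court seems to have in mind might, at its very simplest, combine an exact comparison for identically worded posts with a similarity threshold for slightly reworded ones. The sketch below is a purely hypothetical illustration in Python using the standard library’s difflib; the placeholder text, normalisation rules and threshold are my own assumptions, and nothing in the judgment (or, so far as I know, in Facebook’s actual systems) prescribes any particular technique.

```python
# Hypothetical sketch only: flag posts that are identical to, or diverge very
# little from, wording already declared unlawful. The placeholder text,
# normalisation and 0.9 threshold are illustrative assumptions.
from difflib import SequenceMatcher

UNLAWFUL_TEXT = "example of the wording previously declared unlawful"  # placeholder


def normalise(text: str) -> str:
    """Lower-case and collapse whitespace so trivial edits do not defeat an exact match."""
    return " ".join(text.lower().split())


def matches_unlawful(post: str, threshold: float = 0.9) -> bool:
    """Return True for identical wording, or for wording that is only slightly different."""
    original, candidate = normalise(UNLAWFUL_TEXT), normalise(post)
    if original == candidate:                      # "identical" content
        return True
    similarity = SequenceMatcher(None, original, candidate).ratio()
    return similarity >= threshold                 # "equivalent" content, slightly reworded


if __name__ == "__main__":
    print(matches_unlawful("Example of the wording   previously declared unlawful!"))  # True
```

Of course, whether a reworded post really “essentially conveys the same message” is ultimately a legal rather than a purely textual question, which is precisely why the CJEU stresses that the host provider must not be left to carry out an independent assessment of its own.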

Finally, worldwide effect

In a mere five paragraphs at the end of the judgment, the CJEU also holds that the E-Commerce Directive does not preclude those injunction measures from having “worldwide effect” and that an injunction can be used to “block access to that information worldwide within the framework of the relevant international law”.

The E-Commerce Directive itself does not make provision for territorial limitation, so the CJEU’s finding that it does “not preclude” such an injunction is not particularly enlightening and should not be overblown.

Ultimately, this is a matter of private and public international law, which is not harmonised at EU level. The CJEU sensibly left it up to Member States to ensure that the measures which they adopt and which produce effects worldwide take “due account of the rules applicable at an international level”.

Just as the CJEU passed the buck to Member States and the domestic courts as to how that may work in practice, I too will pass the buck, and direct you to the Advocate General’s more detailed analysis at §§88-103, as well as yesterday’s interesting post by Robin Hopkins on Google LLC v CNIL (Case C‑507/17), which looks at the exciting topic of geo-blocking.


Julian Blake