Unsurprisingly, the frontiers of privacy and data protection law are often explored and extended by reference to what Google does. Panopticon has, for example, covered disputes over Google Street View (on which a US lawsuit was settled in recent months), Google’s status as a ‘publisher’ of blogs containing allegedly defamatory material (see Tamiz v Google [2013] EWCA Civ 68) and its responsibility for search results directing users to allegedly inaccurate or out-of-date personal data (see Google Spain v Agencia Española de Protección de Datos (Case C-131/12), in which judgment is due in the coming months).
A recent decision of Germany’s Federal Court of Justice appears to have extended the frontiers further. The case (BGH, VI ZR 269/12 of 14th May 2013) concerned Google’s ‘autocomplete’ function. When the complainants’ names were typed into Google’s search bar, the autocomplete function appended the words “Scientology” and “fraud”. This was not because there was a large volume of content linking the complainants with those terms. Rather, these were the terms other Google users had most frequently searched for in conjunction with the complainants’ names, reflecting rumours whose truth or accuracy the complainants denied. They complained that the continuing association of their names with these terms infringed their rights to personality and reputation as protected by German law (Articles 823(1) and 1004 of the German Civil Code).
In the Google Spain case, Google has argued that responsibility lies with the generators of the content, not with the search engine which directs users to that content. In the recent German case, Google argued in a similar vein that the autocomplete suggestions are down to what other users have searched for, not what Google itself says or does.
In allowing the complainants’ appeals, the Federal Court of Justice in Karlsruhe has disagreed with Google. The result is that once Google has been alerted to the fact that an autocomplete suggestion links someone to libellous words, it must remove that suggestion. The case is well covered by Jeremy Phillips at IPKat and by Karin Matussek of Bloomberg in Berlin.
The case is important in terms of the frontiers of legal protection for personal integrity and how we allocate responsibility for harm. Google says that, in these contexts, it is a facilitator, not a generator: it should not be liable for what people write (see Tamiz and Google Spain), nor for what they search for (the recent German case). Not for the first time, courts in Europe have allocated responsibility differently.
Notably, this case was not brought under data protection law, though in principle such complaints could be expressed in data protection terms. Perhaps, if the EU’s final Data Protection Regulation retains the severe penalty provisions proposed in the draft version, data protection will move centre-stage in these sorts of cases.
Robin Hopkins