Anonymity: publication and open justice

The tension between transparency and individual privacy is part of what makes information rights such a fascinating and important area. When it comes to issues of high public interest involving particular individuals, prevailing wisdom has tended to be something like this: say as much as possible on an open basis, but redact and anonymise so as to protect the identity of the individuals involved. Increasingly, however, transparency is outmuscling privacy. See for example my post about the Tribunal’s order of disclosure, in the FOIA context, of the details of the compensation package of a Chief Executive of an NHS Trust (the case of Dicker v IC (EA/2012/0250)).

The Care Quality Commission debate is the highest-profile recent illustration: the health regulator published a consultant’s report into failings regarding the deaths of babies at Furness General Hospital, but withheld the names of the individuals criticised (including for alleged ‘cover-ups’), relying on the Data Protection Act 1998. The anonymisation was not endorsed by the Information Commissioner, and attracted widespread criticism in media and political circles. Transparency pressures held sway.

In a similar vein, the BBC has come under great pressure over the past week – particularly from Parliament’s Public Accounts Committee – to reveal the names of approximately 150 departing senior managers who received pay-offs averaging £164,000 in the past three years. As the Telegraph reports, the Committee is threatening to use parliamentary privilege to publish those names. The BBC admits that it “got things wrong” by overpaying in many cases (as confirmed by the National Audit Office), but is concerned to protect the affected individuals’ privacy and DPA rights, as well as to safeguard its own independence. The Committee says the public interest in transparency is compelling; Lord Patten, chair of the BBC Trust, says there will be “one hell of an argument” about this.

Such arguments become all the more thorny in the context of open justice disputes, of which there have been a number in recent weeks.

In the matter of Global Torch Ltd/Apex Global Management Ltd (The Guardian, The Financial Times and others intervening) [2013] EWCA Civ 819 concerned competing unfair prejudice petitions alleging misconduct in the affairs of a particular company. Two Saudi Arabian princes and one of their private advisers applied to have the interlocutory hearings held in private under CPR rule 39.2(3). The Court of Appeal agreed with the judge who dismissed those applications. It rejected the contention that the judge had elevated open justice above Article 8 ECHR rights as a matter of law. Rather, the judge had noted that some general presumptions were valid (for example, open justice is likely to trump reputational damage) and applied those in the factual context of this case. Maurice Kay LJ said (paragraph 34) that there was sometimes a “need for a degree of protection so as to avoid the full application of the open justice principle exposing a victim to the very detriment which his cause of action is designed to prevent… If such an approach were to be extended to a case such as the present one, it could equally be applied to countless commercial and other cases in which allegations of serious misconduct are made. That would result in a significant erosion of the open justice principle. It cannot be justified where adequate protection exists in the form of vindication of the innocent through the judicial process to trial”.

Open justice is of course not only fundamental to freedom of expression, but also the default setting for fair trials. This is illustrated in the regulatory/disciplinary context by Miller v General Medical Council [2013] EWHC 1934 (Admin). The case involved a challenge to a decision by a Fitness to Practise Panel of the Council’s Medical Practitioners Tribunal Service that a fitness to practise hearing should take place in private because it considered that the complainant, a former patient of the claimant, was otherwise unlikely to give evidence. HHJ Pelling quashed the decision; there was insufficient evidence for the Panel’s conclusion about witness participation, and in any event the Panel “fell into error at the outset by not reminding itself sufficiently strongly or at all that the clear default position under Article 6 is that the hearing should be in public. It failed to remind itself that Article 6 creates or declares rights that are the rights of the Claimant and that it was for the GMC to prove both the need for any derogation from those rights and for a need to derogate to the extent claimed” (paragraph 20).

Robin Hopkins

Prism and Tempora: Privacy International commences legal action

Panopticon has reported in recent weeks that, following the Edward Snowden/Prism disclosures, Liberty has brought legal proceedings against the UK’s security bodies. This week, Privacy International has announced that it too is bringing a claim in the Investigatory Powers Tribunal – concerning both the Prism and Tempora programmes. It summarises its claim in these terms:

“Firstly, for the failure to have a publicly accessible legal framework in which communications data of those located in the UK is accessed after obtained and passed on by the US National Security Agency through the Prism programme.  Secondly, for the indiscriminate interception and storing of huge amounts of data via tapping undersea fibre optic cables through the Tempora programme.”

Legal complaints on Prism-related transfers have been made elsewhere on data protection grounds also. A group of students who are members of a group called Europe vs. Facebook have filed complaints to the data protection authorities in Ireland (against Facebook and Apple), Luxembourg (against Skype and Microsoft) and Germany (against Yahoo).

European authorities have expressed concerns on these issues in their own right. For example, the Vice President of the European Commission, Viviane Reding, has written to the British Foreign Secretary, William Hague, about the Tempora programme, and has directed similar concerns at the US (including in a piece in the New York Times). The European Parliament has also announced that a panel of its Committee on Civil Liberties, Justice and Home Affairs will be convened to investigate the Prism-related surveillance of EU citizens. It says the panel will report by the end of 2013.

In terms of push-back within the US, it has been reported that Texas has introduced a bill strengthening the requirements for warrants to be obtained before any emails (as opposed to merely unread ones) can be disclosed to state and local law enforcement agencies.

Further complaints, litigation and potential legal challenges will doubtless arise concerning Prism, Tempora and the like.

Robin Hopkins

Google and data protection: no such thing as the ‘right to be forgotten’

Chris Knight has blogged recently about enforcement action against Google by European Data Protection authorities (but not yet the UK’s ICO). I blogged last month about a German case (BGH, VI ZR 269/12 of 14th May 2013) concerning Google’s ‘autocomplete’ function, and earlier this year about the Google Spain case (Case C‑131/12). The latter arises out of complaints made to the Spanish data protection authority by a number of Spanish citizens whose names, when Googled, generated results linking them to allegedly false, inaccurate or out-of-date information (contrary to the data protection principles) – for example, an old story reporting that a surgeon had been charged with criminal negligence, without mentioning that he had been acquitted. The Spanish authority ordered Google to remove the offending entries. Google challenged this order, arguing that it was for the authors or publishers of those websites to remedy such matters. The case was referred to the CJEU by the Spanish courts.

Advocate General Jääskinen this week issued his opinion in this case.

The first point concerns territorial jurisdiction. Google claims that no processing of personal data relating to its search engine takes place in Spain. Google Spain acts merely as commercial representative of Google for its advertising functions. In this capacity it has taken responsibility for the processing of personal data relating to its Spanish advertising customers. The Advocate General has disagreed with Google on this point. His view is that national data protection legislation is applicable to a search engine provider when it sets up in a member state, for the promotion and sale of advertising space on the search engine, an office which orientates its activity towards the inhabitants of that state.

The second point is substantive, and is good news for Google. The Advocate General says that Google is not generally to be considered – either in law or in fact – as a ‘data controller’ of the personal data appearing on web pages it processes. It has no control over the content included on third party web pages and cannot even distinguish between personal data and other data on those pages.

Thirdly, the Advocate General tells us that there is no such thing as the so-called “right to be forgotten” (a favourite theme of debates on the work-in-progress new Data Protection Regulation) under the current Directive. The Directive offers safeguards as to accuracy and so on, but Google had not itself said anything inaccurate here. At paragraph 108 of his opinion, the Advocate General says this:

“… I consider that the Directive does not provide for a general right to be forgotten in the sense that a data subject is entitled to restrict or terminate dissemination of personal data that he considers to be harmful or contrary to his interests. The purpose of processing and the interests served by it, when compared to those of the data subject, are the criteria to be applied when data is processed without the subject’s consent, and not the subjective preferences of the latter. A subjective preference alone does not amount to a compelling legitimate ground within the meaning of Article 14(a) of the Directive.”

It remains to be seen of course whether the Court agrees with the Advocate General. The territorial issue and the ‘data controller’ question are of great significance to Google’s business model – and to those whose businesses face similar issues. The point about objectivity rather than subjectivity being the essential yardstick for compliance with data protection standards is potentially of even wider application.

“This is a good opinion for free expression,” Bill Echikson, a spokesman for Google, said in an e-mailed statement reported by Bloomberg.

Robin Hopkins

RIPA: hacked voicemails and undercover officers

The Regulation of Investigatory Powers Act 2000 (RIPA) has featured prominently in the news in recent weeks, both as regards undercover police officers/“covert human intelligence sources” and as regards the phone-hacking scandal.

Hacked voicemails

This morning, the Court of Appeal gave judgment in Edmonson, Weatherup, Brooks, Coulson & Kuttner v R [2013] EWCA Crim 1026. As is well known, the appellants face charges arising out of the News of the World phone-hacking controversy – specifically, conspiring unlawfully to intercept communications in the course of their transmission without lawful authority contrary to section 1(1) of the Criminal Law Act 1977.

The communications in question are voicemails. Under section 1(1)(b) of RIPA, it is an offence intentionally to intercept, without lawful authority, any communication in the course of its transmission by means of a public telecommunications system (my emphasis). The central provision is section 2(7) of RIPA:

“(7) For the purposes of this section the times while a communication is being transmitted by means of a telecommunication system shall be taken to include any time when the system by means of which the communication is being, or has been, transmitted is used for storing it in a manner that enables the intended recipient to collect it or otherwise to have access to it.”

The appellants applied to have the charges dismissed on the grounds that the words “in the course of transmission” in section 1(1) of RIPA do not extend to voicemail messages once they have been listened to (by the intended recipient, that is, rather than by any alleged phone-hacker). They argued that the ordinary meaning of “transmission” is conveyance from one person or place to another, and that section 2(7) is intended to extend the concept of “transmission” only so as to cover periods of transient storage arising through modern phone and email usage, when the intended recipient is not immediately available. Thus, once the message has been listened to, it can no longer be “in the course of transmission”.

The point had previously been decided against the appellants. The Court of Appeal (the Lord Chief Justice, Lloyd Jones LJ, Openshaw J) took a similar view. While it accepted that the application of section 2(7) may differ as between, for example, voicemails and emails, “there is nothing in the language of the statute to indicate that section 2(7) should be read in such a limited way” (as the appellants had contended) (paragraph 23). Further, the words “has been transmitted” in section 2(7) “make entirely clear that the course of transmission may continue notwithstanding that the voicemail message has already been received and read by the intended recipient” (paragraph 26).

The same conclusion was reached by focusing on the mischief which section 2(7) is intended to remedy, “namely unauthorized access to communications, whether oral or text, whilst they remain on the system by which they were transmitted. As the prosecution submits, unlawful access and intrusion is not somehow less objectionable because the message has been read or listened to by the intended recipient before the unauthorized access takes place” (paragraph 28, quoting an earlier judgment in this matter from Fulford LJ).

The Court accepted that section 2(7) went further than the prohibitions imposed by Directive 97/66/EC concerning the processing of personal data and the protection of privacy in the telecommunications sector (which RIPA sought to implement) and its successor, Directive 2002/58/EC concerning the processing of personal data and the protection of privacy in the electronic communications sector (which postdates RIPA).  The Court found, however, that the Directives imposed minimum harmonisation; Parliament was entitled to go further and to set higher standards for the protection of privacy of electronic communications, provided that those additional obligations are compatible with EU law (paragraph 42).

Both the Data Protection Act 1998 and the Computer Misuse Act 1990 also raised their heads. The DPA, for example, contains a public interest defence which is not available under RIPA. It was argued that this risked creating parallel offences without parallel defences, violating the principle of legal certainty. This submission too was rejected (paragraphs 44-45).

The cases will now proceed to trial, apparently to commence in September.

Undercover officers

As regards the activities of undercover police officers, the major issue this week has concerned the alleged smearing of the family and friends of Stephen Lawrence: see for example The Guardian’s Q&A session with undercover-officer-turned-whistleblower Peter Francis.

The other major ongoing case regarding a former undercover officer concerns Mark Kennedy, who (together with others) infiltrated political and environmental activists over a period of years. Claims were commenced in the High Court, with part of the conduct complained of involving sexual relations between activists/their partners and undercover officers.

Earlier this year, J and others v Commissioner of Police for the Metropolis [2013] EWHC 32 (QB) saw part of the claims struck out. The Court held that the Investigatory Powers Tribunal had exclusive jurisdiction over the claims under the Human Rights Act 1998; it struck out these parts accordingly. It observed that conduct breaching Article 3 (inhuman and degrading treatment) – which included the claims relating to sexual activity – could not be authorised under RIPA, but conduct breaching Article 8 (privacy) could be authorised. Sexual activity with undercover officers did not necessarily engage Article 3.

Those parts of the claims which did not concern the Human Rights Act 1998 (actions at common law and for alleged breaches of statutory duties) were not exclusively within the Investigatory Powers Tribunal’s jurisdiction and were thus not struck out as an abuse of process, notwithstanding the difficulties the police faced in presenting their case due to the ‘neither confirm nor deny’ approach to covert sources.

Unlike with the phone-hacking cases, it is not clear when this case will resume before the Court/Tribunal.

Robin Hopkins

T v Manchester goes to the Supreme Court

One of the most important privacy judgments of the year thus far has been that of the Court of Appeal in R (T & others) v Chief Constable of Greater Manchester & others [2013] EWCA Civ 25, on which Chris Knight blogged in January. In a nutshell, the Court of Appeal held that the criminal records disclosure regime (including the exceptions to the Rehabilitation of Offenders Act 1974) violated Article 8 ECHR.

Permission has been granted for a further appeal to the Supreme Court, which will hear the case on 24 and 25 July of this year. Watch this space.

Robin Hopkins

Google: autocomplete and the frontiers of privacy

Unsurprisingly, the frontiers of privacy and data protection law are often explored and extended by reference to what Google does. Panopticon has, for example, covered disputes over Google Street View (on which a US lawsuit was settled in recent months), Google’s status as a ‘publisher’ of blogs containing allegedly defamatory material (see Tamiz v Google [2013] EWCA Civ 68) and its responsibility for search results directing users to allegedly inaccurate or out-of-date personal data (see Google Spain v Agencia Espanola de Proteccion de Datos (application C-131/12), in which judgment is due in the coming months).

A recent decision of a German appellate court appears to have extended the frontiers further. The case (BGH, VI ZR 269/12 of 14th May 2013) concerned Google’s ‘autocomplete’ function. When the complainants’ names were typed into Google’s search bar, the autocomplete function added the ensuing words “Scientology” and “fraud”. This was not because there was a great deal of content linking the complainants with those terms. Rather, it was because these were the terms other Google users had most frequently searched for in conjunction with those names. The association arose from rumours whose truth or accuracy the complainants denied. They complained that the continuing association of their names with these terms infringed their rights to personality and reputation as protected by German law (Articles 823(1) and 1004 of the German Civil Code).
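For readers unfamiliar with how such suggestions arise, the frequency-based mechanism at issue can be illustrated with a toy sketch in Python. All of the names and query data below are invented for illustration; Google’s actual systems are, of course, vastly more sophisticated, but the point the court grappled with – that suggestions reflect what other users searched for, not any editorial statement – can be seen even in this simple form:

```python
from collections import Counter, defaultdict

# Toy query log: each entry is a full search string typed by some
# hypothetical user. All data here is invented for illustration.
query_log = [
    "john doe scientology",
    "john doe scientology",
    "john doe fraud",
    "john doe biography",
    "jane roe recipes",
]

# For each leading name, count which follow-on terms co-occur most often.
co_search = defaultdict(Counter)
for query in query_log:
    words = query.split()
    name, rest = " ".join(words[:2]), " ".join(words[2:])
    if rest:
        co_search[name][rest] += 1

def autocomplete(name, n=2):
    """Suggest the n terms most frequently co-searched with a name."""
    return [term for term, _ in co_search[name].most_common(n)]

print(autocomplete("john doe"))  # most frequently co-searched terms first
```

The sketch makes the legal problem concrete: the suggestion engine simply mirrors aggregate user behaviour, so a defamatory association can surface without any underlying web content asserting it.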

In the Google Spain case, Google has said that the responsibility lies with the generators of the content, not with the search engine which offers users that content. In the recent German case, Google has argued in a similar vein that the autocomplete suggestions are down to what other users have searched for, not what Google says or does.

In allowing the complainants’ appeals, the Federal Court of Justice in Karlsruhe has disagreed with Google. The result is that once Google has been alerted to the fact that an autocomplete suggestion links someone to libellous words, it must remove that suggestion. The case is well covered by Jeremy Phillips at IPKat and by Karin Matussek of Bloomberg in Berlin.

The case is important in terms of the frontiers of legal protection for personal integrity and how we allocate responsibility for harm. Google says that, in these contexts, it is a facilitator, not a generator. It says it should not be liable for what people write (see Tamiz and Google Spain), nor for what they search for (the recent German case). Not for the first time, courts in Europe have allocated responsibility differently.

Notably, this case was not brought under data protection law. In principle, it seems that such complaints could be expressed in data protection terms. Perhaps, if the EU’s final Data Protection Regulation retains the severe penalty provisions proposed in the draft version, data protection will move centre-stage in these sorts of cases.

Robin Hopkins