‘Plebgate’ and the protection of journalistic sources

December 17th, 2015 by Robin Hopkins

It has been a mixed day for the media’s entanglements with the judiciary. Chris Knight posted earlier today about the unhappy outcome for Mirror Group Newspapers before the Court of Appeal in the Gulati privacy damages litigation arising from phone-hacking.

News Group Newspapers, however – together with Sun journalists Tom Newton Dunn, Anthony France and Craig Woodhouse – had a happier outcome in another case about telephone privacy, though this time with the media as victim rather than perpetrator of the interference.

Judgment IPT/14/176/H saw the claimants succeed in part in their claim against the Metropolitan Police in the Investigatory Powers Tribunal (‘IPT’).

 

FOI and Article 10: life after Kennedy (and Kenedi)

November 4th, 2015 by Robin Hopkins

The right to freedom of expression under Article 10(1) of the European Convention on Human Rights includes “freedom… to receive and impart information and ideas without interference by public authority”. Does that mean that there is a human right to freedom of information?

The question has haunted the courtrooms of the UK and other EU member states in recent years. In England and Wales, the last domestic word has been Kennedy v Charity Commission [2014] UKSC 20. The answer in Kennedy was ‘no’: Article 10 ECHR does not impose a positive, free-standing duty on public authorities to disclose information upon request.

That is not, however, the final word. Kennedy is to be heard by the European Court of Human Rights in Strasbourg – but the case has been stayed. This is because the Grand Chamber accepted another case raising essentially the same question.

The case is Magyar Helsinki Bizottság v Hungary (18030/11). The applicant, a human rights NGO, asked police forces to disclose information about ‘public defenders’, i.e. defence counsel appointed in criminal proceedings. The police forces refused, and the Hungarian court refused to order disclosure. The applicant complains that the refusal interferes with its rights under Article 10.

The Bizottság case was heard by the Grand Chamber today.

The UK government was an intervener. It urged the Court to conclude that Article 10 ECHR does not create a right to receive information from a public authority, in accordance with a line of authority (Leander v Sweden (1987) 9 EHRR 433, Gaskin v United Kingdom (1990) 12 EHRR 36, Guerra v Italy (1998) 26 EHRR 357 and Roche v United Kingdom (2006) 42 EHRR 30).

The Hungarian government’s position was to the same effect. It contended that concessions made in cases supporting the link between Article 10 and freedom of information (such as Társaság a Szabadsagjogokert v Hungary (2011) 53 EHRR 3 and Kenedi v Hungary 27 BHRC 335) were fact-specific.

Statutory rights to freedom of information in England and Wales are currently under threat of curtailment. Kennedy introduced (or confirmed) that, at least in certain circumstances, freedom of information also has a common law foundation. The Grand Chamber’s judgment in Bizottság will reveal whether, in addition to its statutory and common law pillars, freedom of information has a human rights basis as well.

Jason Coppel QC, Karen Steyn QC and Christopher Knight of 11KBW represented intervening parties in Bizottság.

Robin Hopkins @hopkinsrobin

 

Some results may have been removed under data protection law in Europe. Learn more.

July 3rd, 2014 by Robin Hopkins

This is the message that now regularly greets those using Google to search for information on named individuals. It relates, of course, to the CJEU’s troublesome Google Spain judgment of 13 May 2014.

I certainly wish to learn more.

So I take Google up on its educational offer and click through to its FAQ page, where the folks at Google tell me inter alia that “Since this ruling was published on 13 May 2014, we’ve been working around the clock to comply. This is a complicated process because we need to assess each individual request and balance the rights of the individual to control his or her personal data with the public’s right to know and distribute information”.

The same page also leads me to the form on which I can ask Google to remove from its search results certain URLs about me. I need to fill in gaps like this: “This URL is about me because… This page should not be included as a search result because…” 

This is indeed helpful in terms of process, but I want to understand more about the substance of decision-making. How does (and/or should) Google determine whether or not to accede to my request? Perhaps understandably (as Google remarks, this is a complicated business on which the dust is yet to settle), Google doesn’t tell me much about that just yet.

So I look to the obvious source – the CJEU’s judgment itself – for guidance. Here I learn that I can in principle ask that “inadequate, irrelevant or no longer relevant” information about me not be returned through a Google search. I also get some broad – and quite startling – rules of thumb, for example at paragraph 81, which tells me this:

“In the light of the potential seriousness of that interference, it is clear that it cannot be justified by merely the economic interest which the operator of such an engine has in that processing. However, inasmuch as the removal of links from the list of results could, depending on the information at issue, have effects upon the legitimate interest of internet users potentially interested in having access to that information, in situations such as that at issue in the main proceedings a fair balance should be sought in particular between that interest and the data subject’s fundamental rights under Articles 7 and 8 of the Charter. Whilst it is true that the data subject’s rights protected by those articles also override, as a general rule, that interest of internet users, that balance may however depend, in specific cases, on the nature of the information in question and its sensitivity for the data subject’s private life and on the interest of the public in having that information, an interest which may vary, in particular, according to the role played by the data subject in public life.”

So it seems that, in general (and subject to the sensitivity of the information and my prominence in public life), my privacy rights trump Google’s economic rights and other people’s rights to find information about me in this way. So the CJEU has provided some firm steers on points of principle.

But still I wish to learn more about how these principles will play out in practice. Media reports in recent weeks have told us about the volume of ‘right to be forgotten’ requests received by Google.

The picture this week has moved on from volumes to particulars. In the past few days, we have begun to learn how Google’s decisions filter back to journalists responsible for the content on some of the URLs which objectors pasted into the forms they sent to Google. We learn that journalists and media organisations, for example, are now being sent messages like this:

“Notice of removal from Google Search: we regret to inform you that we are no longer able to show the following pages from your website in response to certain searches on European versions of Google.”

Unsurprisingly, some of those journalists find this puzzling and/or objectionable. Concerns have been ventilated in the last day or two, most notably by the BBC’s Robert Peston (who feels that, through teething problems with the new procedures, he has been ‘cast into oblivion’) and The Guardian’s James Ball (who neatly illustrates some of the oddities of the new regime). See also The Washington Post’s roundup of UK media coverage.

That coverage suggests that the Google Spain ruling – which made no overt mention of free expression rights under Article 10 ECHR – has started to bite into the media’s freedom. The Guardian’s Chris Moran, however, has today posted an invaluable piece clarifying some misconceptions about the right to be forgotten. Academic commentators such as Paul Bernal have also offered shrewd insights into the fallout from Google Spain.

So, by following the trail from Google’s pithy new message, I am able to learn a fair amount about the tenor of this post-Google Spain world.

Inevitably, however, given my line of work, I am interested in the harder edges of enforcement and litigation: in particular, if someone objects to the outcome of a ‘please forget me’ request to Google, what exactly can they do about it?

On such questions, it is too early to tell. Google says on its FAQ page that “we look forward to working closely with data protection authorities and others over the coming months as we refine our approach”. For its part, the ICO tells us that it and its EU counterparts are working hard on figuring this out. Its newsletter from today says for example that:

“The ICO and its European counterparts on the Article 29 Working Party are working on guidelines to help data protection authorities respond to complaints about the removal of personal information from search engine results… The recommendations aim to ensure a consistent approach by European data protection authorities in response to complaints when takedown requests are refused by the search engine provider.”

So for the moment, there remain lots of unanswered questions. For example, the tone of the CJEU’s judgment is that DPA rights will generally defeat economic rights and the public’s information rights. But what about a contest between two individuals’ DPA rights?

Suppose, for example, that I am an investigative journalist with substantial reputational and career investment in articles about a particular individual, and that individual then persuades Google to ensure that my articles do not surface in EU Google searches for his name. Those articles also contain my name, work and opinions, i.e. they also contain my personal data. In acceding to the ‘please forget me’ request without seeking my input, could Google be said to have processed my personal data unfairly, whittling away my online personal and professional output (at least to the extent that the relevant EU Google searches are curtailed)? Could this be said to cause me damage or distress? If so, can I plausibly issue a notice under s. 10 of the DPA, seek damages under s. 13, or ask the ICO to take enforcement action under s. 40?

The same questions could arise, for example, if my personal backstory is heavily entwined with that of another person who persuades Google to remove from its EU search results articles discussing both of us – that may be beneficial for the requester, but detrimental to me in terms of the adequacy of personal data about me which Google makes available to the interested searcher.

So: some results may have been removed under data protection law in Europe, and I do indeed wish to learn more. But I will have to wait.

Robin Hopkins @hopkinsrobin

 

GCHQ’s internet surveillance – privacy and free expression join forces

July 3rd, 2014 by Robin Hopkins

A year ago, I blogged about Privacy International’s legal challenge – alongside Liberty – against GCHQ, the Security Services and others concerning the Prism/Tempora programmes which came to public attention following Edward Snowden’s whistleblowing. That case is now before the Investigatory Powers Tribunal. It will be heard for 5 days, commencing on 14 July.

Privacy International has also brought a second claim against GCHQ: in May 2014, it issued proceedings concerning the use of ‘hacking’ tools and software by intelligence services.

It has been announced this week that Privacy International is party to a third challenge which has been filed with the Investigatory Powers Tribunal. This time, the claim is being brought alongside 7 internet service providers: GreenNet (UK), Chaos Computer Club (Germany), GreenHost (Netherlands), Jinbonet (Korea), Mango (Zimbabwe), May First/People Link (US) and Riseup (US).

The claim is interesting on a number of fronts. One is the interplay between global reach (see the diversity of the claimants’ home countries) and this specific legal jurisdiction (the target is GCHQ and the jurisdiction is the UK – as opposed, for example, to bringing claims in the US). Another is that it sees private companies – and therefore Article 1 Protocol 1 ECHR issues about property, business goodwill and the like – surfacing in the UK’s internet surveillance debate.

The claim also raises the privacy rights not only of ‘ordinary’ citizens (network users) but also, specifically, those of the claimants’ employees.

Finally, this claim sees the right to free expression under Article 10 ECHR – conspicuously absent, for example, in the Google Spain judgment – flexing its muscle in the surveillance context. Privacy and free expression rights are so often in tension, but here they make common cause.

The claims are as follows (quoting from the claimants’ press releases):

(1) By interfering with network assets and computers belonging to the network providers, GCHQ has contravened the UK Computer Misuse Act and Article 1 of the First Additional Protocol (A1AP) of the European Convention of Human Rights (ECHR), which guarantees the individual’s peaceful enjoyment of their possessions

(2) Conducting surveillance of the network providers’ employees is in contravention of Article 8 ECHR (the right to privacy) and Article 10 ECHR (freedom of expression)

(3) Surveillance of the network providers’ users that is made possible by exploitation of their internet infrastructure, is in contravention of Arts. 8 and 10 ECHR; and

(4) By diluting the network providers’ goodwill and relationship with their users, GCHQ has contravened A1AP ECHR.

Robin Hopkins @hopkinsrobin

 

Yet more on Article 10 ECHR and FOIA

July 3rd, 2013 by Robin Hopkins

The question of whether the right to freedom of expression conferred by Article 10 of the European Convention on Human Rights has a bearing on the Freedom of Information Act 2000 (particularly as regards absolute exemptions) is an interesting and important one. The Supreme Court will address it later this year in the Kennedy litigation.

In the meantime, there is free expression aplenty on this issue within the Panopticon fold. Joseph Barrett’s post from earlier today is not the only example; Christopher Knight’s recent piece in Public Law is a must-read. The reference is: CJS Knight, ‘Article 10 and a Right of Access to Information’ [2013] PL 468.

Robin Hopkins

 

Redacting for anonymisation: Article 8 v Article 10 in child protection context

December 13th, 2012 by Robin Hopkins

Panopticon has reported recently on the ICO’s new Code of Practice on Anonymisation: see Rachel Kamm’s post here. That Code offers guidance for ensuring data protection-compliant disclosure in difficult cases such as those involving apparently anonymous statistics, and situations where someone with inside knowledge (or a ‘motivated intruder’) could identify someone referred to anonymously in a disclosed document. The Upper Tribunal in Information Commissioner v Magherafelt District Council [2012] UKUT 263 (AAC) grappled with those issues earlier this year in the context of disclosing a summarised schedule of disciplinary action.
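
The ‘motivated intruder’ problem is easier to grasp with a toy illustration. The following Python sketch uses entirely invented data and hypothetical helper names (nothing in it is drawn from the Magherafelt case or from the ICO Code), but it shows how a record that looks anonymous once names are stripped can still be pinned to an individual by someone able to link the residual details, such as a department and a rough date, to inside knowledge.

# Illustrative sketch only: invented data and hypothetical names, showing how
# a 'motivated intruder' can re-identify a nominally anonymised record by
# linking residual details (quasi-identifiers) rather than names.

# A published, 'anonymised' schedule of disciplinary action: names removed,
# but department, month and sanction retained.
published_schedule = [
    {"department": "Planning", "month": "2011-03", "sanction": "written warning"},
    {"department": "Leisure", "month": "2011-06", "sanction": "dismissal"},
    {"department": "Finance", "month": "2011-06", "sanction": "written warning"},
]

# What the intruder already knows or can discover: only one person in the
# Leisure department left abruptly in mid-2011.
insider_knowledge = {"department": "Leisure", "month": "2011-06"}

def possible_matches(schedule, known):
    """Return the published rows consistent with the intruder's knowledge."""
    return [
        row for row in schedule
        if all(row.get(key) == value for key, value in known.items())
    ]

matches = possible_matches(published_schedule, insider_knowledge)
if len(matches) == 1:
    # A single consistent row means the 'anonymous' entry is effectively
    # identified: the intruder now knows which sanction that person received.
    print("Re-identified:", matches[0])
else:
    print(len(matches), "candidate rows; re-identification is not certain.")

The point is simply that removing names narrows, rather than eliminates, the pool of people to whom a record could relate; the Code’s question is whether the residual details leave that pool small enough for a motivated intruder to close the gap.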

Redaction is often crucial in achieving anonymisation. Getting redaction right can be difficult: too much redaction undermines transparency, too little undermines privacy. The Court of Appeal’s recent judgment In the matter of X and Y (Children) [2012] EWCA Civ 1500 is a case in point. It involved the publication of a summary report from a serious case review by a Welsh local authority’s Safeguarding Children Board. The case involved very strong competing interests in terms of Article 8 and Article 10 ECHR. For obvious reasons (anonymity being the key concern here) little could be said of the underlying facts, but the key points are these.

A parent was convicted in the Crown Court of a serious offence relating to one of the children of the family (X). The trial received extensive coverage in the local media. The parent was named. The parent’s address was given. The fact that there were other siblings was reported, as was their number. All of this coverage was lawful.

The local authority’s Safeguarding Children Board conducted a Serious Case Review in accordance with the provisions of the Children Act 2004 and The Local Safeguarding Children Boards (Wales) Regulations 2006. Those Regulations require the Board to produce an “overview report” and also an anonymised summary of the overview report. The relevant Guidance provides that the Board should also “arrange for an anonymised executive summary to be prepared, to be made publicly available at the principal offices of the Board”.

Here two features of the draft Executive Summary were pivotal.

First, reference was made to the proceedings in the Crown Court in such a way as would enable many readers to recognise immediately which family was being referred to, and would enable anyone else so inclined to obtain that information with only a few minutes’ searching of the internet.

Second, it referred, and in some detail, to the fact, which had not emerged during the proceedings in the Crown Court and which is not in the public domain, that another child in the family (Y) had also been the victim of parental abuse.

The local authority wanted to publish the Executive Summary, seeking to be transparent about its efforts to put right what had gone wrong and to show that it had learned lessons from X’s death. It recognised the impact on Y, but argued for a relaxation of a restricted reporting order to allow it to publish the Executive Summary with some redactions. It was supported by media organisations, which were legally represented.

The judge (Peter Jackson J) undertook a balance of interests under Articles 8 and 10. He allowed publication, with redactions which were (in the Court of Appeal’s words) “in substance confined to three matters: the number, the gender and the ages of the children.”

In assessing the adequacy of these redactions, the Court of Appeal considered this point from the judgment of Baroness Hale in ZH (Tanzania) v Secretary of State for the Home Department [2011] UKSC 4, [2011] 2 AC 166, at paragraph 33:

“In making the proportionality assessment under article 8, the best interests of the child must be a primary consideration. This means that they must be considered first. They can, of course, be outweighed by the cumulative effect of other considerations.”

Munby LJ thus concluded (paragraph 47 of his judgment) that “it will be a rare case where the identity of a living child is not anonymised”.

He recognised, on the other hand, that Article 10 factors always retained their importance: “there could be circumstances where the Article 8 claims are so dominant as to preclude publication altogether, though I suspect that such occasions will be very rare.”

On the approach to anonymisation through redaction, Munby LJ had this to say (paragraph 48):

“In some cases the requisite degree of anonymisation may be achieved simply by removing names and substituting initials. In other cases, merely removing a name or even many names will be quite inadequate. Where a person is well known or the circumstances are notorious, the removal of other identifying particulars will be necessary – how many depending of course on the particular circumstances of the case.”

In the present case, the redactions had been inadequate. They did not “address the difficulty presented by the two key features of the draft, namely, the reference to the proceedings in the Crown Court and the reference to the fact that Y had also been the victim of parental abuse” (paragraph 53).

Far more drastic redaction was required in these circumstances: to that extent, privacy trumped transparency, notwithstanding the legislation and the Guidance’s emphasis on disclosure. In cases such as this (involving serious incidents with respect to children), those taking disclosure decisions should err on the side of heavy redaction.
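
Munby LJ’s warning that removing names and substituting initials can be “quite inadequate” where the circumstances are notorious can also be illustrated with a toy example. The Python sketch below is purely illustrative (the text, names and helper functions are all invented; none of it comes from the judgment or from any real executive summary): names-only redaction leaves the identifying work to be done by the remaining particulars, and stripping those particulars is the more drastic exercise genuine anonymisation may require.

import re

# Illustrative sketch only: invented text and hypothetical helpers comparing
# names-only redaction with the removal of other identifying particulars.
summary = (
    "John Smith of Anytown was convicted at the Crown Court in March. "
    "The family's three children remained at the address in Anytown."
)

def redact_names(text, names):
    """Substitute initials for each full name (the 'simple' approach)."""
    for name in names:
        initials = ". ".join(part[0] for part in name.split()) + "."
        text = re.sub(re.escape(name), initials, text)
    return text

def redact_particulars(text, particulars, placeholder="[redacted]"):
    """Remove other identifying details (places, dates, numbers of children)."""
    for detail in particulars:
        text = re.sub(re.escape(detail), placeholder, text, flags=re.IGNORECASE)
    return text

# Names-only redaction: anyone who followed the reported trial can still
# recognise the family from the place, the timing and the number of children.
print(redact_names(summary, ["John Smith"]))

# The more drastic redaction contemplated where the circumstances are notorious.
print(redact_particulars(redact_names(summary, ["John Smith"]),
                         ["Anytown", "March", "three"]))

The first output remains recognisable to anyone who followed the trial; the second strips the particulars that actually do the identifying, which is the kind of redaction the Court of Appeal held was needed in a case like this.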

Robin Hopkins