Time to End the Time Debate

The apparently endless APPGER litigation has produced yet another decision of the Upper Tribunal for seasoned FOIA watchers which, amongst some very fact-specific issues, also contains two important clarifications of law: APPGER v ICO & FCO [2015] UKUT 377 (AAC).

As anyone who has ever done any information law will know, the APPGER litigation concerns requests under FOIA for information related to alleged British involvement in extraordinary rendition. Some information has been released, some has been released following earlier rounds of litigation, and some remains withheld under various exemptions.

Following previous hearings staying various points, the present round of litigation concerned the application of section 23 (the security bodies exemption) and section 27 (international relations). There were two points of wider interest discussed in particular. One is the time at which the public interest is assessed (relevant to section 27), and one is the breadth of the “relates to” limb of section 23.

The time point was one which only really arose because of the Upper Tribunal’s desire to throw a mangy cat amongst the pigeons by suggesting in Defra v ICO & Badger Trust [2014] UKUT 526 (AAC) at [44]-[48] that the correct time to assess the public interest might be the date of Tribunal hearing. As some wise and learned commentators have pointed out, this rather seemed to have been overtaken by the Supreme Court’s – technically obiter – reasoning in R (Evans) v Attorney General [2015] UKSC 21 at [72]-[73] that the time was at the point of the authority’s refusal.

The Upper Tribunal in APPGER (containing at least one member of the panel in Badger Trust) issued a mea culpa and accepted that Evans was right: at [49]-[57]. It did not reach any more specific decision on situations where, for example, the authority has been late in complying. Doubtless the difference in time will often not matter very much. But the principle of the point now seems resolved.

Section 23(1) was not a point answered by Evans, and an argument was run by the requestor that “relates to” should be construed narrowly, as in the DPA. The Upper Tribunal disagreed: at [15]-[19]. The ordinary meaning of the language was broad, it was consistent with the aim of shutting the backdoor to the security bodies, it was consistent with authority, and met the contextual aim of FOIA where the contextual aim of the DPA was very different. The idea of requiring a “focus or main focus” was rejected.

Whilst agreeing that it should not attempt to gloss the statutory language, the Upper Tribunal nonetheless sought to assist future cases by indicating that asking whether the information requested had been supplied to a security body for the purposes of the discharge of its statutory functions (a test attributed to Mitting J) would have considerable utility. It would enable a clear explanation, it would allow differentiation within and without the scope of the exemption, and it was less likely to require a detailed line-by-line approach to redactions: at [33]. The language remains broad, but the practical application of it appears to have been ‘guided’ into a slightly narrower pigeon-hole than might have otherwise been the case.

The judgment as a whole is worth reading on the application of those exemptions to the particular information and the treatment of the evidence by the Upper Tribunal, but those two points of principle are the keys to take away. And about time too.

Timothy Pitt-Payne QC and Joanne Clement appeared for APPGER; Karen Steyn QC appeared for the FCO; Robin Hopkins appeared for the ICO.

Christopher Knight

Facebook, child protection and outsourced monitoring

Facebook is no stranger to complaints about the content of posts. Usually, one user complains to Facebook about what other users’ posts say about him. By making the offending posts available, Facebook is processing the complainant’s personal data, and must do so in compliance with data protection law.

More unusually, a user could also complain about their own Facebook posts. Surely a complainant cannot make data protection criticisms about information they deliberately posted about themselves? After all, Facebook processes those posts with the author’s consent, doesn’t it?

Generally, yes – but that will not necessarily be true in every instance, especially when it comes to Facebook posts by children. This is the nature of the complaint in striking litigation currently afoot before the High Court in Northern Ireland.

The case is HL v Facebook Inc, Facebook Ireland Ltd, the Northern Health & Social Care Trust and DCMS [2015] NIQB 61. It is currently only in its preliminary stages, but it raises very interesting and important issues about Facebook’s procedures for preventing underage users from utilising the social network. Those issues are illuminated in the recent judgment of Stephens J, who is no stranger to claims against Facebook – he heard the recent case of CG v Facebook [2015] NIQB 11, concerning posts about a convicted paedophile.

From the age of 11 onwards, HL maintained a Facebook page on which she made posts of an inappropriate sexual nature. She was exposed to responses from sexual predators. She says that Facebook is liable for its failure to prevent her from making these posts. She alleges that Facebook (i) unlawfully processed her sensitive personal data, (ii) facilitated her harassment by others, and (iii) was negligent in failing to have proper systems in place to minimise the risks of children setting up Facebook accounts by lying about their age.

The data protection claim raises a number of issues of great importance to the business of Facebook and others with comparable business models. One is the extent to which a child can validly consent to the processing of their personal data – especially sensitive personal data. Minors are (legitimately or not) increasingly active online, and consent is a cornerstone of online business. The consent issue is one of wide application beyond the HL litigation.

A second issue is whether, in its processing of personal data, Facebook does enough to stop minors using their own personal data in ways which could harm them. In her claim, for example, HL refers to evidence given to a committee of the Australian Parliament – apparently by a senior privacy advisor to Facebook (though Facebook was unable to tell Stephens J who he was). That evidence apparently said that Facebook removes 20,000 under-age user profiles a day.

Stephens J was also referred to comments apparently made by a US Senator to Mark Zuckerberg about the vulnerability of underage Facebook users.

Another element of HL’s case concerns Facebook’s use of an outsourcing company called oDesk, operating for example from Morocco, to moderate complaints about Facebook posts. She calls into question the adequacy of these oversight measures: ‘where then is the oversight body for these underpaid global police?’ (to quote from a Telegraph article referred to in the recent HL judgment). Facebook says that – given its number of users in multiple languages across the globe – effective policing is a tall order (an argument Stephens J summed up at paragraph 22 as ‘the needle in a haystack argument, there is just too much to monitor, the task of dealing with underage users is impossible’).

In short, HL says that Facebook seems to be aware of the scale and seriousness of the problem of underage use of its network and has not done enough to tackle that problem.

Again, the issue is one of wider import for online multinationals for whom personal data is stock-in-trade.

The same goes for the third important data protection issue surfacing in the HL litigation. This concerns jurisdiction, cross-border data controllers and section 5 of the Data Protection Act 1998. For example, is Facebook Ireland established in the UK by having an office, branch or agency, and does it process the personal data in Facebook posts in the context of that establishment?

These issues are all still to be decided. Stephens J’s recent judgment in HL was not about the substantive issues, but about HL’s applications for specific discovery and interrogatories. He granted those applications. In addition to details of HL’s Facebook account usage, he ordered the Facebook defendants to disclose agreements between them and Facebook (UK) Ltd and between them and oDesk (to whom some moderating processes were outsourced). He has also ordered the Facebook defendants to answer interrogatory questions about their procedures for preventing underage Facebook use.

In short, the HL litigation has – thus far – raised difficult data protection and privacy issues which are fundamental to Facebook’s business, and it has required Facebook to lay bare internal details of its safeguarding practices. The case is only just beginning. The substantive hearing, which is listed for next term, could be groundbreaking.

Robin Hopkins @hopkinsrobin

Journalism and data protection – new Strasbourg judgment

There has been much debate of late as to how data privacy rights should be reconciled with journalistic freedoms under the data protection legislation. This is a difficult issue which surfaced domestically in the recent case of Steinmetz & Ors v Global Witness and is now being debated across Europe in the context of the controversial right to be forgotten regime. One of the many important questions which remains at large on this issue is: what degree of protection is to be afforded under the data protection legislation to those publication activities which might be said to be of low public interest value (i.e. they satisfy the curiosity of readers but do not per se contribute to public debate)?

It was precisely this question which the European Court of Human Rights was recently called upon to consider in the case of Satakunnan Markkinapörssi Oy and Satamedia Oy v Finland (Application No. 931/13). In Satamedia, the Finnish Supreme Court had concluded that a magazine which published publicly available tax data could lawfully be prevented from publishing that data on the basis that this was required in order to protect the data privacy rights of the individuals whose tax data was in issue. The Finnish Court held that this constituted a fair balancing of the Article 10 rights of the publishers and the data privacy rights of affected individuals, particularly given that: (a) the freedom of expression derogation provided for under the Finnish data protection legislation had to be interpreted strictly and (b) the publication of the tax data was not itself required in the public interest, albeit that it may have satisfied the curiosity of readers. The owners of the magazine took the case to Strasbourg. They argued that the conclusions reached by the Finnish Court constituted an unjustified interference with their Article 10 rights. The Strasbourg Court disagreed. It concluded that the Finnish Court had taken into account relevant Strasbourg jurisprudence on the balancing of Article 10 and Article 8 rights (including Von Hannover v Germany (no. 2) and Axel Springer AG v Germany) and had arrived at a permissible result in terms of the balancing of the relevant interests (see para. 72).

There are three key points emerging from the judgment:

– first, it confirms the point made not least in the ICO’s recent guidance on data protection and the media, namely that there is no blanket protection for journalistic activities under the data protection legislation;

– second, it makes clear that, where there is a clash between data privacy rights and Article 10 rights, the courts will closely scrutinise the public interest value of the publication in issue (or lack thereof);

– third, it confirms that the lower the public interest value of the publication in question (as assessed by the court), the more likely it is that the rights of the data subject will be treated as pre-eminent.

Anya Proops


Right to be forgotten claim rejected by the administrative court

So here’s the question: you’re an individual who wants to have certain links containing information about you deindexed by Google; Google has refused to accede to your request and, upon complaint to the ICO, the Commissioner has decided that your complaint is unfounded and so he refuses to take enforcement action against Google under s. 40 DPA 1998; can you nonetheless secure the result you seek in terms of getting your data forgotten by mounting a judicial review challenge to the ICO’s decision? Well, if the recent decision of the Administrative Court in the case of R (Khashaba) v Information Commissioner (CO/2399/2015) is anything to go by, it seems that you’ll be facing a rather mountainous uphill struggle.

In Khashaba, Mr Khashaba had complained to the Commissioner about Google’s refusal to de-index certain articles which apparently contained information revealing that Mr Khashaba had failed in his legal attempts to get his gun licences reinstated and had also failed to obtain placement on the Register of Medical Specialists in Ireland. The Commissioner concluded that Google had acted lawfully under the DPA 1998 in refusing to de-index the articles in question. Mr Khashaba was evidently unhappy with this result. Accordingly, he brought a judicial review claim against the Commissioner in which he contended in essence that the Commissioner had erred: (a) when he concluded, in exercise of his assessment powers under s. 42, that Google had acted lawfully in refusing to de-index the articles and (b) by failing to take enforcement action against Google under s. 40. By way of an order dated 17 July 2015, Hickinbottom J dismissed Mr Khashaba’s application for permission to judicially review the Commissioner’s decision. His reasoning was based on the Commissioner’s summary grounds, upon which the court felt itself unable to improve:

– first, permission was refused on the ground that Mr Khashaba had an alternative remedy because it was open to him to bring proceedings against Google directly in connection with its refusal of his application to be forgotten;

– second, the Commissioner had a wide discretion under s. 42 as to the manner in which he conducts his assessment and as to his conclusions on breach. He also had a wide discretion when it came to the issue of enforcement under s. 40. There was no basis for concluding that the way in which the Commissioner had exercised his powers in response to Mr Khashaba’s complaint was unreasonable or otherwise disproportionate.

All of which tends to suggest that: (a) the courts are likely to be very slow in impugning a decision of the Commissioner that particular information should not be forgotten and (b) that, if you’re an applicant who wants your data to be forgotten, you may yet find that the regulatory route offers little by way of comfort in terms of securing the necessary amnesiac effect.

11KBW’s Christopher Knight represented the Commissioner.

Anya Proops


FOIA Under Review

An important rule of Government is to outsource anything difficult or potentially controversial to an independent body which can then deliver a report to be ignored or implemented as required, or as the political mood dictates. The recent investigation into new runways at Heathrow was a good example, at least until it came up with an answer the Prime Minister didn’t entirely want to hear, and the Commission on a Bill of Rights was a superlative instance of a very learned study which achieved precisely nothing other than kicking a political football into the long grass.

Now it is the turn of the Freedom of Information Act 2000 to undergo scrutiny by the Independent Commission on Freedom of Information. Snappy title. It is chaired by Lord Burns (former senior civil servant at HM Treasury) and contains such luminaries as Jack Straw, Lord Michael Howard, Lord Carlile and Dame Patricia Hodgson (of Ofcom). Just in case anyone was suffering under the delusion that the Commission would be looking into widening the scope and application of FOIA, the terms of reference set by the Cabinet Office are:

  • whether there is an appropriate public interest balance between transparency, accountability and the need for sensitive information to have robust protection
  • whether the operation of the Act adequately recognises the need for a ‘safe space’ for policy development and implementation and frank advice
  • the balance between the need to maintain public access to information, the burden of the Act on public authorities and whether change is needed to moderate that while maintaining public access to information

One would not, however, wish readers to think that the Government were anything less than fully committed to revealing information. On the contrary, the written statement laid by the Minister, Lord Bridges, opens by saying “We are committed to being the most transparent government in the world.” Well, quite. “We fully support the Freedom of Information Act [could there be a ‘but’ coming?] but [ah yes, there it is] after more than a decade in operation it is time that the process is reviewed, to make sure it’s working effectively.” The new Commission has a webpage here and is to report by November, which gives the grass limited time to lengthen… The Commission won’t, of course, be able to do anything about the EIRs.

Responsibility for FOIA has also been transferred to the Cabinet Office, which at least gives Michael Gove one less constitutional headache to deal with.

Christopher Knight

DRIPA 2014 declared unlawful

In a judgment of the Divisional Court handed down this morning, Bean LJ and Collins J have declared section 1 of the Data Retention and Investigatory Powers Act 2014 (DRIPA) to be unlawful.

For the background to that legislation, see our posts on Digital Rights Ireland and then on the UK’s response, i.e. passing DRIPA in an attempt to preserve data retention powers.

That attempt has today suffered a serious setback via the successful challenges brought by the MPs David Davis and Tom Watson, as well as Messrs Brice and Lewis. The Divisional Court did, however, suspend the effect of its order until after 31 March 2016, so as to give Parliament time to consider how to put things right.

Analysis to follow in due course, but for now, here is the judgment: Davis Watson Judgment.

Robin Hopkins @hopkinsrobin