Vidal-Hall to the Supreme Court

Has the announcement of the death of section 13(2) DPA been premature? Might it, after all, be nuzzling up to the bars, ready to go ‘Voom’? Perhaps, for the Supreme Court has taken on the role of Burke and Hare, announcing today that it has given leave to appeal on the following two questions:

  1. Whether the Court of Appeal was right to hold that section 13(2) of the Data Protection Act 1998 was incompatible with Article 23 of the Directive.
  2. Whether the Court of Appeal was right to disapply section 13(2) of the Data Protection Act 1998 on the grounds that it conflicts with the rights guaranteed by Articles 7 and 8 of the EU Charter of Fundamental Rights.

A further question, on whether it was correct to classify the misuse of private information claims as tortious ones, was refused leave, presumably on the basis that the Supreme Court only wants to think about the super-cool DPA issues.

A hearing is highly unlikely before 2016, but Panopticon will let you know when it knows. In the meantime, section 13(2) is still dead, so get your damages while they are still hot…

Christopher Knight

Circle the Wagons: They are Coming for the Information Tribunal

We all fell for it, didn’t we? If the greatest trick the Devil ever pulled was convincing the world he didn’t exist, then Michael Gove’s may have been to convince everyone that he wasn’t interested in FOIA. His shunting of responsibility for FOIA/EIR matters off to the Cabinet Office, and the Cabinet Office’s announcement last week of the Commission on Freedom of Information (generally staffed by people who publicly don’t much like it), have led to a lot of comment and reaction – mostly adverse – from social media, blogs and even the mainstream press.

And that has rather caused everyone to take their eye off the ball, including Panopticon (which was alerted by an informant known only as Deep Throat), because, in the midst of the kerfuffle over the possible threat to the substance of aspects of FOIA posed by the new Commission, the Ministry of Justice has announced a consultation on a more insidious threat to seekers of information and transparency: the introduction of Tribunal fees.

Contained within a document which is also the Government’s Response to an earlier consultation exercise on raising fees in various aspects of civil litigation (also problematic, but not relevant here) is a consultation on the introduction, for the first time, of fees to use certain parts of the First-tier Tribunal and the Upper Tribunal. The Upper Tribunal (Administrative Appeals Chamber) is not within the scope of the proposal – although there is no explanation as to why not – but the First-tier Tribunal (General Regulatory Chamber) is.

The proposal is that there will be a £100 fee for an appeal to be issued, and a further £500 fee if an oral hearing takes place. Cases referred to the Upper Tribunal under rule 19 of the GRC Rules will also be subject to the same fee. There will be a system of remissions in place.

This gives rise to a number of issues. Anyone who has had anything to do with the Employment Tribunals over the last few years will know that since the introduction of fees (and, to be fair, compulsory settlement discussion periods) the workload of the ET has gone through the floor. It seems highly likely that something very similar will happen for FOIA/EIR appeals, where so many of the cases are brought by individuals, many of whom will not be able to afford to spend that kind of money on something which, unlike an ET claim, has no prospect of ever winning them any money and which relates almost invariably to the public interest rather than their own private interest. There must be a real difference of type between such litigation and private interest litigation elsewhere in the GRC. (To be fair, it is of course the case that judicial reviews brought on a public interest basis still incur fees, but they are the exception to the ordinary use of Part 54, whereas the public interest is at the heart of the vast majority of FOIA appeals, which are assessed on that basis.) Why, when the legislation is requestor-blind, should the Tribunal system not be too? Alternatively, one might mount a plausible argument that if a public authority wishes to appeal a Decision Notice then it should have to pay a fee (because it is seeking to avoid transparency), but a requestor should not.

It is pretty likely, although the figures aren’t given in the consultation, that FOIA/EIR appeals make up a large proportion of the GRC’s work. But given their public interest element, are they really the cases to which a fee should be targeted? The GRC also hears appeals concerning the regulation of estate agents, driving instructors, claims management services and exam boards (amongst others). Those are all private interests, as are appeals against fines levied on public authorities for bugging phones without warrants. Is not an appeal about the release of public authority information worthy of greater ring-fencing from fees than an appeal about Defra banning you from micro-chipping a dog? (I may be barking up the wrong tree, but I am not making this up.)

On the Government’s calculations, the GRC costs it £1.6m a year. At best, it expects to recover £0.4m in fees, and that figure will fall if the caseload drops as a result of the introduction of fees (which, bizarrely, it apparently does not anticipate). One might think that that was a relative drop in the ocean, although of course every penny counts, but fees won’t pay the GRC’s way (and Gambling already contributes disproportionately, given the very high fees and the very low number of cases).

But aside from the issues of principle, there are also real problems of practice, particularly around the hearing fee. The proposal says: “The claimant may alternatively elect for an oral hearing, in which case a further fee of £500 would be payable.” But this doesn’t reflect the reality of the GRC Rules. Rule 32 requires the GRC to hold a hearing where one party requests it: what if the ICO or the public authority requests one? What if both do? Who pays then? What if the Tribunal itself lists a hearing, against the wishes of the parties, because it thinks it cannot do justice without one under rule 32(1)(b)? Who pays for that decision? The proposal appears to anticipate that the appellant will still have to pay the fee, presumably on the basis that it is their ‘fault’ that the appeal exists at all, but that seems very unfair. What about directions hearings – does the fee apply to those, and who has to request one for it to be triggered? The proposal seems remarkably ill-thought-through, and responses to the consultation will need to point that out.

Why is it the GRC which is being targeted by fees rather than the appellate stage in the Upper Tribunal? Surely having a second bite on appeal is more worthy of a financial penalty, and a discouragement to unnecessary appeals on the facts? If fees still apply to rule 19 transfers, will it not be in the interest of every litigant to try and get a case transferred to the UT on the basis that if he has to pay, he may as well get the best court he can for his buck?

 

The consultation paper and the impact assessment on tribunal fees are both online. Panopticon strongly encourages readers to respond to the consultation, which closes on 15 September 2015.

Christopher Knight

Child Protection and Data Protection

The spectre of Jimmy Savile casts a long shadow and now it extends to data protection, the Data Protection Act 1998 being the latest august and uniformly popular institution (following the BBC, Broadmoor and Margaret Thatcher, to name just some) to suffer as a result of his actions. The perennial sight of investigations and public inquiries into historic sex abuse of children in local authority care, chiefly arising out of the wider ramifications of Operation Yewtree, has provided local authorities with a very ready explanation of the need to retain child protection data.

The fifth data protection principle says that data should not be kept for longer than necessary for the purposes for which it is processed, whereas the reality will often be that the information of greatest significance (accusations of abuse or records of care) only becomes significant after a considerable passage of time, once the child has grown into an adult able to confront the abuse they have suffered.

As a result, there is no consistent practice across the country. The High Court in R (C) v Northumberland County Council & ICO [2015] EWHC 2134 (Admin) was informed that authorities adopt approaches which range from retention until the 21st birthday, to six years after the 18th birthday, to 75 years from the date of birth, to 35 years from the closure of a case: at [10]. This obviously raises concerns about compliance with the DPA and Article 8 ECHR.

C sought the destruction of his child protection records held by Northumberland CC, contending that they had been retained for too long under the 35-year policy applicable in Northumberland. C considered that a period of six years after his 18th birthday should have been the cut-off point, and the ICO, intervening, agreed (although the ICO copped a lot of flak from Simon J for having issued a section 42 DPA determination indicating it was ‘likely’ the Council had complied with the DPA and for subsequently changing its mind).

The judgment of Simon J is not always the easiest to follow. It appears that the key question before the Court was whether the retention for 35 years (which clearly engaged Article 8) was in accordance with the law, and if it was, whether it was proportionate. Although the judgment does not actually reason expressly in this way, it seems as though the analysis revolves around the fifth principle: if data retention does not breach the ‘longer than necessary’ test, it will be in accordance with the law and it will be proportionate. This is not actually what the judgment says, but it must be broadly how the analysis goes (see at [9]), and it is open to some debate whether those assumptions are correct, either in law or as an analysis of the judgment.

Simon J held that the purpose of retaining child protection records was not limited to defending litigation, and so a period of six years – based on the limitation period – did not read across. The purpose was broader: it was to protect other children, to allow data subjects access in later life, and to make the information available to subsequent investigation: at [33]. The Judge was clearly influenced by the difficulty of seeing the importance of information at the time, its significance often only becoming clear through a more historical lens: at [37]. The clearest examples are, of course, Savile-esque: at [49]-[53]. A six-year cut-off period would, in the view of Simon J, restrict the ability of people over 24 to make a request and learn about the contents of their child protection file: at [45]-[47]. Simon J concluded that the Council was not required to adopt a “cumbersome and time-consuming predictive exercise” and that retention would help to identify risks only seen with hindsight: at [56]. Regular review, every seven years, was considered a disproportionate use of labour: at [58]. A period of 35 years “fell within the bracket of legitimate periods of retention”: at [61].

One can readily sympathise with the position of the Council in a case like C: an authority will be (with other linked agencies) between a rock and a hard place on child protection data. If it deletes too quickly, it risks being castigated by history for not being able to answer questions; if it does not delete, it becomes a hoarder of sensitive and traumatic data. Simon J clearly sympathised very strongly with this. However, the structure of the reasoning is regrettably unclear. The reader is left uncertain whether Simon J found the fifth principle complied with (probably, on the basis of a wider reading of its purposes), whether that meant the interference with Article 8 was in accordance with the law (presumably, but query how that works where the period only falls within a legitimate bracket), and how the structured proportionality analysis was carried out. It may well not matter on the conclusions of the judgment, but it does mean the judgment will be harder to advise on and apply in related contexts. Nor does it give much guidance as to other periods adopted: is 6 years too short and is 75 years too long? Doubtless further case law will explore the undiscovered country. In the meantime, some national guidance wouldn’t go amiss…

Paul Greatorex appeared for C, Karen Steyn QC for Northumberland and Robin Hopkins for the ICO.

Christopher Knight

Time to End the Time Debate

The apparently endless APPGER litigation has produced yet another decision of the Upper Tribunal for seasoned FOIA watchers, which, amongst some very fact-specific issues, also contains two important clarifications of the law: APPGER v ICO & FCO [2015] UKUT 377 (AAC).

As anyone who has ever done any information law ever will know, the APPGER litigation concerns requests under FOIA for information related to alleged British involvement in extraordinary rendition. Some information has been released, some has been released following earlier rounds of litigation, some remains withheld under various exemptions.

Following previous hearings staying various points, the present round of litigation concerned the application of section 23 (the security bodies exemption) and section 27 (international relations). Two points of wider interest were discussed in particular: one is the time at which the public interest is assessed (relevant to section 27), and the other is the breadth of the “relates to” limb of section 23.

The time point only really arose because of the Upper Tribunal’s desire to throw a mangy cat amongst the pigeons by suggesting in Defra v ICO & Badger Trust [2014] UKUT 526 (AAC) at [44]-[48] that the correct time at which to assess the public interest might be the date of the Tribunal hearing. As some wise and learned commentators have pointed out, this rather seemed to have been overtaken by the Supreme Court’s – technically obiter – reasoning in R (Evans) v Attorney General [2015] UKSC 21 at [72]-[73] that the relevant time was the point of the authority’s refusal.

The Upper Tribunal in APPGER (containing at least one member of the panel in Badger Trust) issued a mea culpa and accepted that Evans was right: at [49]-[57]. It did not reach any more specific decision on situations where, for example, the authority has been late in complying. Doubtless the difference in time will often not matter very much. But the principle of the point now seems resolved.

Section 23(1) was not a point answered by Evans, and an argument was run by the requestor that “relates to” should be construed narrowly, as it is in the DPA. The Upper Tribunal disagreed: at [15]-[19]. The ordinary meaning of the language was broad; it was consistent with the aim of shutting the back door to the security bodies; it was consistent with authority; and it met the contextual aim of FOIA, whereas the contextual aim of the DPA was very different. The idea of requiring a “focus or main focus” was rejected.

Whilst agreeing that it should not attempt to gloss the statutory language, the Upper Tribunal nonetheless sought to assist future cases by indicating that asking whether the information requested had been supplied to a security body for the purposes of the discharge of its statutory functions (a test attributed to Mitting J) would have considerable utility. It would enable a clear explanation, it would allow differentiation between information within and outside the scope of the exemption, and it was less likely to require a detailed line-by-line approach to redactions: at [33]. The language remains broad, but its practical application appears to have been ‘guided’ into a slightly narrower pigeon-hole than might otherwise have been the case.

The judgment as a whole is worth reading for the application of those exemptions to the particular information and the Upper Tribunal’s treatment of the evidence, but those two points of principle are the key ones to take away. And about time too.

Timothy Pitt-Payne QC and Joanne Clement appeared for APPGER; Karen Steyn QC appeared for the FCO; Robin Hopkins appeared for the ICO.

Christopher Knight

Facebook, child protection and outsourced monitoring

Facebook is no stranger to complaints about the content of posts. Usually, one user complains to Facebook about what other users’ posts say about him. By making the offending posts available, Facebook is processing the complainant’s personal data, and must do so in compliance with data protection law.

More unusually, a user could also complain about their own Facebook posts. Surely a complainant cannot make data protection criticisms about information they deliberately posted about themselves? After all, Facebook processes those posts with the author’s consent, doesn’t it?

Generally, yes – but that will not necessarily be true in every instance, especially when it comes to Facebook posts by children. This is the nature of the complaint in striking litigation currently afoot before the High Court in Northern Ireland.

The case is HL v Facebook Inc, Facebook Ireland Ltd, the Northern Health & Social Care Trust and DCMS [2015] NIQB 61. It is currently only in its preliminary stages, but it raises very interesting and important issues about Facebook’s procedures for preventing underage users from utilising the social network. Those issues are illuminated in the recent judgment of Stephens J, who is no stranger to claims against Facebook – he heard the earlier case of CG v Facebook [2015] NIQB 11, concerning posts about a convicted paedophile.

From the age of 11 onwards, HL maintained a Facebook page on which she made posts of an inappropriate sexual nature. She was exposed to responses from sexual predators. She says that Facebook is liable for its failure to prevent her from making these posts. She alleges that Facebook (i) unlawfully processed her sensitive personal data, (ii) facilitated her harassment by others, and (iii) was negligent in failing to have proper systems in place to minimise the risks of children setting up Facebook accounts by lying about their age.

The data protection claim raises a number of issues of great importance to the business of Facebook and others with comparable business models. One is the extent to which a child can validly consent to the processing of their personal data – especially sensitive personal data. Minors are (legitimately or not) increasingly active online, and consent is a cornerstone of online business. The consent issue is one of wide application beyond the HL litigation.

A second issue is whether, in its processing of personal data, Facebook does enough to stop minors using their own personal data in ways which could harm them. In her claim, for example, HL refers to evidence given to a committee of the Australian Parliament – apparently by a senior privacy advisor to Facebook (though Facebook was unable to tell Stephens J who he was). That evidence apparently said that Facebook removes 20,000 under-age user profiles a day.

Stephens J was also referred to comments apparently made by a US Senator to Mark Zuckerberg about the vulnerability of underage Facebook users.

Another element of HL’s case concerns Facebook’s use of an outsourcing company called oDesk, operating for example from Morocco, to moderate complaints about Facebook posts. She calls into question the adequacy of these oversight measures: ‘where then is the oversight body for these underpaid global police?’ (to quote from a Telegraph article referred to in the recent HL judgment). Facebook says that – given its number of users in multiple languages across the globe – effective policing is a tall order (an argument Stephens J summed up at paragraph 22 as ‘the needle in a haystack argument, there is just too much to monitor, the task of dealing with underage users is impossible’).

In short, HL says that Facebook seems to be aware of the scale and seriousness of the problem of underage use of its network and has not done enough to tackle that problem.

Again, the issue is one of wider import for online multinationals for whom personal data is stock-in-trade.

The same goes for the third important data protection issue surfacing in the HL litigation. This concerns jurisdiction, cross-border data controllers and section 5 of the Data Protection Act 1998. For example, is Facebook Ireland established in the UK by having an office, branch or agency, and does it process the personal data in Facebook posts in the context of that establishment?

These issues are all still to be decided. Stephens J’s recent judgment in HL was not about the substantive issues, but about HL’s applications for specific discovery and interrogatories. He granted those applications. In addition to details of HL’s Facebook account usage, he ordered the Facebook defendants to disclose agreements between them and Facebook (UK) Ltd and between them and oDesk (to whom some moderating processes were outsourced). He has also ordered the Facebook defendants to answer interrogatory questions about their procedures for preventing underage Facebook use.

In short, the HL litigation has – thus far – raised difficult data protection and privacy issues which are fundamental to Facebook’s business, and it has required Facebook to lay bare internal details of its safeguarding practices. The case is only just beginning. The substantive hearing, which is listed for next term, could be groundbreaking.

Robin Hopkins @hopkinsrobin

Journalism and data protection – new Strasbourg judgment

There has been much debate of late as to how data privacy rights should be reconciled with journalistic freedoms under the data protection legislation. This is a difficult issue which surfaced domestically in the recent case of Steinmetz & Ors v Global Witness and is now being debated across Europe in the context of the controversial right to be forgotten regime. One of the many important questions which remain at large on this issue is: what degree of protection is to be afforded under the data protection legislation to publication activities which might be said to be of low public interest value (i.e. those which satisfy the curiosity of readers but do not per se contribute to public debate)?

It was precisely this question which the European Court of Human Rights was recently called upon to consider in the case of Satakunnan Markkinapörssi Oy and Satamedia Oy v. Finland (Application No. 931/13). In Satamedia, the Finnish Supreme Court had concluded that a magazine which published publicly available tax data could lawfully be prevented from publishing that data on the basis that this was required in order to protect the data privacy rights of the individuals whose tax data was in issue. The Finnish Court held that this constituted a fair balancing of the Article 10 rights of the publishers and the data privacy rights of affected individuals, particularly given that: (a) the freedom of expression derogation provided for under the Finnish data protection legislation had to be interpreted strictly and (b) the publication of the tax data was not itself required in the public interest, albeit that it may have satisfied the curiosity of readers. The owners of the magazine took the case to Strasbourg. They argued that the conclusions reached by the Finnish Court constituted an unjustified interference with their Article 10 rights. The Strasbourg Court disagreed. It concluded that the Finnish Court had taken into account relevant Strasbourg jurisprudence on the balancing of Article 10 and Article 8 rights (including Von Hannover v. Germany (no. 2) and Axel Springer AG v. Germany) and had arrived at a permissible result in terms of the balancing of the relevant interests (see para. 72).

There are three key points emerging from the judgment:

– first, it confirms the point made not least in the ICO’s recent guidance on data protection and the media, namely that there is no blanket protection for journalistic activities under the data protection legislation;

– second, it makes clear that, where there is a clash between data privacy rights and Article 10 rights, the courts will closely scrutinise the public interest value of the publication in issue (or lack thereof);

– third, it confirms that the lower the public interest value of the publication in question (as assessed by the court), the more likely it is that the rights of the data subject will be treated as preeminent.

Anya Proops