Secret ‘Practice Directions’ and Royal Wills

Mr Brown became a well-known figure in litigation circles when he sought to unseal the Will of Princess Margaret in the belief that it might reveal information showing him to be her illegitimate son. In the course of his unsuccessful litigation, it was revealed that there existed what had been described orally during the court proceedings as a “Practice Direction in respect of the handling of Royal Wills” (although there is dispute over precisely what form this document takes and whether it is really a Practice Direction at all), produced by the then President of the Family Division following liaison with the Royal Household.

Having failed to unseal the Will, Mr Brown requested a copy of the document from the Attorney General. He was refused, under section 37 FOIA. The First-tier Tribunal upheld that refusal (on which see Robin’s blog here). Mr Brown appealed to the Upper Tribunal on the grounds of inadequacy of the Tribunal’s reasons and a failure to apply the public interest test properly. He was refused permission, but then successfully judicially reviewed the Upper Tribunal’s refusal of permission (on which, see my blog here).

Much happened subsequently. Having fought hard to prevent disclosure of the ‘Practice Direction’, the AG then released almost all of it to Mr Brown in advance of the substantive appeal hearing before the Upper Tribunal. The only part withheld was a single paragraph, which was supplied to him in ‘gisted’ form. Nonetheless, Mr Brown sought disclosure of the outstanding paragraph. Perhaps not entirely surprisingly, Charles J in the Upper Tribunal has just refused to give him the final missing piece: Brown v ICO & Attorney General [2015] UKUT 393 (AAC).

The Upper Tribunal decision, in the light of the release by the AG, had rather less work to do than it might have done, and the judgment will be of correspondingly reduced wider interest. However, Charles J does roundly endorse the proposition that there is a very powerful public interest “against the creation of undisclosed principles and procedures to be applied by the court to an application to seal any will, and this is strengthened when participants in and the decision maker on that application (the court through initially or generally the President of the Family Division) and the normal guardian of the public interest (the Attorney General) have been involved in its creation on a confidential and undisclosed basis, and so in favour of the publication of the principles and procedure to be applied on any such application (particularly if initially or generally the application will be made in private)”. In other words, the AG was right to concede that the material should be disclosed. There was no further interest in the gisted paragraph also being revealed because the essential meaning had been conveyed.

Whether this brings Mr Brown’s campaign to an end is another matter, but whatever one might think of his view as to his parentage, his uncovering of a – to put it neutrally – highly unusual document agreed between the AG, the Royal Household and the President of the Family Division concerning court procedures is a worthy effort.

Robin Hopkins appeared for the ICO; Joanne Clement appeared for the Attorney General and Anya Proops appeared for Mr Brown at some of the earlier stages of proceedings.

Christopher Knight

Google and the ordinary person’s right to be forgotten

The Guardian has reported today on data emerging from Google about how it has implemented the Google Spain ‘right to be forgotten’ principle over the past year or so: see this very interesting article by Julia Powles.

While the data is rough-and-ready, it appears to indicate that the vast majority of RTBF requests actioned by Google have concerned ‘ordinary people’. By that I mean people who are neither famous nor infamous, and who seek not to have high-public-interest stories erased from history, but to have low-public-interest personal information removed from the fingertips of anyone who cares to Google their name. Okay, that explanation is itself rough-and-ready, but you get the point: most RTBF requests come not from Max Mosley types, but from Mario Costeja González types (he being the man who brought the Google Spain complaint in the first place).

As Julia Powles points out, today’s rough-and-ready data is thus far the best we have to go on in terms of understanding how the RTBF is actually working in practice. There is very little transparency on this. Blame for that opaqueness cannot fairly be levelled only at Google and its ilk – though, as the Powles article argues, they may have a vested interest in maintaining that opaqueness. Opaqueness was inevitable following a judgment like Google Spain, and European regulators have, perhaps forgivably, not yet produced detailed guidance at a European level on how the public can expect such requests to be dealt with. In the UK, the ICO has given guidance (see here) and initiated a complaints process (see here).

Today’s data suggests to me that a further reason for this opaqueness is the ‘ordinary person’ factor: the Max Mosleys of the world tend to litigate (and then settle) when they are dissatisfied, but the ordinary person tends not to (Mr González being an exception). We remain largely in the dark about how this web-shaping issue works.

So: the ordinary person is most in need of transparent RTBF rules, but least equipped to fight for them.

How might that be resolved? Options seem to me to include some combination of (a) clear regulatory guidance, tested in the courts, (b) litigation by a Max Mosley-type figure which runs its course, (c) litigation by more Mr González figures (i.e. ordinary individuals), (d) litigation by groups of ordinary people (as in Vidal Hall, for example) – or perhaps (e) litigation by members of the media who object to their stories disappearing from Google searches.

The RTBF is still in its infancy. Google may be its own judge for now, but one imagines not for long.

Robin Hopkins @hopkinsrobin

Data Sharing between Public Bodies

The principal disadvantage, to the data protection lawyer, of the failure of Esperanto is that every now and then the CJEU hands down a judgment which is only available in French, and even Panopticon cannot blog every entry in Franglais. Such is the problem raised by the Opinion of the Advocate General (Cruz Villalon) in Case C-201/14 Bara v Presedintele Casei Nationala de Asigurari de Sanatate. Readers will have to forgive any failure to capture the nuances.

Bara is a reference from the Romanian courts and contains a number of questions, the majority of which concern the compatibility of national tax authority arrangements with Article 124 TFEU (which prohibits in most cases privileged access for public bodies to financial institutions). Those need not concern us, not least because the AG considered them to be inadmissible.

However, the fourth question referred was in the following terms: “May personal data be processed by authorities for which such data were not intended where such an operation gives rise, retroactively, to financial loss?” The context appears to be that people deriving their income from independent activities were called upon to pay their contributions to the National Fund for health insurance, following a tax notice issued by the Romanian health insurance fund. However, that tax notice was calculated on the basis of data on income provided by the National Tax Administration Agency under an internal administrative protocol. The complaint was that the transfer by the Tax Agency to the Health Insurance Fund of personal data, particularly data relating to income, was in breach of Directive 95/46/EC because no consent had been provided to the transfer, the data subjects had not been informed of the transfer and the transfer was not for the same purpose as that for which the data had originally been supplied.

The Advocate General answered the fourth question by saying that the Directive precludes national legislation which allows a public institution of a Member State to process personal data that has been supplied by another public institution, including data relating to the income of the persons concerned, without the data subjects having first been informed of the transmission or the processing. This was despite the fact that the AG recognised that the Romanian bodies had a legitimate interest in being able to tax self-employed persons properly; the informal protocol did not constitute a legislative measure setting out a relevant national exemption under Article 13. The AG stressed that the requirement of notification in Article 11 had not been complied with, and that the data subjects had accordingly been unable to object to the transfer. Nor had the data subjects given their unambiguous consent. Although Article 7(e) (necessary for the performance of a task) could apply to a transfer of income data, it had to be shown that it was strictly necessary for the realisation of the functions of the Health Insurance Fund. (This appears to be a higher test than the usual interpretation of necessary as ‘reasonably necessary’, as per the Supreme Court in South Lanarkshire.) The AG did not consider that test met.

It remains, of course, to be seen whether the CJEU will take the same approach; but it seems fairly likely that Bara will produce a judgment which confirms the illegality of inter-institutional transfer of personal data without express consent or a carefully defined need which is prescribed by law. There is nothing ground-breaking in that conclusion, but it is an important reiteration of the need for data controllers anywhere in the EU to think carefully about the authorisation they have to hand over personal data to other bodies; informal agreements or policy documents are not sufficient without a legal underpinning (through the DPA) or the consent of the data subject.

The forthcoming judgment in Case C-582/14, Breyer will also raise issues over consent in the context of IP information retained by websites, along with the vexed question of whether an IP address can constitute personal data when combined with other information available to a third party (issues similar to those raised in Vidal-Hall v Google, on which see here). When the final judgments in Bara and Breyer appear, so will the analysis of some intrepid blogger of this parish.

Christopher Knight

Do Young Thugs have Human Rights? The Supreme Court has a Riot

Following a period of considered reflection, or laziness depending on one’s view, it is worth noting the decision of the Supreme Court in In the matter of an application by JR38 for Judicial Review [2015] UKSC 42. The case is all about Article 8 ECHR, and is of particular interest because of the dispute about the breadth of the correct test for the engagement of Article 8. The context is also one which will be familiar to English data protection and privacy lawyers: the publication by the police of photographs seeking to identify a suspect. If anyone remembers that famous picture of a youth in a hoodie pointing his fingers like a gun behind an awkward-looking David Cameron, JR38 is basically that, but with Molotov cocktails and a sprinkling of sectarian hatred.

In JR38, the suspect in question was a 14 year old child whose photograph was published by the PSNI as someone involved in rioting in an area of Derry in 2010 which had particular sectarian tensions. The judgment of Lord Kerr makes clear that JR38 has by no means been a well-behaved young man before or since the riots of 2010. Moreover, and amusingly, it is apparent that he and his father failed to identify him correctly in the pictures published, and originally sued on the basis of images which did not show JR38 at all. However, a correct image was eventually alighted upon.

The judgments contain a lengthy and detailed discussion of the domestic and Strasbourg case law on the engagement of Article 8, but there was a 3-2 split in the Court over the correct approach. Lords Toulson and Clarke (with both of whom Lord Hodge agreed) considered that the overwhelming approach of the existing domestic law was to apply the touchstone of the reasonable/legitimate expectation of privacy test: see Lord Toulson at [87]-[88]. The test (originating, of course, in Campbell) focuses on “the sensibilities of a reasonable person in the position of the person who is the subject of the conduct complained about…If there could be no reasonable expectation of privacy, or legitimate expectation of protection, it is hard to see how there could nevertheless be a lack of respect for their article 8 rights”. The warning in Campbell not to bleed justification matters into the engagement analysis was stressed.

The difference between the majority and the minority of Lord Kerr (with whom Lord Wilson agreed) was explained by Lord Clarke at [105]. Does the reasonable expectation of privacy test provide the only touchstone? The majority thought that it did, it being the only test set out clearly in the cases, and it being a broad objective concept to be applied in all the circumstances of the case and having regard to the underlying values involved, unconcerned with the subjective expectation of the individual, be they child or adult (see at [98] per Lord Toulson and [109] per Lord Clarke).

In essence, the majority did not consider this context to be one which Article 8 was designed to protect. The identification of a suspect was not within the scope of personal autonomy, although publication of the same picture for a different purpose, other than identification, might be: at [98] (and at [112] where Lord Clarke did not consider the mere fact of criminal activity took the matter outside Article 8). Historic or re-published photos may alter the situation: at [101].

By contrast, Lord Kerr took a broader view, holding that the reasonable expectation of privacy test might be the ‘rule of thumb’, but could not be an inflexible, wholly determinative test. The scope of Article 8 was much broader and was contextual, requiring consideration of factors such as: age, consent, stigmatisation, the context of the photographed activity and the use of the image. The reasonable expectation of privacy test failed, in his view, to allow for these factors to be considered: at [56]. Rather than shoehorning such factors into the test, they should bear on the issue on a free-standing footing: at [61]. The focus must be on the publication – i.e. the infringement – rather than the activity the photo displays. For Lord Kerr, the fact that JR38 was a child, taken with the potential effect publication might have on the life of the child, was more than sufficient to engage Article 8 (in the way that it might not for an adult): at [65]-[66].

The debate is an interesting one, but the flexibility of the majority’s orthodox approach is likely to mean very little difference in substance between the two. It will, however, be worth emphasising the importance of context, particularly in child cases under Article 8.

The Court was, however, unanimous in agreeing that publication was justified in any event; rioters had to be identified (and other methods had been tried internally first), with the peril in which inter-community harmony was placed being particularly important in the fair balance.

Where, readers of this blog might ask, was the DPA in all this namby-pamby human rights discussion? Why is there no mention of schedules and data protection principles and all the other black letter statutory stuff that so gets the blood pumping? Well, it was mentioned, at [70], by Lord Kerr, who considered that compliance with the DPA would mean that the limb of proportionality which requires the act to be in accordance with the law would be met. In very brief reasoning, Lord Kerr concluded that this type of case was within section 29 because publication was processing for the purposes of prevention and detection of crime, and that the relevant condition met in both Schedules 2 and 3 (because he agreed it was clearly sensitive personal data) was that of the processing being necessary for the administration of justice. Unfortunately, there was no analysis of the way in which it was necessary for the administration of justice, or the extent to which this is the same as the prevention and detection of crime. Nor is it quite the same reasoning as adopted by Lord Woolf CJ in the well-known ‘naming and shaming’ case of R (Ellis) v Chief Constable of Essex Police [2003] EWHC 1321 (Admin), which, at [29], appeared to apply the conditions in Schedules 2 and 3 whereby processing was necessary for the performance of functions by or under any enactment (without further specification). Where the Supreme Court speaks, we follow, but it might have been helpful to detail this aspect a little more, although it is another example of a case in which Article 8 is presumed to do all of the work and the DPA is raced through in a paragraph to avoid having to think about it too much. That Article 8 and the DPA turn out to be pulling in the same direction is, however, a relief to us all.

Christopher Knight

Austria will not host Europe vs Facebook showdown

As illustrated by Anya Proops’ recent post on a Hungarian case currently before the CJEU, the territorial jurisdiction of European data protection law can raise difficult questions.

Such questions have bitten hard in the Europe vs Facebook litigation. Max Schrems, an Austrian law graduate, is spearheading a massive class action in which some 25,000 Facebook users allege numerous data protection violations by the social media giant. Those include: unlawful obtaining of personal data (including via plug-ins and “like” buttons); invalid consent to Facebook’s processing of users’ personal data; use of personal data for impermissible purposes, including the unlawful analysing of data/profiling of users (“the Defendant analyses the data available on every user and tries to explore users’ interests, preferences and circumstances…”); unlawful sharing of personal data with third parties and third-party applications. The details of the claim are here.

Importantly, however, the claim is against Facebook Ireland Ltd, a subsidiary of the California-based Facebook Inc. The class action has been brought in Austria.

Facebook challenged the Austrian court’s jurisdiction. Last week, it received a judgment in its favour from the Viennese Regional Civil Court. The Court held that it lacked jurisdiction, in part because Mr Schrems was not deemed to be a ‘consumer’ of Facebook’s services, and in part because Austria was not the right place to bring the claim. Facebook argued that the claim should be brought either in Ireland or in California, and the Court agreed.

Mr Schrems has announced his intention to appeal. In the meantime, the Austrian decision will continue to raise both eyebrows and questions, particularly given that a number of other judgments in recent years have seen European courts accepting jurisdiction to hear claims against internet and social media companies based elsewhere (such as Google: see Vidal-Hall, for example).

The Austrian decision also highlights the difficulties of the ‘one-stop shop’ principle which remains part of the draft Data Protection Regulation (albeit in a more nuanced and complicated formulation than had earlier been proposed). In short, why should an Austrian user have to sue in Ireland?

Panopticon will report on any developments in this case in due course. It will also report on the other strand of Mr Schrems’ privacy campaign, namely his challenge to the lawfulness of the Safe Harbour regime for the transferring of personal data to the USA. That challenge has been heard by the CJEU, and the Advocate General’s opinion is imminent. The case will have major implications for those whose business involves transatlantic data transfers.

Robin Hopkins @hopkinsrobin

Forget me knot…BBC publishes list of ‘forgotten’ stories

Since the CJEU’s controversial decision in Google Spain, the debates have raged about how the so-called right to be forgotten should cash out in the online world. Particular concerns have been expressed by the media that the judgment rides roughshod over Article 10 rights, including not least the Article 10 rights of the website authors whose stories are being deindexed. Now it seems the BBC is seeking to reassert its Article 10 rights by publishing a list of all the stories which have been deindexed by Google thus far – see here.

The BBC’s position is that the publication of the list does not seek to frustrate the Court’s judgment, because it will not ‘make the stories more findable for anyone looking for a name’. What it will do, according to the BBC, is enable a ‘meaningful debate’ about the right to be forgotten to take place. This is a bold step coming from one of the world’s most respected media organisations. It will doubtless provoke a copycat reaction from other media organisations which regard the CJEU’s judgment in Google Spain as an affront to their Article 10 rights. What is interesting about this new approach is that it very clearly allows the wider public to examine how the right to be forgotten is in practice being weighed against the fundamental right to free expression. No doubt the BBC’s actions will attract criticism from those individuals who had hoped that their requests to be forgotten would result in the relevant links sinking for all time into the soup of online forgetfulness. It remains to be seen how the Information Commissioner will respond to this important and provocative development.

Anya Proops