Multi-billion dollar actions for inaccurate personal data?

Data protection has developed a curious habit of churning up heroic (or anti-heroic, depending on how you view it) figures who take on global behemoths to surprising effect. Maybe I am being too dramatic, but think of Mario Costeja González, the complainant at the heart of the Google Spain ‘right to be forgotten’ case, and Max Schrems, whose litigation has thrown Safe Harbor and transatlantic data transfers into turmoil.

If we maintain a transatlantic gaze, another such figure comes into view. On Monday of this week, the Supreme Court of the United States heard argument in the case of Spokeo Inc v Thomas Robins. Mr Robins – the potential David in this important new David v Goliath episode – is at the forefront of litigation against the ‘people search engine’ Spokeo (see Anya’s earlier post here).

The profile Spokeo compiled about him said he was a graduate, a professional in his 50s and a married man with children. Hardly defamatory stuff, except that none of it was correct. He did not establish that these errors caused him any financial loss, but he seeks damages for the publication of factually incorrect information about his life.

So what, you say? Well, consider the Amicus Briefs put before SCOTUS by eBay, Facebook, Google and Yahoo. They all say that this is a very big deal. They point out that, as major global tech innovators, they are exposed to numerous federal and state laws which contain statutory damages provisions for private causes of action. If standing is granted for “no injury” lawsuits “plaintiffs may pursue suits against amici even where they are not actually harmed by an alleged statutory violation, and in certain circumstances, seek class action damages that could run into the billions of dollars”.

The issues in Robins (should you be compensated for mere breaches or for ‘digital injuries’?) resonate with live issues before the courts in the UK: can you be compensated under the Data Protection Act 1998 for mere distress (see Vidal-Hall v Google, en route to the Supreme Court)? How should one compensate for privacy violations (see Gulati, on which the Court of Appeal’s judgment is awaited)?
Regardless of whether Mr Robins emerges as a Goliath-slayer, his case adds to the law’s increasingly intense scrutiny of global tech companies whose stock in trade is personal data.

Robin Hopkins @hopkinsrobin

Unsafe Harbor: some practical implications of the Schrems judgment

Panopticon has been quick off the mark in reporting on today’s enormously significant Schrems judgment from the CJEU: see Chris’ alert and Anya’s commentary. I hope readers will excuse a third excursion into the same waters, given the enormous consequences of the judgment. Here are a few observations on what those consequences mean in practice.

  1. Is this the end for Safe Harbor?

In its current form, yes. In theory, it can be fixed, rather than binned. Efforts have in fact been underway for some time aimed at renegotiating and tightening up aspects of the Safe Harbor arrangements, spurred by the Snowden revelations about the extent of US surveillance. The tenor of the judgment, however, is that tweaks will not suffice. ‘Dead in the water’ is the right shorthand for Safe Harbor.

  2. Does the Schrems judgment affect all companies transferring data to the US?

No – it torpedoes the Safe Harbor scheme, but it does not torpedo all EU-US data transfers. The Safe Harbor scheme was one of the major ways in which EU-US transfers of personal data ticked the box in terms of complying with Article 25 of Directive 95/46/EC (or the eighth data protection principle, in UK parlance). But it was not the only way.

Not all US companies were part of that scheme – the full list of companies certified for Safe Harbor can be seen on the website of the US Department of Commerce (which administers certification for the scheme) here. Around 5,000 certified companies are affected by the Schrems judgment.

  3. Without Safe Harbor, how can data transfers to the US be lawful?

Obviously, the options include avoiding transfers to the US henceforth. Data processing arrangements could be retained within the EU, or they could be switched to one of a number of countries which already have an EU seal of approval: see the list here, which includes Andorra, New Zealand, Canada, Uruguay, Israel and Argentina. Again, however, the Schrems judgment arguably implies that not even those countries are immune from scrutiny. Though those countries are not tainted by the Snowden/NSA revelations, their approved status is no longer inviolable.

Another option for multinationals transferring data to the US (or elsewhere) is to use Binding Corporate Rules. These provide a framework for how the organisation handles personal data. The data controller drafts its BCRs and submits them to the regulator for approval. Where more than one EU state is involved, the other regulators all need to have their say before the data controller’s arrangements are given the green light.

The BCR process is explained by the ICO here. Note the observation that a straightforward BCR application can take 12 months. So no quick fix for plugging the Safe Harbor gap here. Companies may need to find interim solutions while they work on adopting BCRs.

Another option is the use of Model Contract Clauses, explained by the ICO here. This involves incorporating off-the-shelf, EU-approved provisions into your contracts relating to personal data. These are inflexible, and they will not fit every data controller’s needs. Again, data controllers may need to craft stop-gap contractual solutions.

And again, it is arguably implicit in the Schrems judgment that even BCRs and Model Contract Clauses are flawed, i.e. they do not suffice to ensure that adequate data protection standards are maintained.

Lastly, as a data controller, you are able to do it yourself, i.e. to carry out your own assessment of the level of protection afforded in your data’s destination country. Again, the ICO helpfully explains. Again, however, the solutions are not straightforward.

  4. Are regulators going to take immediate action against all Safe Harbor-based transfers?

Unclear, but it is doubtful that they have the will or the way.

In the immediate term, the Irish Data Protection Commissioner now needs to decide whether or not Facebook’s US data transfers are lawful in the absence of Safe Harbor. This alone will be an important decision.

In the UK, the ICO has issued a press release on Schrems. It recognises that it will take time for businesses to adapt. Its tone is neither immediate nor pitiless.

This is no doubt because the business implications – both for the private sector and the regulators – would be enormous if a whole-scale clampdown were to be commenced immediately. It is likely that many regulators will give data controllers some time to get their houses (or harbors) in order – though the CJEU declined to take a similar approach in its judgment today.

  5. Will the new Data Protection Regulation fix the problem?

No. Its approach to international transfers is largely the same as the one which is currently in place. It contains no automatic fixes to the current quandary.

These are just preliminary observations. The dust has not yet settled, and businesses face some thorny practicalities in the meantime.

Robin Hopkins @hopkinsrobin

Refusing a subject access request: proportionality, anxious scrutiny and judicial discretion

Zaw Lin and Wai Phyo v Commissioner of Police for the Metropolis [2015] EWHC 2484 (QB), a judgment of Green J handed down today, is an interesting – if somewhat fact-specific – contribution to the burgeoning body of case law on how subject access requests (SARs) made under the Data Protection Act 1998 (DPA) should be approached, both by data controllers and by courts.

The Claimants are on trial in Thailand for the murder in September 2014 of British tourists Hannah Witheridge and David Miller. They could face the death penalty if convicted.

Under the Police Act 1996, and following high-level discussions (including at Prime Ministerial level), it was agreed that the Metropolitan Police Service (MPS) would send an officer to observe and review – but not assist with – the Thai police investigation. The MPS compiled a detailed Report. They agreed to keep this confidential, except that it could be summarised verbally to the families of the victims so as to reassure them about the state of the investigation and proceedings. The Report has never been provided to the families or the Thai authorities.

The Claimants made SARs, seeking disclosure of the MPS’ Report. Green J summarised their objectives as follows (para 29):

“The Claimants have endeavoured to clothe their arguments in the somewhat technical language of the DPA.  It seems to me that the bottom line of these arguments, stripped bare of technical garb, can be put in two ways.  First, the views of the MPS carry weight. Scotland Yard has an international reputation.  If the Report is seen as favourable to the prosecution and contains material supportive of the RTP [Royal Thai Police] investigation (which is in effect how the Claimants say it has been presented in public by the families) then they should have the right to see the personal data so they can correct any misapprehensions.  Secondly, that in any event they should be able to use any personal data which is favourable to their defence.”

The Claimants were entitled to request disclosure of at least some of the contents of the Report, though Green J estimated that only a small percentage of its contents constituted their personal data (para 25).

The MPS refused the SARs, relying on the exemption for crime and taxation under section 29 DPA.

In determining the claim under section 7(9) DPA, Green J considered arguments as to the applicability (or not) of Directive 95/46/EC (which contains exceptions for criminal matters: see Articles 3 and 13) and the European Convention on Human Rights. His view was that not much turned on these points here (para 49). At common law, the court’s scrutiny must always be fact- and context-specific. In a life-and-death context, anxious scrutiny would be applied to a data controller’s refusal. See para 69:

“… when construing the DPA 1998 (whether through common law or European eyes) decision makers and courts must have regard to all relevant fundamental rights that arise when balancing the interest of the State and those of the individual.  There are no artificial limits to be placed on the exercise.”

Green J expressed his discomfort about the application of section 15(2) DPA, which allows the court – but not the data subject – to view the withheld information. This, together with the prospect of a closed session, raised concerns as to natural and open justice. Given the expedited nature of the case before him, it was not appropriate to appoint a special advocate, but that may need to be considered in future cases where the stakes are very high. Green J proceeded by asking questions and hearing submissions on an open basis in a sufficiently generic and abstract way.

In expressing those procedural misgivings, Green J has touched on an important aspect of DPA litigation which has received little attention to date.

He also took a narrower view of the breadth of his discretion under section 7(9) DPA than has often been assumed. At para 98, he said this of the ‘general and untrammelled’ nature of that judicial discretion:

“If Parliament had intended to confer such a broad residual discretion on the court then, in my view, it would have used far more specific language in section 7(9) than in fact it did. In any event I do not understand the observations in the authorities referred to above to suggest that if I find that the MPS has erred that I should simply make up and then apply whatever test I see fit.  If I find an error on the part of the MPS such that I must form my own view then I should do in accordance with the principles set out in the DPA 1998 and taking account of the relevant background principles in the Directive and the Convention. My discretion is unfettered by the decision that has gone before, and which I find unlawful, but I cannot depart from Parliament’s intent.”

Such an approach to section 7(9) could make a material difference to litigation concerning SARs.

Green J then set out and determined the issues before him as follows:

Issue I: Who bears the burden of proving the right to invoke the exemption? What is the standard of proof?

Following R (Lord) v Secretary of State for the Home Department [2003] EWHC 2073 (Admin), the answer is that the data controller bears the burden. “The burden of proof is thus upon the MPS in this case to show its entitlement to refuse access and it must do this with significant and weighty grounds and evidence” (para 85).

Issue II: Was the personal data in the MPS report “processed” for purposes of (a) the prevention or detection of crime or (b) the apprehension or prosecution of offenders?

Green J’s answer was yes. Although the purposes behind the Report differed from the usual policing context, there should be no artificially narrow interpretation of the ‘prevention and detection of crime/apprehension or prosecution of offenders’.

Issue III: Would granting access be likely to prejudice any of those purposes?

This required a balance to be struck between the individual’s right of access and the interests pursued by the data controller in refusing disclosure – a “classic proportionality balancing exercise to be performed” (para 78).

Here, the starting point was the Claimants’ prima facie right to the personal data. This was bolstered by the life-and-death context of the present case.

The MPS’ refusal, however, pursued legitimate and weighty objectives. In assessing those objectives, it was relevant to consider what precedent would be set by disclosure: the “focus of attention was not just on the facts of the instant case but could also take account of the impact on other cases” (as per Lord).

On that basis, and in light of the evidence, the MPS’ ‘chilling effect’ argument was powerful. See para 107:

“… I accept their judgment and opinion as to the risks that release of the Report would give rise to and in particular, their position on: the considerable benefit to the public interest (in relation to crime enforcement and public security) generally in the MPS (and other relevant police authorities) being able to engage with foreign authorities; the high importance that is attached by foreign authorities to confidentiality; and the risk that not being able to give strong assurances as to confidentiality would pose to the ability of the MPS and others to enter into meaningful working relationship with such overseas authorities.”

It was also important to avoid any potential interference with a criminal trial in a foreign country.

The Claimants’ SARs were not made for any improper purposes, i.e. for purposes other than those which Directive 95/46/EC sought to further. In that respect, the present case was wholly unlike Durant.

The balancing exercise, however, favoured the MPS. Having considered each item of personal data, Green J said his “ultimate conclusion is that there is nothing in the personal data which would be of any real value to the Claimants” (para 125). He expressed his unease with both the procedure and the outcome. Permission to appeal was granted, though Panopticon understands that an appeal is not being pursued by the Claimants.

Anya Proops and Christopher Knight acted for the Defendant.

Robin Hopkins @hopkinsrobin

Privacy and data protection – summer roundup

August tends to be a quiet month for lawyers. There has, however, been little by way of a summer break in privacy and data protection developments. Here are some August highlights.

Privacy injunction: sexual affairs of sportsman (not philosophers)

Mrs Justice Laing’s August does not appear to have begun restfully. Following a telephone hearing on the afternoon of Saturday 1 August, she granted what became a widely-reported privacy injunction (lasting only until 5 August) restraining the publication of a story about an affair which a prominent sportsman had some years ago: see the judgment in AMC and KLJ v News Group Newspapers [2015] EWHC 2361 (QB).

As usual in such cases, Article 8 and Article 10 rights were relied upon to competing ends. There is no automatic favourite in such contests – an intense focus on the facts is required.

In this case, notwithstanding submissions about the extent to which the affected individuals ‘courted publicity’ or were not ‘private persons’ – there was a reasonable expectation of privacy about a secret sexual affair conducted years ago. The interference needed to be justified.

The right to free expression did not constitute adequate justification without more: “I cannot balance these two incommensurables [Articles 8 and 10] without asking why, and for what purposes, X and R seek to exercise their article 10 rights… The public interest here is, I remind myself, a contribution to a debate in the general interest”.

On the facts, there was insufficient public interest to justify that interference. The sportsman was not found to have hypocritically projected himself as ‘whiter than white’, and his alleged deceits and breaches of protocols in the conducting of his affair were not persuasive – especially years after the event. In any event, the sportsman was a role model for sportsmen or aspiring sportsmen: “he is not a role model for cooks, or for moral philosophers”. The latter point will no doubt be a weight off many a sporting shoulder.

Subject access requests: upcoming appeals

Subject access requests have traditionally received little attention in the courts. As with data protection matters more broadly, this is changing.

Holly Stout blogged earlier this month about the High Court’s judgment in Dawson-Damer and Ors v Taylor Wessing and Ors [2015] EWHC 2366 (Ch). The case concerned legal professional privilege, manual records and relevant filing systems, disproportionate searches and the court’s discretion under section 7(9) DPA. That case is on its way to the Court of Appeal.

So too is the case of Ittihadieh [2015] EWHC 1491 (QB), in which I appeared. That case concerned, among other issues, identification of relevant data controllers and the domestic purposes exemption. It too is on its way to the Court of Appeal.

Subject access requests: the burden of review and redaction

There has also been judgment this month in a County Court case in which I appeared for the Metropolitan Police Service. Mulcahy v MPS, a judgment of District Judge Langley in the Central London County Court, deals in part with the purposes behind a subject access request. It also deals with proportionality and burden, which – as Holly’s recent post discusses – has tended to be a vexed issue under the DPA (see Ezsias, Elliott, Dawson-Damer and the like).

Mulcahy deals with the proportionality of the burden imposed not so much by searching for information within the scope of a subject access request as by reviewing (and, where necessary, redacting) that information before disclosure. This is an issue which commonly concerns data controllers. The judgment is available here: Mulcahy Judgment.

Privacy damages: Court of Appeal to hear Gulati appeal

May of 2015 saw Mr Justice Mann deliver a ground-breaking judgment on damages awards for privacy breaches: see Gulati & Ors v MGN Ltd [2015] EWHC 1482 (Ch), which concerned victims of phone-hacking (including Paul Gascoigne and Sadie Frost). The awards ranged between £85,000 and £260,250. The judgment and grounds of appeal against the levels of damages awards are explained in this post by Louise Turner of RPC.

Earlier this month, the Court of Appeal granted MGN permission to appeal. The appeal is likely to be expedited. It will not be long before there is a measure of certainty on quantum for privacy breaches.

ICO monetary penalties

Lastly, I turn to privacy-related financial sanctions of a different kind. August has seen the ICO issue two monetary penalty notices.

One was for £50,000 against ‘Stop the Calls’ (ironically, a company which markets devices for blocking unwanted marketing calls) for serious contraventions of regulation 21 of the Privacy and Electronic Communications Regulations 2003 (direct marketing phone calls to persons who had registered their opposition to such calls with the Telephone Preference Service).

Another was for £180,000 for a breach of the seventh data protection principle. It was made against The Money Shop following a burglary in which an unencrypted server containing customers’ personal information was stolen.

Robin Hopkins @hopkinsrobin

Facebook, drag artists and data protection dilemmas: ‘if you stand on our pitch, you must play by our rules’

Facebook is one of the main battlegrounds between privacy and other social goods such as safety and security.

On the one hand, it faces a safeguarding challenge. Interactions through Facebook have the potential to cause harm: defamation, data protection breaches, stalking, harassment, abuse and the like. One safeguard against such harms is to ensure that users are identifiable, i.e. that they really are who they say they are. This facilitates accountability and helps to ensure that only users of an appropriate age are communicating on Facebook. The ongoing litigation before the Northern Irish courts in the HL case raises exactly these sorts of concerns about child protection.

Part of the solution is Facebook’s ‘real names’ policy: you cannot register using a pseudonym, but only with your official identity.

On the other hand, Facebook encounters an argument which runs like this: individuals should be free to decide how they project themselves in their communications with the world. This means that, provided they are doing no harm, they should in principle be allowed to use whatever identity they like, including pseudonyms, working names (for people who wish to keep their private Facebooking and their professional lives separate) or stage names (particularly relevant for drag artists, for example). The real names policy arguably undermines this element of human autonomy, dignity and privacy. There have been colourful recent protests against the policy on these sorts of grounds.

Which is the stronger argument? Well, the answer to the question seems to depend on who you ask, and where you ask.

The Data Protection Commissioner in Ireland, where Facebook has its EU headquarters, has upheld the real names policy. When one of Germany’s regional Data Protection Commissioners (Schleswig-Holstein) took the opposite view, Facebook challenged his ruling and secured a court victory in 2013. The German court suspended the order against the real names policy and, equally importantly, decided that the challenge should proceed in Ireland, not Germany.

This week, however, another German decision turned the tables on the real names policy yet again. The Hamburg data protection authority upheld a complaint from someone who used a pseudonym on Facebook so as to separate her private and professional communications. The Hamburg DPA found against Facebook and held that it was not allowed unilaterally to change users’ chosen usernames to their real names. Nor was it entitled to demand official identification documents – an issue of particular relevance to child protection issues such as those arising in HL.

The Hamburg ruling is notable on a number of fronts. It exemplifies the tension between privacy – in all its nuanced forms – and other values. It illustrates the dilemmas bedevilling the business models of social media companies such as Facebook.

The case also highlights real challenges for the future of European data protection. The General Data Protection Regulation – currently clawing its way from draft to final form – aspires to harmonised pan-European standards. It includes a mechanism for data protection authorities to co-operate and resolve differences. But if authorities within the same country are prone to divergence on issues such as the real names policy, how optimistic can one be that regulators across the EU will sing from the same hymn sheet?

Important questions arise about data protection and multinational internet companies: in which country (or region, for that matter) should a user raise a complaint to a regulator? If they want to complain to a court, where do they do that? If a German user complains to an Irish regulator or court, to what extent do those authorities have to consider German law?

For the moment, Facebook clearly seeks home ground advantage. But its preference for the Irish forum was rejected by the Hamburg Commissioner in this week’s ruling. He is reported as saying that “… Facebook cannot again argue that only Irish Data Protection law would be applicable … anyone who stands on our pitch also has to play our game”.

The draft Regulation has something to say on these matters, but is far from clear as to how to decide on the right pitch and the right rules for vital privacy battles like these.

Robin Hopkins @hopkinsrobin

Facebook, child protection and outsourced monitoring

Facebook is no stranger to complaints about the content of posts. Usually, one user complains to Facebook about what other users’ posts say about him. By making the offending posts available, Facebook is processing the complainant’s personal data, and must do so in compliance with data protection law.

More unusually, a user could also complain about their own Facebook posts. Surely a complainant cannot make data protection criticisms about information they deliberately posted about themselves? After all, Facebook processes those posts with the author’s consent, doesn’t it?

Generally, yes – but that will not necessarily be true in every instance, especially when it comes to Facebook posts by children. This is the nature of the complaint in striking litigation currently afoot before the High Court in Northern Ireland.

The case is HL v Facebook Inc, Facebook Ireland Ltd, the Northern Health & Social Care Trust and DCMS [2015] NIQB 61. It is currently only in its preliminary stages, but it raises very interesting and important issues about Facebook’s procedures for preventing underage users from utilising the social network. Those issues are illuminated in the recent judgment of Stephens J, who is no stranger to claims against Facebook – he heard the recent case of CG v Facebook [2015] NIQB 11, concerning posts about a convicted paedophile.

From the age of 11 onwards, HL maintained a Facebook page on which she made posts of an inappropriate sexual nature. She was exposed to responses from sexual predators. She says that Facebook is liable for its failure to prevent her from making these posts. She alleges that Facebook (i) unlawfully processed her sensitive personal data, (ii) facilitated her harassment by others, and (iii) was negligent in failing to have proper systems in place to minimise the risks of children setting up Facebook accounts by lying about their age.

The data protection claim raises a number of issues of great importance to the business of Facebook and others with comparable business models. One is the extent to which a child can validly consent to the processing of their personal data – especially sensitive personal data. Minors are (legitimately or not) increasingly active online, and consent is a cornerstone of online business. The consent issue is one of wide application beyond the HL litigation.

A second issue is whether, in its processing of personal data, Facebook does enough to stop minors using their own personal data in ways which could harm them. In her claim, for example, HL refers to evidence given to a committee of the Australian Parliament – apparently by a senior privacy advisor to Facebook (though Facebook was unable to tell Stephens J who he was). That evidence apparently said that Facebook removes 20,000 under-age user profiles a day.

Stephens J was also referred to comments apparently made by a US Senator to Mark Zuckerberg about the vulnerability of underage Facebook users.

Another element of HL’s case concerns Facebook’s use of an outsourcing company called oDesk, operating for example from Morocco, to moderate complaints about Facebook posts. She calls into question the adequacy of these oversight measures: ‘where then is the oversight body for these underpaid global police?’ (to quote from a Telegraph article referred to in the recent HL judgment). Facebook says that – given its number of users in multiple languages across the globe – effective policing is a tall order (an argument Stephens J summed up at paragraph 22 as ‘the needle in a haystack argument, there is just too much to monitor, the task of dealing with underage users is impossible’).

In short, HL says that Facebook seems to be aware of the scale and seriousness of the problem of underage use of its network and has not done enough to tackle that problem.

Again, the issue is one of wider import for online multinationals for whom personal data is stock-in-trade.

The same goes for the third important data protection issue surfacing in the HL litigation. This concerns jurisdiction, cross-border data controllers and section 5 of the Data Protection Act 1998. For example, is Facebook Ireland established in the UK by having an office, branch or agency, and does it process the personal data in Facebook posts in the context of that establishment?

These issues are all still to be decided. Stephens J’s recent judgment in HL was not about the substantive issues, but about HL’s applications for specific discovery and interrogatories. He granted those applications. In addition to details of HL’s Facebook account usage, he ordered the Facebook defendants to disclose agreements between them and Facebook (UK) Ltd and between them and oDesk (to whom some moderating processes were outsourced). He has also ordered the Facebook defendants to answer interrogatory questions about their procedures for preventing underage Facebook use.

In short, the HL litigation has – thus far – raised difficult data protection and privacy issues which are fundamental to Facebook’s business, and it has required Facebook to lay bare internal details of its safeguarding practices. The case is only just beginning. The substantive hearing, which is listed for next term, could be groundbreaking.

Robin Hopkins @hopkinsrobin