Multi-billion dollar actions for inaccurate personal data?

Data protection has developed a curious habit of churning up heroic (or anti-heroic, depending on how you view it) figures who take on global behemoths to surprising effect. Maybe I am being too dramatic, but think of Mario Costeja González, the complainant at the heart of the Google Spain ‘right to be forgotten’ case, and Max Schrems, whose litigation has thrown Safe Harbor and transatlantic data transfers into turmoil.

If we maintain a transatlantic gaze, another such figure comes into view. On Monday of this week, the Supreme Court of the United States heard argument in the case of Spokeo Inc v Thomas Robins. Mr Robins – the potential David in this important new David v Goliath episode – is at the forefront of litigation against the ‘people search engine’ Spokeo (see Anya’s earlier post here).

The profile Spokeo compiled about him said he was a graduate, a professional in his 50s and a married man with children. Hardly defamatory stuff, except that none of it was correct. He has not established that these errors caused him any financial loss, but he seeks damages for the publication of factually incorrect information about his life.

So what, you say? Well, consider the amicus briefs put before SCOTUS by eBay, Facebook, Google and Yahoo. They all say that this is a very big deal. They point out that, as major global tech innovators, they are exposed to numerous federal and state laws which contain statutory damages provisions for private causes of action. If standing is granted for “no injury” lawsuits, “plaintiffs may pursue suits against amici even where they are not actually harmed by an alleged statutory violation, and in certain circumstances, seek class action damages that could run into the billions of dollars”.

The issues in Robins (should you be compensated for mere breaches or for ‘digital injuries’?) resonate with live issues before the courts in the UK: can you be compensated under the Data Protection Act 1998 for mere distress (see Vidal-Hall v Google, en route to the Supreme Court)? How should one compensate for privacy violations (see Gulati, on which the Court of Appeal’s judgment is awaited)?
Regardless of whether Mr Robins emerges as a Goliath-slayer, his case adds to the law’s increasingly intense scrutiny of global tech companies whose stock in trade is personal data.

Robin Hopkins @hopkinsrobin

Court of Appeal considers damages for privacy breaches – data protection to follow suit?

This week, the Court of Appeal is grappling with a difficult and important question: how do you value an invasion of privacy? In other words, where someone has suffered a breach of their privacy rights, how do you go about determining the compensation they should receive?

The appeal is brought by MGN against the judgment of Mann J in Gulati & Ors v MGN Ltd [2015] EWHC 1482 (Ch). That judgment concerned victims of blagging and phone-hacking (including Paul Gascoigne, Sadie Frost and Alan Yentob) for which Mirror Group Newspapers was held responsible.

Mann J awarded the claimants compensation ranging between £85,000 and £260,250. His judgment was ground-breaking, in part due to the size of those awards. (By way of comparison, the previous highest award in a privacy case had been made to Max Mosley, in the region of £60,000 – but most awards have been much lower.)

It was also ground-breaking in terms of the methodology adopted to calculate quantum for privacy breaches. Here is how Mann J summarised the rival arguments (paragraph 108):

“… The case of the claimants is that the compensation should have several elements.  There is compensation for loss of privacy or “autonomy” resulting from the hacking or blagging that went on; there is compensation for injury to feelings (including distress); and there is compensation for “damage or affront to dignity or standing”.  The defendant disputes this and submits that all that can be compensated for is distress or injury to feelings…  It is accepted that such things as loss of autonomy are relevant, but only as causes of the distress which is then compensated for.  They are not capable of sustaining separate heads of compensation…”

As is clear from that synopsis, the debate is not just about money, observable cause-and-effect or hard-edged law. The debate also has difficult philosophical and ethical dimensions. It seems that neither society nor the law (which sometimes overlap) has yet got to the bottom of what it really means to have one’s privacy invaded.

In any event, Mann J certainly did his bit to progress that debate. He preferred the analysis of the claimants – hence the large awards they received. See for example his paragraphs 143-144:

“… The tort is not a right to be prevented from upset in a particular way.  It is a right to have one’s privacy respected.  Misappropriating (misusing) private information without causing “upset” is still a wrong.  I fail to see why it should not, of itself, attract damages.  Otherwise the right becomes empty, contrary to what the European jurisprudence requires.  Upset adds another basis for damages; it does not provide the only basis. I shall therefore approach the consideration of quantum in this case on the footing that compensation can be given for things other than distress, and in particular can be given for the commission of the wrong itself so far as that commission impacts on the values protected by the right.”

The Court of Appeal’s judgment in MGN’s appeal will have a huge impact on the size of awards in privacy cases, and thereby on the privacy litigation landscape itself. It will also no doubt contribute to our understanding of how 21st-century society values (or ought to value) privacy.

What impact will it have on compensation under section 13 of the Data Protection Act 1998?

As with privacy compensation, data protection compensation is having a revolutionary year: see the striking down of section 13(2) in Vidal-Hall v Google [2015] EWCA Civ 311. Traditionally, few people brought claims under section 13 DPA, because it was assumed that they could only be compensated for distress (their primary complaint) if they also suffered financial loss (which mostly they hadn’t). Vidal-Hall overturned that: you can be compensated for distress alone under section 13 DPA. This point will be considered by the Supreme Court next year, but for now, the removal of this barrier to successful section 13 claims is hugely important.

Another barrier, however, lingers: section 13 DPA awards tend to be discouragingly low, from a claimant’s perspective. See most crucially Halliday v Creation Consumer Finance [2013] EWCA Civ 333 (where an award of £750 was made): “the sum to be awarded should be of a relatively modest nature since it is not the intention of the legislation to produce some kind of substantial award. It is intended to be compensation…” (per Arden LJ at paragraph 36).

Increasingly, however, case law emphasises the intimate relationship between data protection and fundamental privacy rights: see for example Vidal-Hall, and last year’s ‘right to be forgotten’ judgment in the Google Spain case.

So, if Mann J’s wide, claimant-friendly approach to quantifying damages is upheld in the privacy context, how long before the same approach infiltrates data protection litigation?

Robin Hopkins @hopkinsrobin

Unsafe Harbor: some practical implications of the Schrems judgment

Panopticon has been quick off the mark in reporting on today’s enormously significant Schrems judgment from the CJEU: see Chris’ alert and Anya’s commentary. I hope readers will excuse a third excursion into the same waters, given the enormous consequences of the judgment. Here are a few observations on what those consequences mean in practice.

  1. Is this the end for Safe Harbor?

In its current form, yes. In theory, it can be fixed, rather than binned. Efforts have in fact been underway for some time aimed at renegotiating and tightening up aspects of the Safe Harbor arrangements, spurred by the Snowden revelations about the extent of US surveillance. The tenor of the judgment, however, is that tweaks will not suffice. ‘Dead in the water’ is the right shorthand for Safe Harbor.

  2. Does the Schrems judgment affect all companies transferring data to the US?

No – it torpedoes the Safe Harbor scheme, but it does not torpedo all EU-US data transfers. The Safe Harbor scheme was one of the major ways in which EU-US transfers of personal data ticked the box in terms of complying with Article 25 of Directive 95/46/EC (or the eighth data protection principle, in UK parlance). But it was not the only way.

Not all US companies were part of that scheme. The full list of companies certified under Safe Harbor is available on the website of the US Department of Commerce (which administers certification for the scheme) here. Around 5,000 companies are affected by the Schrems judgment.

  3. Without Safe Harbor, how can data transfers to the US be lawful?

One obvious option is to avoid transfers to the US henceforth. Data processing arrangements could be retained within the EU, or switched to one of a number of countries which already have an EU seal of approval: see the list here, which includes Andorra, New Zealand, Canada, Uruguay, Israel and Argentina. Again, however, the Schrems judgment arguably implies that not even those countries are immune from scrutiny. Though they are not tainted by the Snowden/NSA revelations, their approved status is no longer inviolable.

Another option for multinationals transferring data to the US (or elsewhere) is to use Binding Corporate Rules. These provide a framework for how the organisation handles personal data. The data controller drafts its BCRs and submits them to the regulator for approval. Where more than one EU state is involved, the other regulators all need to have their say before the data controller’s arrangements are given the green light.

The BCR process is explained by the ICO here. Note the observation that a straightforward BCR application can take 12 months. So no quick fix for plugging the Safe Harbor gap here. Companies may need to find interim solutions while they work on adopting BCRs.

Another option is the use of Model Contract Clauses, explained by the ICO here. This involves incorporating off-the-shelf, EU-approved provisions into your contracts relating to personal data. These are inflexible, and they will not fit every data controller’s needs. Again, data controllers may need to craft stop-gap contractual solutions.

And again, it is arguably implicit in the Schrems judgment that even BCRs and Model Contract Clauses are flawed, i.e. they do not suffice to ensure that adequate data protection standards are maintained.

Lastly, as a data controller, you can do it yourself, i.e. carry out your own assessment of the level of protection afforded in your data’s destination country. Again, the ICO offers helpful guidance. Again, however, this route is not straightforward.

  4. Are regulators going to take immediate action against all Safe Harbor-based transfers?

Unclear, but it is doubtful that they have the will or the way.

In the immediate term, the Irish Data Protection Commissioner now needs to decide whether or not Facebook’s US data transfers are lawful in the absence of Safe Harbor. This alone will be an important decision.

In the UK, the ICO has issued a press release on Schrems. It recognises that it will take time for businesses to adapt. Its tone suggests neither immediate enforcement nor a punitive approach.

This is no doubt because the implications – both for the private sector and for regulators – would be enormous if a wholesale clampdown were commenced immediately. It is likely that many regulators will give data controllers some time to get their houses (or harbors) in order – though the CJEU declined to take a similar approach in its judgment today.

  5. Will the new Data Protection Regulation fix the problem?

No. Its approach to international transfers is largely the same as the one currently in place. It contains no automatic fix for the current quandary.

These are just preliminary observations. The dust has not yet settled, and businesses face some thorny practicalities in the meantime.

Robin Hopkins @hopkinsrobin

‘Vilified’ doctor cannot publish patient’s private information

In the Matter of C (A Child) (Application by Dr X and Y) [2015] EWFC 79 involved, in the words of Munby J, an unusual and indeed unprecedented application. It pitted the right to defend one’s reputation against the privacy and confidentiality rights of others. In this case, the latter won.
Dr X had treated C and C’s mother; he had also been an expert witness in the family court care proceedings concerning C. C’s mother was unhappy about the treatment given by Dr X. She complained about him to the GMC, whose Fitness to Practise panel in due course found the allegations against Dr X to be unproven. C’s mother also criticised Dr X publicly in the media.
Dr X felt that his “otherwise unblemished reputation … has been cataclysmically damaged … through inaccurate reporting and internet postings” and that he has been “unfairly and unjustly pilloried by the mother and, through her, by the press” (his skeleton argument, cited at para 10 of Munby J’s judgment).
Dr X wanted to be able to put his side of the story, and to have the original source documents – from the family court proceedings and the Fitness to Practise proceedings – available to quote from (while respecting anonymity) if his public statements were challenged. He sought disclosure of documents from those proceedings.
One difficulty he faced was that the law restricts the use to which documents from family proceedings can be put. The court has a discretion to allow disclosure, but generally subject to restrictions on the use to which the documents may be put.
A further major difficulty was that he was bound by doctor-patient confidentiality, both as a matter of legal duty and of professional ethics. That duty permits of exceptions – for example, to allow a doctor who is being unfairly vilified by a patient to defend himself – but even then any departure from confidentiality obligations must be proportionate.
The same applies to interference with patients’ privacy under Article 8 ECHR; privacy rights were particularly acute here, because what was sought (for disclosure, and for deployment in public statements) was “a mass of medical materials relating to the mother’s mental health” (Munby J at paragraph 42). Disclosure of those materials, even in redacted form, would have major implications for the privacy of the child, C.
Those difficulties were fatal to the application. Munby J said that “the remedy being sought by Dr X – permission to put the mother’s medical records and related documents into the public domain, at a time and in circumstances of his own choosing and without any of the safeguards usually imposed – is wholly disproportionate to anything which he can legitimately or reasonably demand”.
In relation to the documents filed in the Fitness to Practise proceedings but which were not part of the documentation filed in the care proceedings, the court had no jurisdiction to grant an application for disclosure. In any event, disclosure of the confidential material Dr X sought for deployment in the public domain would again be wholly disproportionate.
Heather Emmerson of 11KBW appeared for the GMC.
Robin Hopkins @hopkinsrobin

Privacy and data protection – summer roundup

August tends to be a quiet month for lawyers. There has, however, been little by way of a summer break in privacy and data protection developments. Here are some August highlights.

Privacy injunction: sexual affairs of a sportsman (not philosophers)

Mrs Justice Laing’s August does not appear to have begun restfully. Following a telephone hearing on the afternoon of Saturday 1 August, she granted what became a widely-reported privacy injunction (lasting only until 5 August) restraining the publication of a story about an affair which a prominent sportsman had some years ago: see the judgment in AMC and KLJ v News Group Newspapers [2015] EWHC 2361 (QB).

As usual in such cases, Article 8 and Article 10 rights were relied upon to competing ends. There is no automatic favourite in such contests – an intense focus on the facts is required.

In this case, notwithstanding submissions about the extent to which the affected individuals had ‘courted publicity’ or were not ‘private persons’, there was a reasonable expectation of privacy in respect of a secret sexual affair conducted years ago. The interference needed to be justified.

The right to free expression did not constitute adequate justification without more: “I cannot balance these two incommensurables [Articles 8 and 10] without asking why, and for what purposes, X and R seek to exercise their article 10 rights… The public interest here is, I remind myself, a contribution to a debate in the general interest”.

On the facts, there was insufficient public interest to justify that interference. The sportsman was not found to have hypocritically projected himself as ‘whiter than white’, and his alleged deceits and breaches of protocols in the conduct of his affair were not persuasive – especially years after the event. In any event, the sportsman was a role model only for sportsmen or aspiring sportsmen: “he is not a role model for cooks, or for moral philosophers”. The latter point will no doubt be a weight off many a sporting shoulder.

Subject access requests: upcoming appeals

Subject access requests have traditionally received little attention in the courts. As with data protection matters more broadly, this is changing.

Holly Stout blogged earlier this month about the High Court’s judgment in Dawson-Damer and Ors v Taylor Wessing and Ors [2015] EWHC 2366 (Ch). The case concerned legal professional privilege, manual records and relevant filing systems, disproportionate searches and the court’s discretion under section 7(9) DPA. That case is on its way to the Court of Appeal.

So too is the case of Ittihadieh [2015] EWHC 1491 (QB), in which I appeared. That case concerned, among other issues, identification of relevant data controllers and the domestic purposes exemption. It too is on its way to the Court of Appeal.

Subject access requests: the burden of review and redaction

There has also been judgment this month in a County Court case in which I appeared for the Metropolitan Police Service. Mulcahy v MPS, a judgment of District Judge Langley in the Central London County Court, deals in part with the purposes behind a subject access request. It also deals with proportionality and burden, which – as Holly’s recent post discusses – has tended to be a vexed issue under the DPA (see Ezsias, Elliott, Dawson-Damer and the like).

Mulcahy deals with the proportionality of the burden imposed not so much by searching for information within the scope of a subject access request as by reviewing (and, where necessary, redacting) that information before disclosure. This is an issue which commonly concerns data controllers. The judgment is available here: Mulcahy Judgment.

Privacy damages: Court of Appeal to hear Gulati appeal

May of 2015 saw Mr Justice Mann deliver a ground-breaking judgment on damages awards for privacy breaches: see Gulati & Ors v MGN Ltd [2015] EWHC 1482 (Ch), which concerned victims of phone-hacking (including Paul Gascoigne and Sadie Frost). The awards ranged between £85,000 and £260,250. The judgment and grounds of appeal against the levels of damages awards are explained in this post by Louise Turner of RPC.

Earlier this month, the Court of Appeal granted MGN permission to appeal. The appeal is likely to be expedited. It will not be long before there is a measure of certainty on quantum for privacy breaches.

ICO monetary penalties

Lastly, I turn to privacy-related financial sanctions of a different kind. August has seen the ICO issue two monetary penalty notices.

One was for £50,000 against ‘Stop the Calls’ (ironically, a company which markets devices for blocking unwanted marketing calls) for serious contraventions of regulation 21 of the Privacy and Electronic Communications Regulations 2003 (direct marketing phone calls to persons who had registered their opposition to such calls with the Telephone Preference Service).

Another was for £180,000 for a breach of the seventh data protection principle. It was made against The Money Shop following a burglary in which an unencrypted server containing customers’ personal information was stolen.

Robin Hopkins @hopkinsrobin

Facebook, drag artists and data protection dilemmas: ‘if you stand on our pitch, you must play by our rules’

Facebook is one of the main battlegrounds between privacy and other social goods such as safety and security.

On the one hand, it faces a safeguarding challenge. Interactions through Facebook have the potential to cause harm: defamation, data protection breaches, stalking, harassment, abuse and the like. One safeguard against such harms is to ensure that users are identifiable, i.e. that they really are who they say they are. This facilitates accountability and helps to ensure that only users of an appropriate age are communicating on Facebook. The ongoing litigation before the Northern Irish courts in the HL case raises exactly these sorts of concerns about child protection.

Part of the solution is Facebook’s ‘real names’ policy: you cannot register using a pseudonym, but only with your official identity.

On the other hand, Facebook encounters an argument which runs like this: individuals should be free to decide how they project themselves in their communications with the world. This means that, provided they are doing no harm, they should in principle be allowed to use whatever identity they like, including pseudonyms, working names (for people who wish to keep their private Facebooking and their professional lives separate) or stage names (particularly relevant for drag artists, for example). The real names policy arguably undermines this element of human autonomy, dignity and privacy. There have been colourful recent protests against the policy on these sorts of grounds.

Which is the stronger argument? Well, the answer to the question seems to depend on who you ask, and where you ask.

The Data Protection Commissioner in Ireland, where Facebook has its EU headquarters, has upheld the real names policy. When one of Germany’s regional Data Protection Commissioners (Schleswig-Holstein) took the opposite view, Facebook challenged his ruling and secured a court victory in 2013. The German court suspended the order against the real names policy and, equally importantly, decided that the challenge should proceed in Ireland, not Germany.

This week, however, another German decision turned the tables on the real names policy yet again. The Hamburg data protection authority upheld a complaint from someone who used a pseudonym on Facebook so as to separate her private and professional communications. The Hamburg DPA found against Facebook and held that it was not allowed unilaterally to change users’ chosen usernames to their real names. Nor was it entitled to demand official identification documents – an issue of particular relevance to child protection issues such as those arising in HL.

The Hamburg ruling is notable on a number of fronts. It exemplifies the tension between privacy – in all its nuanced forms – and other values. It illustrates the dilemmas bedevilling the business models of social media companies such as Facebook.

The case also highlights real challenges for the future of European data protection. The General Data Protection Regulation – currently clawing its way from draft to final form – aspires to harmonised pan-European standards. It includes a mechanism for data protection authorities to co-operate and resolve differences. But if authorities within the same country are prone to divergence on issues such as the real names policy, how optimistic can one be that regulators across the EU will sing from the same hymn sheet?

Important questions arise about data protection and multinational internet companies: in which country (or region, for that matter) should a user raise a complaint to a regulator? If they want to complain to a court, where do they do that? If a German user complains to an Irish regulator or court, to what extent do those authorities have to consider German law?

For the moment, Facebook clearly seeks home ground advantage. But its preference for the Irish forum was rejected by the Hamburg authority in this week’s ruling. The Hamburg Commissioner is reported as saying that “… Facebook cannot again argue that only Irish Data Protection law would be applicable … anyone who stands on our pitch also has to play our game”.

The draft Regulation has something to say on these matters, but is far from clear as to how to decide on the right pitch and the right rules for vital privacy battles like these.

Robin Hopkins @hopkinsrobin