Interfering with the fundamental rights of practically the entire European population

In the Digital Rights Ireland case, the Grand Chamber of the CJEU has this week declared invalid the 2006 Directive which provides for the mass retention – and disclosure to policing and security authorities – of individuals’ online traffic data. It found this regime to be a disproportionate interference with privacy rights. Depending on your perspective, this is a major step forward for digital privacy, or a major step backwards in countering terrorism and serious crime. It probably also adds further uncertainty to the wider project of data protection reform at EU level. Here is my synopsis of this week’s Grand Chamber judgment.

Digital privacy vs national security: a brief history

There is an overlapping mesh of rights under European law which aims to protect citizens’ rights with respect to their personal data – an increasingly important strand of the broader right to privacy. The Data Protection Directive (95/46/EC) was passed in 1995, when the internet was in its infancy. It provides that personal data must be processed (obtained, held, used, disclosed) fairly and lawfully, securely, for legitimate purposes and so on.

Then, as the web began to mature into a fundamental aspect of everyday life, a supplementary Directive was passed in 2002 (2002/58/EC) on privacy and electronic communications. It is about privacy, confidentiality and the free movement of electronic personal data in particular.

In the first decade of the 21st century, however, security objectives became increasingly urgent. Following the London bombings of 2005 in particular, the monitoring of would-be criminals’ web activity was felt to be vital to effective counter-terrorism and law enforcement. The digital confidentiality agenda needed to make space for a measure of state surveillance.

This is how Directive 2006/24 came to be. In a nutshell, it provides for traffic and location data (rather than content-related information) about individuals’ online activity to be retained by communications providers and made available to policing and security bodies. Such data is to be held for a minimum of six months and a maximum of 24 months.

That Directive – like all others – is however subject to the EU’s Charter of Fundamental Rights. Article 7 of that Charter enshrines the right to respect for one’s private and family life, home and communications. Article 8 is about the right to the protection and fair processing of one’s personal data.

Privacy and Digital Rights Ireland prevail

Digital Rights Ireland took the view that the 2006 Directive was not compatible with those fundamental rights. It asked the Irish Courts to refer the question to the CJEU. A similar reference was made in separate litigation before the Austrian Courts.

The CJEU gave its answer this week. In Digital Rights Ireland Ltd v Minister for Communications, Marine and Natural Resources and Others (C‑293/12) joined with Kärntner Landesregierung and Others (C‑594/12), the Grand Chamber held the 2006 Directive to be invalid on the grounds of its incompatibility with fundamental privacy rights.

The Grand Chamber accepted that, while privacy rights were interfered with, this was in pursuit of compelling social objectives (the combatting of terrorism and serious crime). The question was one of proportionality. Given that fundamental rights were being interfered with, the Courts would allow the European legislature little leeway: anxious scrutiny would be applied.

Here, in no particular order, are some of the reasons why the 2006 Directive failed its anxious scrutiny test (quotations are all from the Grand Chamber’s judgment). Unsurprisingly, this reads rather like a privacy impact assessment which data controllers are habitually called upon to conduct.

The seriousness of the privacy impact

First, consider the nature of the data which, under Articles 3 and 5 of the 2006 Directive, must be retained and made available. “Those data make it possible, in particular, to know the identity of the person with whom a subscriber or registered user has communicated and by what means, and to identify the time of the communication as well as the place from which that communication took place. They also make it possible to know the frequency of the communications of the subscriber or registered user with certain persons during a given period.”

This makes for a serious incursion into privacy: “Those data, taken as a whole, may allow very precise conclusions to be drawn concerning the private lives of the persons whose data has been retained, such as the habits of everyday life, permanent or temporary places of residence, daily or other movements, the activities carried out, the social relationships of those persons and the social environments frequented by them.”

Second, consider the volume of data gathered and the number of people affected. Given the ubiquity of internet communications, the 2006 Directive “entails an interference with the fundamental rights of practically the entire European population”.

Admittedly, the 2006 regime does not undermine “the essence” of data protection rights (because it is confined to traffic data – the contents of communications are not retained), and is still subject to data security rules (see the seventh data protection principle under the UK’s DPA 1998).

Nonetheless, this is a serious interference with privacy rights. It has objective and subjective impact: “it is wide-ranging, and it must be considered to be particularly serious… the fact that data are retained and subsequently used without the subscriber or registered user being informed is likely to generate in the minds of the persons concerned the feeling that their private lives are the subject of constant surveillance.”

Such a law, said the Grand Chamber, can only be proportionate if it lays down clear and precise rules governing the scope of the measures and providing minimum safeguards for individual rights. The 2006 Directive fell short of those tests.

Inadequate rules, boundaries and safeguards

The regime has no boundaries, in terms of affected individuals: it “applies even to persons for whom there is no evidence capable of suggesting that their conduct might have a link, even an indirect or remote one, with serious crime”.

It also makes no exception for “persons whose communications are subject, according to rules of national law, to the obligation of professional secrecy”.

There are no sufficiently specific limits on the circumstances in which the retained data can be accessed by security bodies, on the purposes to which that data can be put by those bodies, or on the persons with whom those bodies may share the data.

There are no adequate procedural safeguards: no court or administrative authority is required to sign off the transfers.

There are also no objective criteria for justifying the retention period of 6-24 months.

The Grand Chamber’s conclusion

In summary, the Grand Chamber found that “in the first place, Article 7 of Directive 2006/24 does not lay down rules which are specific and adapted to (i) the vast quantity of data whose retention is required by that directive, (ii) the sensitive nature of that data and (iii) the risk of unlawful access to that data, rules which would serve, in particular, to govern the protection and security of the data in question in a clear and strict manner in order to ensure their full integrity and confidentiality. Furthermore, a specific obligation on Member States to establish such rules has also not been laid down…”

There was also an international transfer aspect to its concern: “in the second place, it should be added that that directive does not require the data in question to be retained within the European Union…”

This last point is of course highly relevant to another of the stand-offs between digital privacy and national security looming in the UK, namely the post-Snowden litigation against security bodies.

Robin Hopkins @hopkinsrobin

Steinmetz and Others v Global Witness: latest developments

Panopticon devotees will have noted that important DPA litigation is afoot between a group of businessmen (Beny Steinmetz and others) and the NGO Global Witness. The Economist has recently reported on the latest developments in the case: see here.

I particularly like the article’s subtitle: “Libel laws have become laxer. Try invoking data protection instead”. This is an observation I (and others) have made in the past: see here for example. The point appears to be gathering momentum.

Robin Hopkins @hopkinsrobin

Data protection and compensation: the “irreversible march” towards revolutionary change

At 11KBW’s Information Law conference this past Tuesday, I talked a bit about the progress of the draft EU Data Protection Regulation. I omitted to mention last week’s development (my reason: I was on holiday in Venice, where data protection seemed less pressing). In a plenary session on 12 March, the European Parliament voted overwhelmingly in support of the Commission’s current draft of the Regulation. This is all explained in this Memo from the European Commission. Here are some key points.

One is the apparently “irreversible” progress towards getting the Regulation onto the EU statute books. “The position of the Parliament is now set in stone and will not change even if the composition of the Parliament changes following the European elections in May.” As a reminder, the remaining stage is for the European Council to agree to the proposal. Its ministers are meeting again in early June. So far, they have been broadly supportive.

Another point is about business size and data protection risk: SMEs will not need to notify (so where will the ICO get its funding?), nor will they need to appoint data protection officers or carry out privacy impact assessments as a default rule. “We want to make sure that obligations are not imposed except where they are necessary to protect personal data: the baker on the corner will not be subject to the same rules as a (multinational) data processing specialist.”

A third point has great consequences for international transfers: “Non-European companies, when offering services to European consumers, will have to apply the same rules and adhere to the same levels of protection of personal data. The reasoning is simple: if companies outside Europe want to take advantage of the European market with more than 500 million potential customers, then they have to play by the European rules”.

Fourth, the “right to be forgotten” is still very much on the agenda. “If an individual no longer wants his or her personal data to be processed or stored by a data controller, and if there is no legitimate reason for keeping it, the data should be removed from their system” (subject to freedom of expression). This “citizen in the driving seat” principle, like the consistency aim (the same rules applied in the same way across the whole EU) and the “one-stop shop” regulatory model, has been part of the reform package from the outset.

A final point is that the Parliament wants regulators to be able to impose big fines: “It has proposed strengthening the Commission’s proposal by making sure that fines can go up to 5% of the annual worldwide turnover of a company (up from 2% in the Commission’s proposal)”. Monetary penalties will not be mandatory, but they will potentially be huge.

On this last point about money: as under the current law, a regulatory fine is one thing and the individual’s right to be compensated another. At our seminar on Tuesday, we discussed whether there would soon be a sweeping away (see for example the Vidal-Hall v Google litigation) of the long-established Johnson v MDU principle that, in order to be compensated for distress under section 13 of the DPA, you need first to prove that you suffered financial loss. That may well be so for the DPA, in which case the short- and medium-term consequences for data protection litigation in the UK will be huge.

But it is important to be clear about the longer term: this is going to happen anyway, regardless of any case-law development in UK jurisprudence. Article 77 of the current draft of the Regulation begins like this: “Any person who has suffered damage, including non-pecuniary damage, as a result of an unlawful processing operation or of an action incompatible with this Regulation shall have the right to claim compensation from the controller or the processor for the damage suffered”.

If we are indeed irreversibly on track towards a new Regulation, then data protection litigation – notably, though not only about compensating data subjects – is guaranteed to be revolutionised.

Robin Hopkins @hopkinsrobin

The EU’s Data Protection Regulation: where are we?

The replacement of Directive 95/46/EC – the bedrock of data protection in Europe – with a new Regulation is intended as a radical overhaul, making protections for personal data fit for the digital world. It has now been over two years since the first substantive draft of that Regulation was made public. I dimly recall Tim Pitt-Payne and me summarising it – see here.

The Regulation is yet to emerge. As a number of Panopticon readers have asked: where have we got to? Here are five points by way of summary.

1. Two members of the trinity are on board

Following seemingly interminable negotiations, the European Parliament’s civil liberties committee (LIBE) now endorses the European Commission’s position on the modified draft. This means that two of the three key bodies at the EU level appear to be of one mind. The next step is for the third body, the European Council, to be persuaded during negotiations. See this blog post by the ICO’s Deputy Commissioner, David Smith.

2. In search of the cardinal virtues – consent, consistency, proportionality

In a very illuminating summary of the major principles at issue, the ICO tells us that it welcomes the following features of the current draft: a stringent approach to consent (or, in low-risk situations, a ‘legitimate interests’ condition justifying the processing of personal data); consistency and an EU-wide ‘one-stop shop’ model; ensuring that processing conditions are proportionate to risk (by, for example, requiring data subjects to be notified ‘without delay’ rather than within 24 hours, as was originally proposed).

The ICO remains concerned, however, that the draft Regulation continues to suffer from some vices: its use of the ‘pseudonymisation’ concept muddies the distinction between personal and non-personal data; the approach to profiling is insufficiently nuanced; and the international transfer rules may be unrealistically stringent.

3. The Regulation is dead!

Peter Fleischer, Google’s global privacy counsel, considers that the stalled progress of 2013 effectively means that “the old draft is dead”. His view, however, is that this delay will provide an opportunity for a more realistic re-think: “Whatever comes next will be the most important privacy legislation in the world, setting the global standards. I’m hopeful that this pause will give lawmakers time to write a better, more modern and more balanced law.”

4. Long live the Regulation!

EU officials are, however, optimistic about the current draft being spurred on to finality in 2014. Peter Hustinx, the outgoing European Data Protection Supervisor (curiously, no successor has yet been appointed), hopes that Greece’s imminent turn in the presidency seat will provide a fresh impetus for productive negotiation. Importantly, he sees Germany (often characterised as setting very stringent standards for data protection) as being in the driving seat: “The new German government can tackle this subject with the necessary drive and energy and thereby gain acceptance of the German position at European level and lead Europe to a higher level of data protection.”

5. Are the Americans Safe?

The processing of EU citizens’ data by US-based companies sits outside the direct reach of the envisaged Regulation, as with the current Directive. Since 2000, transfers of personal data to the US have been governed by the Safe Harbour Agreement, under which approximately 3,300 companies have been certified as safe (in the sense of being EU compliant in their data protection standards).

The European Council and Parliament have, however, expressed concern about the fitness for purpose of the Safe Harbour scheme. They have observed that “Web companies such as Google, Facebook, Microsoft, Apple, Yahoo have hundreds of millions of clients in Europe and transfer personal data for processing to the US on a scale inconceivable in the year 2000 when the Safe Harbour was created”. They are also concerned about the ongoing revelations about surveillance: “divergent responses of data protection authorities to the surveillance revelations demonstrate the real risk of the fragmentation of the Safe Harbour scheme and raise questions as to the extent to which it is enforced”.

Progress by the US Department of Commerce is now sought – by March 2014 – on improving transparency, the application of EU principles and enforcement. The arrangements will be further reviewed in 2014.

Robin Hopkins @hopkinsrobin

The Google/Safari users case: a potential revolution in DPA litigation?

I posted earlier on Tugendhat J’s judgment this morning in Vidal-Hall and Others v Google Inc [2014] EWHC 13 (QB). The judgment is now available here – thanks as ever to Bailii.

This is what the case is about: a group of claimants say that, by tracking and collating information relating to their internet usage on the Apple Safari browser without their consent, Google (a) misused their private information (b) breached their confidences, and (c) breached its duties under the Data Protection Act 1998 – in particular, under the first, second, sixth and seventh data protection principles. They sought damages and injunctive relief.

As regards damages, “what they claim damages for is the damage they suffered by reason of the fact that the information collected from their devices was used to generate advertisements which were displayed on their screens. These were targeted to their apparent interests (as deduced from the information collected from the devices they used). The advertisements that they saw disclosed information about themselves. This was, or might have been, disclosed also to other persons who either had viewed, or might have viewed, these same advertisements on the screen of each Claimant’s device” (paragraph 24).

It is important to note that “what each of the Claimants claims in the present case is that they have suffered acute distress and anxiety. None of them claims any financial or special damage. And none of them claims that any third party, who may have had sight of the screen of a device used by them, in fact thereby discovered information about that Claimant which was detrimental” (paragraph 25).

The Claimants needed permission to serve proceedings on the US-based Google. They got permission and served their claim forms. Google then sought to have that service nullified, by seeking an order declaring that the English court has no jurisdiction to try these particular claims (i.e. it was not saying that it could never be sued in the English courts).

Tugendhat J disagreed – as things stand, the claims will now progress before the High Court (although Google says it intends to appeal).

Today’s judgment focused in part on construction of the CPR rules about service outside of this jurisdiction. I wanted to highlight some of the other points.

One of the issues was whether the breach of confidence and misuse of private information claims were “torts”. Tugendhat J said this of the approach: “Judges commonly adopt one or both of two approaches to resolving issues as to the meaning of a legal term, in this case the word “tort”. One approach is to look back to the history or evolution of the disputed term. The other is to look forward to the legislative purpose of the rule in which the disputed word appears”. Having looked to the history, he observed that “history does not determine identity. The fact that dogs evolved from wolves does not mean that dogs are wolves”.

The outcome (paragraphs 68-71): misuse of private information is a tort (and the oft-cited proposition that “the tort of invasion of privacy is unknown in English law” needs revisiting) but breach of confidence is not (given Kitetechnology BV v Unicor GmbH Plastmaschinen [1995] FSR 765).

Google also objected to the DPA claims being heard, partly on the basis that they had been raised late; that objection was dismissed.

Google also said that, based on Johnson v MDU [2007] EWCA Civ 262; (2007) 96 BMLR 99, financial loss was required before damages under section 13 of the DPA could be awarded. Here, the Claimants alleged no financial loss. The Claimants argued against the Johnson proposition: they relied on Copland v UK 62617/00 [2007] ECHR 253, argued for a construction of the DPA that accords with Directive 95/46/EC as regards relief, and argued that – unlike in Johnson – this was a case in which their Article 8 ECHR rights were engaged. Tugendhat J has allowed this to proceed to trial, where it will be determined: “This is a controversial question of law in a developing area, and it is desirable that the facts should be found”.

If the Johnson approach is overturned – i.e. if the requirement for financial loss is dispensed with, at least for some types of DPA claim – then this could revolutionise data protection litigation in the UK. Claims under section 13 could be brought by claimants who have suffered no financial loss as a result of the alleged DPA breaches.

Tugendhat J went on to find that there were sufficiently serious issues to be tried here so as to justify service out of the jurisdiction – it could not be said that the claims were “not worth the candle”.

Further, there was an arguable case that the underlying information was, contrary to Google’s case, “private” and that it constituted “personal data” for DPA purposes (Google say the ‘identification’ limb of that definition is not met here).

Tugendhat J was also satisfied that this jurisdiction was “clearly the appropriate one” (paragraph 134). He accepted the argument of Hugh Tomlinson QC (for the Claimants) that “in the world in which Google Inc operates, the location of documents is likely to be insignificant, since they are likely to be in electronic form, accessible from anywhere in the world”.

Subject to an appeal from Google, the claims will proceed in the UK. Allegations about Google’s conduct in other countries are unlikely to feature. Tugendhat J indicated a focus on what Google has done in the UK, to these individuals: “I think it very unlikely that a court would permit the Claimants in this case to adduce evidence of what Mr Tench refers to as alleged wrongdoing by Google Inc against other individuals, in particular given that it occurred in other parts of the world, governed by laws other than the law of England” (paragraph 47).

Robin Hopkins @hopkinsrobin

High Court to hear Safari users’ privacy claim against Google

Panopticon has from time to time reported on Google’s jurisdictional argument when faced with privacy/data protection actions in European countries: it tends to argue that such claims should be dismissed, since they must be brought in California instead. This argument is not always successful.

The same jurisdictional argument was advanced before Mr Justice Tugendhat in response to a claim brought by a group calling itself ‘Safari Users Against Google’s Secret Tracking’ who, as their name suggests, complain that Google unlawfully gathers data from Safari browser usage.

This morning, Mr Justice Tugendhat dismissed that jurisdictional argument. The case can be heard in the UK. Matthew Sparkes reports in the Daily Telegraph that the judge said “I am satisfied that there is a serious issue to be tried in each of the claimants’ claims for misuse of private information” and that “the claimants have clearly established that this jurisdiction is the appropriate one in which to try each of the above claims”.

The same article says that Google will appeal. This follows Google’s announcement yesterday that it will appeal a substantial fine issued by the French data protection authority for unlawful processing (gathering and storing) of user data.

Panopticon will continue to gather data on these and other Google-related matters.

Robin Hopkins @hopkinsrobin