Data protection reform in the EU

In 1913, Parliament was debating the Welsh Church Disestablishment Bill.  F. E. Smith described it as “a Bill which has shocked the conscience of every Christian community in Europe”.  This prompted a stinging rebuke from G.K. Chesterton: was it remotely plausible that, say, Breton fishermen or Russian peasants had the slightest interest in any of this?

“Do they, fasting, trembling, bleeding,

Wait the news from this our city?

Groaning, ‘That’s the Second Reading!’

Hissing ‘There is still Committee!’

If the voice of Cecil falters,

If McKenna’s point has pith,

Do they tremble for their altars?

Do they, Smith?”

A hundred years later, the European Parliament is debating data protection reform.  To suggest that every citizen of the Union is hanging on the words of Jan-Philipp Albrecht or Viviane Reding would invite Chestertonian derision.  But there must be a number of businesses that are trembling (if not perhaps fasting or bleeding, as yet) at talk of fines of up to 100 million euros (or 5% of global turnover, whichever is the greater) for breach of the new requirements.  And the level of interest among ordinary citizens, at any rate in some countries in the EU, should not be underestimated.

The above reflections are prompted by the news that the LIBE Committee of the European Parliament has adopted an agreed position on the proposed new Regulation and Directive.  This gives a mandate for the rapporteurs – MEPs Jan-Philipp Albrecht and Dimitrios Droutsas – to negotiate with the EU Council on Parliament’s behalf.

The full text of the version of the legislation approved by the LIBE Committee has not been made public.  However, this press release from the Commission indicates that there are some important differences between the Commission’s original proposal of January 2012 and the text being put forward by the LIBE Committee.  Notably, the Committee is proposing maximum sanctions of 100 million euros or up to 5% of annual worldwide turnover, as compared with the 1 million euros or up to 2% of annual worldwide turnover in the Commission’s proposal.

The Committee also wishes to strengthen the territorial scope of the reforms.  The Commission’s original proposal was that, in specified circumstances, the Regulation should apply to the processing of personal data of data subjects residing in the Union by a controller not established in the Union.  The Committee is proposing that the Regulation should apply to such processing by a controller or a processor not established in the Union – extending its reach to processors as well as controllers.

The Commission’s proposal was that this extra-territorial reach of the Regulation should apply where the processing activities were related to the offering of goods and services to data subjects in the Union, or to the monitoring of their behaviour.  The Committee is proposing that the Regulation should apply to the offering of goods or services to data subjects in the Union irrespective of whether a payment by the data subject is required.  So, on the Committee’s text, a social networking site established outside the EU would be caught if it offered membership to individuals in the Union, even if membership was free.  The Committee also proposes that the Regulation should apply to the monitoring of such data subjects (not just to the monitoring of their behaviour).

The Committee’s text would also prohibit disclosure outside the EU of personal data processed in the EU, where such disclosure was ordered by a non-EU court or tribunal, unless the transfer was authorised in advance by the relevant EU national data protection authority.  So, it would appear, if a US court ordered disclosure of personal data about UK citizens, then a US company that complied with that order without the prior authorisation of the ICO would be in breach of the Regulation and could be fined.

Media and online comment (see e.g. here and here) has suggested that the European Parliament’s current approach – strengthening the protection for data subjects, in particular in relation to international transfers – is partly a reaction to the revelations by Edward Snowden about the disclosure of personal information to the NSA.

The next step will be for the Council to decide on its position.  There will be a Council discussion among heads of state and government on 24th – 25th October, relating to the digital single market, followed by a meeting of Justice Ministers on data protection reform on 4th – 5th December.  There will then be a “trilogue” between Parliament, the Council, and the Commission.  The President of the European Commission has called for a final text to be agreed before the European Parliamentary elections in May 2014 – though it seems likely that there will be a further two years or so before the new legislation comes into effect.

Timothy Pitt-Payne


Fingerprints requirement for passport does not infringe data protection rights

Mr Schwarz applied to his regional authority, the city of Bochum, for a passport. He was required to submit a photograph and fingerprints. He did not like the fingerprint part. He considered it unduly invasive. He refused. So Bochum refused to give him a passport. He asked the court to order it to give him one. The court referred questions to the Court of Justice of the European Union about whether the requirement to submit fingerprints in addition to a photograph complied with the Data Protection Directive 95/46/EC.

Last week, the Fourth Chamber of the CJEU gave its judgment: the requirement is data protection-compliant.

The requirement had a legal basis, namely Article 1(2) of Council Regulation 2252/2004, which set down minimum security standards for identity-confirmation purposes in passports.

This pursued a legitimate aim, namely preventing illegal entry into the EU.

Moreover, while the requirements entailed the processing of personal data and an interference with privacy rights, the ‘minimum security standards’ rules continued to “respect the essence” of the individual’s right to privacy.

The fingerprint requirement was proportionate because while the underlying technology is not 100% successful in fraud-detection terms, it works well enough. The only real alternative as an identity-verifier is an iris scan, which is no less intrusive and is technologically less robust. The taking of fingerprints is not very intrusive or intimate – it is comparable to having a photograph taken for official purposes, which people don’t tend to complain about when it comes to passports.

Importantly, the underlying Regulation provided that the fingerprints could only be used for identity-verification purposes and that there would be no central database of fingerprints (instead, each set is stored only in the passport).

This is all common-sense stuff in terms of data protection compliance. Data controllers, take heart!

Robin Hopkins

Penalties, PECR and PPI

Niebel v Information Commissioner is the first Tribunal decision about penalties under the Privacy and Electronic Communications (EC Directive) Regulations 2003 (“PECR”).  Mr Niebel successfully appealed against a penalty of £300,000.

The First-tier Tribunal stated that the material before it showed that Mr Niebel and his company, Tetrus, had sent hundreds of thousands of unsolicited text messages seeking out potential claims for the mis-selling of PPI or for accidents.  There was no dispute that he had breached the requirements of PECR regulation 22, relating to the sending of text messages for direct marketing.  Until 26th May 2011 there was no power to impose penalties for such a breach, but with effect from that date the monetary penalty provisions in the Data Protection Act 1998 (sections 55A-E of the Act) were extended to cover breaches of PECR.

In the present case, the monetary penalty notice was imposed on 26th November 2012, requiring payment of £300,000.  The Tribunal emphasised the importance of a clear statement in the notice identifying the contravention for which a penalty was imposed.  At the very least this should indicate the regulation contravened, the content of the contravention, and its scale, including roughly how many individual acts there were and how many people were affected.

In this case the Tribunal considered that the notice had failed clearly to identify the contravention.  The notice seemed to be confined to 411 cases, involving a total of 732 texts, in which the recipient had complained to the ICO.  However, some parts of the penalty notice referred to contravention on a much wider scale.

A further difficulty was that the ICO subsequently discovered that most of the 732 texts referred to had been sent before 26th May 2011 (the date when the power to issue penalties came into effect); and the ICO accepted that these earlier texts could not properly be taken into account.  The ICO therefore relied at the Tribunal hearing on 286 texts, not 732:  the number of affected individuals was not stated, but the Tribunal indicated (if the ratio of texts to complaints was consistent) that this would be about 160.

The appeal was brought on one short point.  It was argued that the contravention was not of a kind likely to cause substantial damage or substantial distress, since it was now described as relating to just 286 texts; therefore one of the statutory preconditions for a monetary penalty was not satisfied.

The Tribunal proceeded on the basis that the likelihood of damage and distress should be assessed by reference to the 286 texts now relied upon by the ICO as constituting the contravention, rather than by reference to other evidence showing very large numbers of unsolicited text messages.  On this basis, the requirement that the contravention was of a kind likely to cause substantial damage or substantial distress was not satisfied.  As far as damage was concerned, recipients might incur charges for replying “stop”, and there might be a small charge if texts were received abroad, but none of this was likely to cause substantial damage.  As to distress, the Tribunal considered that the effect of the contravention was likely to be widespread irritation rather than substantial distress.  The Tribunal allowed the appeal and cancelled the penalty notice.

The decision leaves open one very important question.  Would the sending of hundreds of thousands of unwanted marketing messages be likely to give rise to substantial damage or substantial distress?  Could one say that, in aggregate, the small costs imposed on a very large number of individuals amounted to substantial damage? Or that the irritation caused to such a large number constituted substantial distress? This issue will no doubt be of great importance in future appeals about monetary penalties under PECR.

Two of my colleagues appeared in this case:  James Cornwell for the ICO, and Robin Hopkins for the Appellant.  Neither of them, of course, bears any responsibility for the content of this blog post.

Timothy Pitt-Payne

Haunted by one’s past – yet another criminal records case

As I mentioned in my post last week, the case of T v Secretary of State for the Home Department, which concerns the legality of the current CRB regime, is shortly to be considered by the Supreme Court. The issue in T is whether the blanket requirement that criminal convictions and cautions must be disclosed in the context of an enhanced criminal record check (“ECRC”) undertaken for the purposes of certain types of employment (particularly employment with children or vulnerable adults), even though they are spent, is Article 8 compliant.

But what of cases where an accused has been through the criminal justice system only then to be acquitted of the alleged offences? Should the data slate in respect of that individual be wiped clean, with the result that the allegations can never surface in the context of an ECRC? Answering that question brings into play the important maxim that, within the criminal justice system, one must be deemed innocent until proven guilty. However, balanced against that maxim is the recognition that there will be cases where an accused was in fact guilty of the crimes alleged against them, albeit that the Crown was unable to prove that guilt beyond reasonable doubt. Such individuals may well pose a substantial threat to society, despite their acquittal in the criminal courts. So how should the relevant disclosure bodies balance these competing considerations in the context of the ECRC scheme?

Earlier this year I blogged about two cases where the courts had considered this difficult question in respect of allegations of criminal conduct which had been made, but not proven, against teachers. In the first case, R (L) v Chief Constable of Cumbria Constabulary [2013] EWHC 869 (Admin), the allegations against the teacher never reached the stage of a criminal prosecution. In the second case, RK v (1) Chief Constable of South Yorkshire (2) Disclosure and Barring Service [2013] EWHC 1555 (Admin), the teacher was acquitted following a criminal trial (see my post here). In both cases, the court held that the inclusion in the relevant ECRCs of information relating to the allegations was unlawful as constituting an unjustified interference with the teacher’s Article 8 rights. A key feature of both judgments is that, in the court’s view, the police had acted unlawfully by effectively suggesting that the allegations had been well-founded, despite the lack of any criminal conviction. In a sense, these judgments are unsurprising. After all, it cannot be right for the police to suggest that an individual is guilty of an offence when they have not been convicted of any offence following a criminal prosecution.

But does that mean that it will always be unlawful to disclose information about criminal allegations where those allegations have not culminated in a conviction? The recent judgment of the High Court in the case of R(AR) v Chief Constable of Greater Manchester Police & Secretary of State for the Home Department (Case No: CO/13845/2012) indicates that the answer to that question is no.

In AR, an individual who had previously worked as a taxi driver had been accused of raping a particular passenger. He had been acquitted following a criminal trial in January 2011. In March 2012, the Criminal Records Bureau issued an ECRC in connection with an application made by AR for a licence as a private-hire driver. The ECRC made reference to the allegation of rape against AR. It also confirmed that he had been acquitted following a trial before the Crown Court. AR sought judicial review of that certificate on the basis that it breached his Article 8 right to privacy. The High Court held that the certificate was unimpeachable. In reaching this conclusion, it is clear that the court was of the view that: (a) the certificate was itself a fairly balanced document; (b) this was a case where the Chief Constable had properly recognised that, whilst the allegations against AR had not been proved to the criminal standard, there was sufficient evidence to suggest that they might yet be well founded; and (c) it was reasonable and proportionate to include the allegations in the ECRC given the risk posed to vulnerable passengers if AR had in fact committed the crimes alleged against him.

The court also rejected arguments to the effect that the police’s retention of the data was unlawful under Article 8 and, further, that the police had acted unlawfully by not consulting AR prior to including the information in the ECRC. So far as data retention was concerned, the court held that the police had legitimate reasons for retaining the data, both because it might be relevant if further allegations were made against AR and because other matters could arise involving the complainant. On the procedural challenge relating to the lack of consultation, the court held that this was not well founded, both because AR had had an opportunity to put his case in the context of an earlier comparable ECRC and because the police had in any event anticipated all the substantive arguments AR might have wanted to make.

Importantly, therefore, an acquittal is not the get-out-of-jail-free card it might at first appear to be, certainly in terms of the accused’s data rights.

Jason Coppel QC, who is also acting in the T case, appeared for the Secretary of State.

Anya Proops

PRISM and TEMPORA: ECtHR proceedings issued against UK

Panopticon reported in July that Privacy International had commenced proceedings in the Investigatory Powers Tribunal against the UK intelligence and security agencies concerning PRISM and TEMPORA.

Big Brother Watch, the Open Rights Group, English PEN and Dr Constanze Kurz announced yesterday that they have issued proceedings on the same issues – this time in the European Court of Human Rights. They have also published their pleadings and expert evidence (see the bottom of this page). To quote from their pleadings, they challenge on Article 8 ECHR grounds:

(a) The soliciting or receipt and use by the UK intelligence services (“UKIS”), of data obtained from foreign intelligence partners, in particular the US National Security Agency’s “PRISM” and “UPSTREAM” programmes; and

(b) The acquisition of worldwide and domestic communications by the Government Communications Headquarters (“GCHQ”) for use by UKIS and other UK and foreign agencies through the interception, under global and rolling warrants, of electronic data transmitted on transatlantic fibre-optic cables (the “TEMPORA” programme).

The claim is put in summary terms as follows (again, quoting from the pleadings):

(1) In relation to receipt of foreign intercept material—i.e. the receipt, use, retention and dissemination of information received by UKIS from foreign intelligence partners which have themselves obtained it by communications intercept—the legal framework [including RIPA 2000] is inadequate to comply with the “in accordance with the law” requirement under Article 8(2).

(2) In relation to GCHQ’s own generic interception capability, the provisions contained in RIPA relating to external communications warrants allow UKIS to obtain general warrants permitting indiscriminate capturing of vast amounts of communication, effectively on an indefinite basis. The legal provisions which permit generic warrants in relation to such external communications are insufficiently protective to provide an ascertainable check against arbitrary use of secret and intrusive state power.

(3) Such legal provisions do not enable persons to foresee the general circumstances in which external communications may be the subject of surveillance (other than that any use may be made of communications if considered in the interests of national security—a concept of very broad scope in UK law); they do not require authorisations to be granted in relation to specific categories of persons or premises; they permit indiscriminate capture of communications data by reference only to its means of transmission; and they impose no significant restrictions on the access that foreign intelligence partners may have to such intercepted material. In short, there are no defined limits on the scope of discretion conferred on the competent authorities or the manner of its exercise. Moreover, there is no adequate degree of independent or democratic oversight. Indiscriminate and generic interception and the legal provisions under which it is carried out thereby breach the requirements that interferences with Article 8 must be “in accordance with the law” and must be proportionate.

To quote the briefing note, the applicants “are asking the Court to declare that the UK’s internet surveillance practices are disproportionate and that the legislation intended to protect the public’s rights to privacy in this context is not fit for purpose”.

In other words, this is a challenge not only to specific actions, but to the UK’s regulatory regime for surveillance more broadly. The applicants also draw attention (pleadings, paragraph 121.7) to the fact that the Data Protection Act 1998 is powerless to protect personal data in this context, given the exemption for national security at s. 28 of that Act.

Robin Hopkins

Refusal to destroy part of a ‘life story’ justified under Article 8(2) ECHR

The High Court of Justice (Northern Ireland) has today given judgment in In the matter of JR60’s application for judicial review [2013] NIQB 93. The applicant sought to challenge the right of two Social Care Trusts to keep and use various records generated while she was a resident of children’s homes and a training school between 1978 and 1991.

In most cases of challenges to the retention of records, the applicant seeks to expunge information which suggests they have done wrong. This application is interesting because it focused (though not exclusively) on what the applicant had suffered, as opposed to what she had done. In short, she wished to erase from the record a part of her life story which was painful for her to recall. The application failed: there were weightier reasons for retaining those records, and in any event whatever her current wish to forget matters of such import, she might come to change her mind.

The applicant was described as having had a very difficult childhood, to which those records relate. It was not known who her father was. She had grown up to achieve impressive qualifications. Horner J described her as having “survived the most adverse conditions imaginable and triumphed through the force of her will. By any objective measurement she is a success”.

She wished to move on, and to have the records about her childhood expunged. The Trusts refused; their policy was to retain such information for a 75-year period. The applicant challenged this refusal on Article 8 ECHR grounds. Horner J readily agreed that the retention of such information interfered with her rights under Article 8, but dismissed her application on the grounds that the interference was justified.

The applicant had argued that (i) she did not intend to make any claim for ill-treatment or abuse while she was in care, (ii) she did not want to retrieve information about her life story, (iii) she did not want the records to be used to carry out checks on her, as persons who were not in care would not be burdened by such records in respect of their early lives, and (iv) she did not want others, including her own child, to be able to access these records.

In response to the applicant’s assertion that she did not want and did not envisage wanting access to her records, Horner J said this at paragraph 19:

“Even if the applicant does not want to know at present what is in her records, it does not follow that she may not want to find out in the future what they contain for all sorts of reasons. She may, following the birth of a grandchild, be interested in her personal history for that grandchild’s sake. She may want to find out about her genetic inheritance because she may discover, for example, that she, or her off-spring, is genetically predisposed to a certain illness whether mental or physical. She may want to know whether or not this has been passed down through her mother’s side or her father’s side. There may be other reasons about which it is unnecessary to speculate that will make her want to seek out her lost siblings. There are any number of reasons why she may change her mind in the future about accessing her care records. Of course, if the records are destroyed then the opportunity to consider them is lost forever.”

The Trusts argued that they needed to retain such records for the purposes of their own accountability, any background checks on the applicant or related individuals which may become necessary, for the purposes of (hypothetical) public interest issues such as inquiries, and for responding to subject access requests under the Data Protection Act 1998. Horner J observed that the “right for an individual to be able to establish details of his or her identity applies not just to the Looked After Child but also, inter alia, to that child’s offspring”.

In the circumstances, the application failed; the Trusts’ interference with the applicant’s Article 8 rights was justified.

Horner J added a short concluding observation about the DPA (paragraph 29):

“It is significant that no challenge has been made to the Trust’s storage of personal information of the applicant on the basis that such storage constitutes a breach of the Data Protection Act 1998. This act strengthens the safeguards under the 1984 Act which it replaced. The Act protects “personal data which is data relating to a living individual who can be identified from data whether taken alone or read with other information which is the possession (or is likely to come into possession) of the data controller”: see 12-63 of Clayton and Tomlinson on The Law of Human Rights (2nd Edition). It will be noted that “personal” has been interpreted as almost meaning the same as “private”: see Durant v Financial Services Authority [2004] FSR 28 at paragraph [4].”

Robin Hopkins