Privacy and data protection developments in 2013: Google, Facebook, Leveson and more

Data protection law was designed to be a fundamental and concrete dimension of the individual’s right to privacy, the primary safeguard against misuse of personal information. Given those ambitions, it is surprisingly rarely litigated in the UK. It also attracts criticism as imposing burdensome bureaucracy but delivering little in the way of tangible protection in a digital age. Arguably then, data protection law has tended to punch below its weight. There are a number of reasons for this.

One is that Directive 95/46/EC, the bedrock of data protection laws in the European Union, is the product of a largely pre-digital world; its drafters can scarcely have imagined the ubiquity of Google, Twitter, Facebook and the like.

Another is that in the UK, the evolution of Article 8 ECHR and common law privacy and breach of confidence actions has tended to deprive the Data Protection Act 1998 of the oxygen of litigation – before the House of Lords in Campbell v MGN [2004] UKHL 22, for example, it was agreed that the DPA cause of action “added nothing” to the supermodel’s breach of confidence claim (para. 130).

A further factor is that the DPA 1998 has historically lacked teeth: a court’s discretion to enforce subject access rights under s. 7(9) is “general and untrammelled” (Durant v FSA [2003] EWCA Civ 1746 at para. 74); damages under s. 13 can only be awarded if financial loss has been incurred, and the Information Commissioner has, until recently, lacked robust enforcement powers.

This landscape is, however, undergoing very significant changes which (one hopes) will improve data protection’s fitness for purpose and amplify its contribution to privacy law. Here is an overview of some of the more notable developments so far in 2013.

The draft Data Protection Regulation

The most fundamental feature of this landscape is of course EU law. The draft DP Regulation, paired with a draft Directive tailored to the crime and security contexts, was leaked in December 2011 and published in January 2012 (see Panopticon’s analysis here). The draft Regulation, unlike its predecessor, would be directly effective and therefore not dependent on implementation through member states’ domestic legislation. Its overarching aim is harmonisation of data protection standards across the EU: it includes a mechanism for achieving consistency, and a ‘one-stop shop’ regulatory approach (i.e. multinationals are answerable only to their ‘home’ data protection authority). It also tweaks the law on international data transfers, proposes that most organisations have designated data protection officers, offers individuals a ‘right to be forgotten’ and proposes eye-watering monetary penalties for data protection breaches.

Negotiations on that draft Regulation are in full swing: the European Parliament and the Council of the European Union’s DAPIX (Data Protection and Information Exchange) subgroup are working on their recommendations separately before coming together to approve the final text (for more detail on the process, see the ICO’s outline here).

What changes, if any, should be made to the draft before it is finalised? That rather depends on who you ask.

In January 2013, the UK government set out its views on the draft Regulation. It did so in the form of its response to the recommendations of the Justice Select Committee following the latter’s examination of the draft Regulation. This is effectively the government’s current negotiation stance at the EU table. It opposes direct effect (i.e. it wants a directive rather than a regulation), thinks the ‘right to be forgotten’ as drafted is misconceived, favours charging for subject access requests and opposes the mandatory data protection officer requirement. The government considers that promoters of the draft have substantially overestimated the savings which the draft would deliver to business. The government also “believes that the supervisory authorities should have more discretion in the imposition of fines and that the proposed removal of discretion, combined with the higher levels of fines, could create an overly risk-averse environment for data controllers”. For more on its stance, see here.

The ICO also has significant concerns. It opposes the two-stream approach (a mainstream Regulation and a crime-focused Directive) and seeks clarity on pseudonymised data and non-obvious identifiers such as logs of IP addresses. It thinks the EU needs to be realistic about a ‘right to be forgotten’ and about its power over non-EU data controllers. It considers the current proposal to be “too prescriptive in terms of its administrative detail” and unduly burdensome for small and medium-sized enterprises in particular.

Interestingly, while the ICO favours consistency in terms of sanctions, it cautions against total harmonisation on all fronts: “Different Member States have different legal traditions. What is allowed by law is not spelled out in the UK in the way that it is in some other countries’ legal systems. The proposed legislation needs to reflect this, particularly in relation to the concept of ‘legitimate interests’.” For more on the ICO’s current thinking, see here.

Those then are the most influential UK perspectives. At an EU level, the European Parliament’s report on the draft Regulation is more wholeheartedly supportive. The European Parliament’s Industry Committee is somewhat more business-friendly in its focus, emphasising the importance of EU-wide consistency and a ‘one-stop shop’. Its message is clear: business needs certainty on data protection requirements. It also urges further exemptions from data protection duties for small and medium-sized enterprises “which are the backbone of Europe’s economy”. The Industry Committee’s views are available here.

Negotiations continue, the aim being to finalise the text by mid-2013. The European Parliament is likely to press for the final text to resemble the draft very closely. On the other hand, Ireland holds the Presidency of the Council of the European Union – and thus chairs DAPIX – until mid-2013. Its perspective is probably closer to the UK ICO’s in tenor. There are good prospects of at least some of those views being reflected in the final draft.

A number of the themes of the draft Regulation and the current negotiations are already surfacing in litigation, as explained below.

The Leveson Report

Data protection legislation in the UK will be affected not only by EU developments but by domestic ones too.

In recent weeks, debate about Leveson LJ’s report on the culture, practices and ethics of the press has tended to focus on the Defamation Bill which is currently scraping its way through Parliament. In particular, the debate concerns the merits of an apparently Leveson-inspired amendment tabled by Lord Puttnam which, some argue, threatens to derail this legislative overhaul of libel law in the UK (for one angle on this issue, see David Allen Green’s piece in the New Statesman here).

The Leveson Report also included a number of recommendations for changes to the DPA 1998 (see Panopticon’s posts here and here). These included overhauling and expanding the reach of the ICO and allowing courts to award damages even where no financial loss has been suffered (arguably a befitting change to a regime concerned at heart with personal privacy).

The thorniest of Leveson LJ’s DPA recommendations, however, concerned the wide-ranging ‘journalism exemption’ provided by s. 32. The ICO has begun work on a code of practice on the scope and meaning of this exemption. It has conducted a ‘framework consultation’, i.e. one seeking views on the questions to be addressed by the code, rather than the answers at this stage (further consultation will happen once a code has been drafted).

There is potential for this code to exert great influence: s. 32(3) says that in considering whether “the belief of a data controller that publication would be in the public interest was or is a reasonable one, regard may be had to his compliance with” any relevant code of practice – if it has been designated by order of the Secretary of State for this purpose. There is as yet no indication of an appetite for such designation, but it is hoped that, the wiser the code, the stronger the impetus to designate it.

The ICO’s framework consultation closes on 15 March. Watch out for (and respond to) the full consultation in due course.

Google – confidentiality, informed consent and data-sharing

Google (the closest current thing to a real ‘panopticon’?) has been the subject of a flurry of important recent developments.

First, certain EU data protection bodies intend to take “repressive action” against some of Google’s personal data practices. These bodies include the French authority, CNIL (the Commission nationale de l’informatique et des libertés) and the Article 29 Working Party (an advisory body made up of data protection representatives from member states). In October 2012, following an investigation led by CNIL, the Working Party raised what it saw as deficiencies in Google’s confidentiality rules. It recommended, for example, that Google provide users with clearer information on issues such as how personal data is shared across Google’s services, and on Google’s retention periods for personal data. Google was asked to respond within four months. CNIL has reported in recent weeks that Google did not respond. The next step is for the Working Party “to set up a working group, led by the CNIL, in order to coordinate their repressive action which should take place before summer”. It is not clear what type of “repressive action” is envisaged.

Google and the ‘right to be forgotten’

Second, Google is currently involved in litigation against the Spanish data protection authority in the Court of Justice of the EU. The case arises out of complaints made to that authority by a number of Spanish citizens whose names, when Googled, generated results linking them to false, inaccurate or out-of-date information (contrary to the data protection principles) – for example, an old story mentioning that a surgeon had been charged with criminal negligence, without mentioning that he had been acquitted. The Spanish authority ordered Google to remove the offending entries. Google challenged this order, arguing that it was for the authors or publishers of those websites to remedy such matters. The case was referred to the CJEU by the Spanish courts. The questions referred are available here.

The CJEU considered the case at the end of February, with judgment expected in mid-2013. The case is obviously of enormous relevance to Google’s business model (at least as regards the EU). Also, while much has been made about the ‘right to be forgotten’ codified in the draft EU Regulation (see above), this Google case is effectively about whether that right exists under the current law. For a Google perspective on these issues, see this blog post.

Another development closer to home touches on similar issues. The Court of Appeal gave judgment last month in Tamiz v Google [2013] EWCA Civ 68. Mr Tamiz complained to Google about comments on the ‘London Muslim’ blog (hosted by Google) which he contended were defamatory in nature. He asked Google to remove that blog. He also sought permission to serve proceedings on Google in California for defamation occurring between his request to Google and the taking down of the offending blog. Agreeing with Google, the Court of Appeal declined jurisdiction and permission to serve on Google in California.

Mr Tamiz’s case failed on the facts: given the small number of people who would have viewed this blog post in the relevant period, the extra-territorial proceedings ‘would not be worth the candle’.

The important points for present purposes, however, are these: the Court of Appeal held that there was an arguable case that Google was the ‘publisher’ of those statements for defamation purposes, and that it would not have an unassailable defence under s. 1 of the Defamation Act 1996. Google provided the blogging platform subject to conditions and had the power to block or remove content published in breach of those conditions. Following Mr Tamiz’s complaint, Google knew or ought to have known that it was causing or contributing to the ongoing publication of the offending material.

A ‘publisher’ for defamation purposes is not co-extensive with a ‘data controller’ for DPA purposes. Nonetheless, these issues in Tamiz resonate with those in the Google Spain case, and not just because of their ‘right to be forgotten’ subtext. Both cases raise this question: is it right to hold Google to account for its role in making false, inaccurate or misleading personal information available to members of the public? If it is, another question might also arise in due course: to what extent would Leveson-inspired amendments to the s. 32 DPA 1998 exemption (on which the ICO is consulting) affect service providers like Google?

Facebook, Google and jurisdiction

The Google Spain case also involves an important jurisdictional argument. Google’s headquarters are in California. It argued before the CJEU that Google Spain only sells advertising to the parent company, and that these complaints should therefore be considered under US data protection legislation. In other words, it argues, this is not a matter for EU data protection law at all. The Spanish authority argues that Google Spain’s ‘centre of gravity’ is in Spain: it links to Spanish websites, has a Spanish domain name and processes personal data about Spanish citizens and residents.

Victory for Google on this point would significantly curtail the data protection rights of EU citizens in this context.

Also on jurisdictional matters, Facebook has won an important recent victory in Germany. Schleswig-Holstein’s Data Protection Commissioner had ruled that Facebook’s ‘real names policy’ (i.e. its policy against accounts in pseudonymous names only) was unfair and unlawful. The German administrative court granted Facebook’s application for the suspension of that order on the grounds that the issue should instead be considered by the Irish Data Protection Authority, since Facebook is Dublin-based.

Here, then, is an example of ‘one-stop shop’ arguments surfacing under current EU law. The ‘one-stop shop’ principle is clearly very important to businesses. Facebook would no doubt say that its ‘home’ regulator understands its business much better and is therefore best equipped to assess the lawfulness of its practices. The future of EU law, however, is as much about consistency across member states as about offering a ‘one-stop shop’. The tension between ‘home ground advantage’ and EU-wide consistency is one of the more interesting practical issues in the current data protection debate.

Enforcement and penalties issued by the ICO

One of the most striking developments in UK data protection law in recent years has been the ICO’s use of its enforcement and (relatively new) monetary penalty powers.

On the enforcement front, the Tribunal has upheld the ICO’s groundbreaking notice issued against Southampton City Council for imposing audio recording requirements in taxis (see Panopticon’s post here).

The issuing of monetary penalties has continued apace, with the ICO having issued in the region of 30 notices in the last two years. In 2013, two have been issued.

One (£150,000) was on the Nursing and Midwifery Council, for losing three unencrypted DVDs relating to a nurse’s misconduct hearing, which included evidence from two vulnerable children. The second (£250,000) was on a private sector firm, Sony Computer Entertainment Europe Limited, following the hacking of Sony’s PlayStation Network Platform in April 2011, which the ICO considered “compromise[d] the personal information of millions of customers, including their names, addresses, email addresses, dates of birth and account passwords. Customers’ payment card details were also at risk.”

In the only decision of its kind to date, the First-Tier Tribunal upheld a monetary penalty notice issued against Central London Community Healthcare NHS Trust for faxing patient details to the wrong number (see Panopticon’s post here). The First-Tier Tribunal refused the Trust permission to appeal against that decision.

Other penalty notices are being appealed to the Tribunal – these include the Scottish Borders notice (which the Tribunal will consider next week) and the Tetrus Telecoms notice, the first to be issued under the Privacy and Electronic Communications Regulations 2003.

It is only a matter of time before the Upper Tribunal or a higher court considers a monetary penalty notice case. At present, however, there is no binding case law. To that extent, the monetary penalty system is a somewhat uncertain business.

The question of EU-wide consistency raises more fundamental uncertainty, especially when one considers the mandatory fining regime proposed in the draft EU Regulation, with fines of up to €1,000,000 or 2% of the data controller’s global annual turnover.

By way of contrast, 13 administrative sanctions for data protection breaches were issued in France in 2012, the highest fine being €20,000. Enforcement in Germany happens at a regional level, with Schleswig-Holstein regarded as on the stricter end; overall however, few fines are issued in Germany. How the ‘one-stop shop’ principle, the consistency mechanism and the proposed new fining regime will be reconciled is at present anyone’s guess.

From a UK perspective, however, the only point of certainty as regards monetary penalty notices is that there will be no slowing down in the ICO’s consideration of such cases in the short- to medium-term.

It is of course too early to say whether the developments outlined above will elevate data protection law from a supporting to a leading role in protecting privacy. It is clear, however, that – love them or hate them – data protection duties are increasingly relevant and demanding.

Robin Hopkins

Supreme Court: Articles 3, 6 and 8 ECHR in child protection PII case

There have been a number of important privacy judgments in recent weeks, particularly concerning Article 8 ECHR in cases with child protection elements. I have blogged on two Court of Appeal judgments. In the matter of X and Y (Children) [2012] EWCA Civ 1500 (19 November 2012) (Pill, Toulson and Munby LJJ; appeal against a decision of Peter Jackson J in the Family Division) concerned the tension between Articles 8 and 10. A second, more recent Court of Appeal judgment in Durham County Council v Dunn [2012] EWCA Civ 1654 (13 December 2012) (Maurice Kay, Munby and Tomlinson LJJ; appeal against a decision of HHJ Armitage QC) focused on balancing competing rights under Articles 8 (private and family life) and 6 (fair trial).

The Supreme Court has this week handed down an important judgment of the latter variety (Articles 8 and 6, as well as an Article 3 claim) in Re A (A Child) [2012] UKSC 60 (12 December 2012) (Lady Hale, with whom Lords Neuberger, Clarke, Wilson and Reed agreed;  appeal against a decision of McFarlane, Thorpe and Hallett LJJ).

Lady Hale began by summarising the case thus:

“We are asked in this case to reconcile the irreconcilable. On the one hand, there is the interest of a vulnerable young woman (X) who made an allegation in confidence to the authorities that while she was a child she had been seriously sexually abused by the father of a little girl (A) who is now aged 10. On the other hand we have the interests of that little girl, her mother (M) and her father (F), in having that allegation properly investigated and tested. These interests are not only private to the people involved. There are also public interests, on the one hand, in maintaining the confidentiality of this kind of communication, and, on the other, in the fair and open conduct of legal disputes. On both sides there is a public interest in protecting both children and vulnerable young adults from the risk of harm.”

In essence, X made the allegations of past sexual abuse by F to the local authority, but did not wish to take action against F. She asserted her rights to privacy and confidentiality under Article 8 and argued that disclosure of her identity and the details of her allegations would amount to inhuman or degrading treatment contrary to Article 3.

The local authority asserted public interest immunity from disclosure. Lady Hale held that, analysed in terms of common law principles, disclosure should be ordered despite the important public interest in preserving the confidence of people who come forward with allegations of child abuse. At paragraph 30, she said this:

“Those allegations have to be properly investigated and tested so that A can either be protected from any risk of harm which her father may present to her or can resume her normal relationship with him. That simply cannot be done without disclosing to the parents and to the Children’s Guardian the identity of X and the detail and history of the allegations which she has made.”

The same conclusion was reached by analysing the matter in Convention terms. X’s case was primarily based on Article 3. Lady Hale agreed with the Court of Appeal that disclosure would not violate those rights: “The context here is not only that the state is acting in support of some important public interests; it is also that X is currently under the specialist care of a consultant physician and a consultant psychiatrist, who will no doubt do their utmost to mitigate any further suffering which disclosure may cause her” (paragraph 32).

Leaving aside Article 3, Lady Hale concluded that the rights of A, M and F under Articles 8 and 6 outweighed the Article 8 rights of X in the circumstances. A closed procedure seeking to minimise the impact on X’s privacy was not possible here. Furthermore, disclosure would not automatically expose X to the trauma of cross-examination: medical evidence and other means of giving evidence could, for example, be appropriate.

The case is an illuminating instance of extremely strong privacy rights being trumped by a combination of the family life rights of others, and in particular their right to a fair trial. In particular, it illustrates how, when serious allegations are made against individuals, the notion of privacy can cut both ways.

Robin Hopkins

CPR disclosure applications: ignore the DPA; balance Articles 6 and 8 instead

It is increasingly common for requests for disclosure in pre-action or other litigation correspondence to include a subject access request under section 7 of the Data Protection Act 1998. Litigants dissatisfied with the response to such requests often make applications for disclosure. Where an application is made in the usual way (i.e. under the CPR, rather than as a claim under section 7 of the DPA), how should it be approached? As a subject access request, with the “legal proceedings” exemption (section 35) arising for consideration, or as an “ordinary” disclosure application under CPR Rule 31? If the latter, what role (if any) do data protection rights play in the analysis of what should be disclosed?

As the Court of Appeal in Durham County Council v Dunn [2012] EWCA Civ 1654 observed in a judgment handed down today, there is much confusion and inconsistency of approach to these questions. Difficulties are exacerbated when the context is particularly sensitive – local authority social work records being a prime example. Anyone grappling with disclosure questions about records of that type will need to pay close attention to the Dunn judgment.

Background to the disclosure application

Mr Dunn alleged that he had suffered assaults and systemic negligence while in local authority care. He named individual perpetrators. He also said he had witnessed similar acts of violence being suffered by other boys. He brought proceedings against the local authority. His solicitors asked for disclosure of various documents; included in the list of requested disclosure was the information to which Mr Dunn was entitled under section 7 of the DPA. Some documents were withheld from inspection, apparently on data protection grounds.

Mr Dunn made a disclosure application in the usual way, i.e. he did not bring a section 7 DPA claim. The District Judge assessed the application in data protection terms. He ordered disclosure with the redaction of names and addresses of residents of the care facility – but not those of staff members and other agents, who would not suffer the same stigmas or privacy incursions from such disclosure.

Mr Dunn said he could not pursue his claim properly without witnesses and, where appropriate, their contact details. He appealed successfully against the disclosure order. The order for redaction was overturned. The judge’s approach was to consider this under the CPR (this being a civil damages claim) – but to take the DPA into account as a distinct consideration in reaching his disclosure decision.

The relevance of the DPA

The Court of Appeal upheld the use of the CPR as the correct regime for the analysis. It also upheld the appeal judge’s ultimate conclusion. It said, however, that he went wrong in treating the DPA as a distinct consideration when considering a disclosure application under the CPR. With such applications, the DPA is a distraction (paragraphs 21 and 23 of the judgment of Maurice Kay LJ). It is potentially “misleading to refer to a duty to protect data as if it were a category of exemption from disclosure or inspection. The true position is that CPR31, read as a whole, enables and requires the court to excuse disclosure or inspection on public interest grounds” (paragraph 21).

This was not to dismiss the usefulness of a subject access request to those contemplating litigation. See paragraph 16:

“I do not doubt that a person in the position of the claimant is entitled – before, during or without regard to legal proceedings – to make an access request pursuant to section 7. I also understand that such a request prior to the commencement of proceedings may be attractive to prospective claimants and their solicitors. It is significantly less expensive than an application to the Court for disclosure before the commencement of proceedings pursuant to CPR31.16. Such an access request may result in sufficient disclosure to satisfy the prospective claimant’s immediate needs. However, it has its limitations. For one thing, the duty of the data controller under section 7 is not expressed in terms of disclosure of documents but refers to communication of “information” in “an intelligible form”. Although this may be achieved by disclosure of copies of original documents, possibly redacted pursuant to section 7(5), it seems to me that it may also be achievable without going that far. Secondly, if the data subject is dissatisfied by the response of the data controller, his remedy is by way of proceedings pursuant to section 7 which would be time-consuming and expensive in any event. They would also engage the CPR at that stage: Johnson v Medical Defence Union [2005] 1 WLR 750; [2004] EWHC 2509 (Ch).”

Instead, the CPR disclosure analysis should balance Article 6 and Article 8 rights in the context of the particular litigation.

Maurice Kay LJ summed up the requisite approach as follows:

“What does that approach require? First, obligations in relation to disclosure and inspection arise only when the relevance test is satisfied. Relevance can include “train of inquiry” points which are not merely fishing expeditions. This is a matter of fact, degree and proportionality. Secondly, if the relevance test is satisfied, it is for the party or person in possession of the document or who would be adversely affected by its disclosure or inspection to assert exemption from disclosure or inspection. Thirdly, any ensuing dispute falls to be determined ultimately by a balancing exercise, having regard to the fair trial rights of the party seeking disclosure or inspection and the privacy or confidentiality rights of the other party and any person whose rights may require protection. It will generally involve a consideration of competing ECHR rights. Fourthly, the denial of disclosure or inspection is limited to circumstances where such denial is strictly necessary. Fifthly, in some cases the balance may need to be struck by a limited or restricted order which respects a protected interest by such things as redaction, confidentiality rings, anonymity in the proceedings or other such order. Again, the limitation or restriction must satisfy the test of strict necessity.”

How to approach disclosure of social work records in litigation

This issue was dealt with by Munby LJ. In short, the main question was whether those seeking to withhold or redact social work records in litigation should analyse the issue in terms of public interest immunity (as some textbooks, older authorities and even the White Book appeared to suggest) or in terms of a balancing between competing rights under the ECHR (in particular, Articles 6 and 8).

Munby LJ made clear that the right answer is the latter. Where information contained in social work records is to be withheld in legal proceedings, this should not now be on the basis of a claim to public interest immunity; we are “a world away from 1970 or even 1989” (paragraph 43). This was despite the fact that “the casual reader of the White Book” (paragraph 31.3.33 in particular) could be forgiven for thinking that PII applies to local authority social work records. Here Munby LJ said he “would respectfully suggest that the treatment of this important topic in the White Book is so succinct as to be inadvertently misleading” (paragraph 48).

Importantly, Munby LJ also went on to explain how (and with what stringency) Article 8 rights to privacy and the protection of personal information should be approached when disclosing information pursuant to litigation. At paragraph 50, he gave the following guidance:

“… particularly in the light of the Convention jurisprudence, disclosure is never a simply binary question: yes or no. There may be circumstances, and it might be thought that the present is just such a case, where a proper evaluation and weighing of the various interests will lead to the conclusion that (i) there should be disclosure but (ii) the disclosure needs to be subject to safeguards. For example, safeguards limiting the use that may be made of the documents and, in particular, safeguards designed to ensure that the release into the public domain of intensely personal information about third parties is strictly limited and permitted only if it has first been anonymised. Disclosure of third party personal data is permissible only if there are what the Strasbourg court in Z v Finland (1998) 25 EHRR 373, paragraph 103, referred to as “effective and adequate safeguards against abuse.” An example of an order imposing such safeguards can be found in A Health Authority v X (Discovery: Medical Conduct) [2001] 2 FLR 673, 699 (appeal dismissed A Health Authority v X [2001] EWCA Civ 2014, [2002] 1 FLR 1045).”

Robin Hopkins

Redacting for anonymisation: Article 8 v Article 10 in child protection context

Panopticon has reported recently on the ICO’s new Code of Practice on Anonymisation: see Rachel Kamm’s post here. That Code offers guidance for ensuring data protection-compliant disclosure in difficult cases such as those involving apparently anonymous statistics, and situations where someone with inside knowledge (or a ‘motivated intruder’) could identify someone referred to anonymously in a disclosed document. The Upper Tribunal in Information Commissioner v Magherafelt District Council [2012] UKUT 263 (AAC) grappled with those issues earlier this year in the context of disclosing a summarised schedule of disciplinary action.

Redaction is often crucial in achieving anonymisation. Getting redaction right can be difficult: too much redaction undermines transparency, too little undermines privacy. The Court of Appeal’s recent judgment In the matter of X and Y (Children) [2012] EWCA Civ 1500 is a case in point. It involved the publication of a summary report from a serious case review by a Welsh local authority’s Safeguarding Children Board. The case involved very strong competing interests in terms of Article 8 and Article 10 ECHR. For obvious reasons (anonymity being the key concern here) little could be said of the underlying facts, but the key points are these.

A parent was convicted in the Crown Court of a serious offence relating to one of the children of the family (X). The trial received extensive coverage in the local media. The parent was named and the parent’s address was given. The fact that there were other siblings was reported, as was their number. All of this coverage was lawful.

The local authority’s Safeguarding Children Board conducted a Serious Case Review in accordance with the provisions of the Children Act 2004 and The Local Safeguarding Children Boards (Wales) Regulations 2006. Those Regulations require the Board to produce an “overview report” and also an anonymised summary of the overview report. The relevant Guidance provides that the Board should also “arrange for an anonymised executive summary to be prepared, to be made publicly available at the principal offices of the Board”.

Here two features of the draft Executive Summary were pivotal.

First, reference was made to the proceedings in the Crown Court in such a way as would enable many readers to recognise immediately which family was being referred to, and would enable anyone else so inclined to obtain that information by only a few minutes’ searching of the internet.

Second, it referred, and in some detail, to the fact, which had not emerged during the proceedings in the Crown Court and which is not in the public domain, that another child in the family (Y), had also been the victim of parental abuse.

The local authority wanted to publish the Executive Summary, seeking to be transparent about its efforts to put right what went wrong and to show that it had learned lessons from X’s death. It recognised the impact on Y, but argued for a relaxation of a restricted reporting order to allow it to publish the Executive Summary with some redactions. It was supported by media organisations who were legally represented.

The judge (Peter Jackson J) undertook a balance of interests under Articles 8 and 10. He allowed publication, with redactions which were (in the Court of Appeal’s words) “in substance confined to three matters: the number, the gender and the ages of the children.”

In assessing the adequacy of these redactions, the Court of Appeal considered this point from the judgment of Baroness Hale in ZH (Tanzania) v Secretary of State for the Home Department [2011] UKSC 4, [2011] 2 AC 166, at paragraph 33:

“In making the proportionality assessment under article 8, the best interests of the child must be a primary consideration. This means that they must be considered first. They can, of course, be outweighed by the cumulative effect of other considerations.”

Munby LJ thus concluded (paragraph 47 of his judgment) that “it will be a rare case where the identity of a living child is not anonymised”.

He recognised, on the other hand, that Article 10 factors always retained their importance: “there could be circumstances where the Article 8 claims are so dominant as to preclude publication altogether, though I suspect that such occasions will be very rare.”

On the approach to anonymisation through redaction, Munby LJ had this to say (paragraph 48):

“In some cases the requisite degree of anonymisation may be achieved simply by removing names and substituting initials. In other cases, merely removing a name or even many names will be quite inadequate. Where a person is well known or the circumstances are notorious, the removal of other identifying particulars will be necessary – how many depending of course on the particular circumstances of the case.”
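By way of loose illustration of the name-to-initials substitution Munby LJ mentions, here is a minimal sketch (the names and the helper function are hypothetical; as the judgment makes clear, in notorious cases mere name removal is quite inadequate, and real anonymisation must also address addresses, case references and other identifying particulars):

```python
import re

def redact_names(text, names):
    """Replace each listed full name with initials, e.g. 'Jane Smith' -> 'J.S.'.

    A crude sketch only: pattern substitution alone cannot defeat the
    'motivated intruder' who combines the redacted text with other
    public sources (press reports, court listings and the like).
    """
    for name in names:
        initials = ".".join(part[0] for part in name.split()) + "."
        text = re.sub(re.escape(name), initials, text)
    return text

summary = "Jane Smith of 12 High Street appeared before the Crown Court."
print(redact_names(summary, ["Jane Smith"]))
```

Note that even after the substitution, the (hypothetical) address and the reference to the Crown Court proceedings remain, which is precisely the kind of residual identifying detail that proved fatal to the redactions in this case.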

In the present case, the redactions had been inadequate. They did not “address the difficulty presented by the two key features of the draft, namely, the reference to the proceedings in the Crown Court and the reference to the fact that Y had also been the victim of parental abuse” (paragraph 53).

Far more drastic redaction was required in these circumstances: to that extent, privacy trumped transparency, notwithstanding the legislation and the Guidance’s emphasis on disclosure. In cases such as this (involving serious incidents with respect to children), those taking disclosure decisions should err on the side of heavy redaction.

Robin Hopkins


Internet traffic data and debt collection: privacy implications

Mr Probst was a subscriber to the internet service provider (ISP) Verizon. He failed to pay his bill. A company called ‘nexnet’, the assignee of Verizon’s debt, sought to collect the sums due. In doing so, it obtained and used his internet traffic data in accordance with its ‘data protection and confidentiality agreement’ with Verizon. Disinclined to pay up, Mr Probst argued that nexnet had processed his personal data unlawfully and that the relevant terms of its agreement with Verizon purporting to sanction that processing were void. The first-instance German court agreed with him, but the appellate court did not.

It referred a question to the CJEU concerning Directive 2002/58 (the privacy and electronic communications Directive), which seeks to “particularise and complement” the Data Protection Directive 95/46/EC.

Article 5(1) of the 2002 Directive provides for confidentiality in respect of electronic communications and traffic data. Article 6(1) says that traffic data must be “erased or made anonymous when it is no longer needed for the purpose of the transmission of a communication”, unless one of the exceptions in that Article applies. The relevant provisions here were Articles 6(2) and (5). The first allows traffic data to be processed for subscriber billing purposes – but only within a specified time period. The second allows for processing of such data by an ISP’s authorised agent only for specified activities and only insofar as is necessary for those activities. The provisions are worded as follows:

(2) Traffic data necessary for the purposes of subscriber billing and interconnection payments may be processed. Such processing is permissible only up to the end of the period during which the bill may lawfully be challenged or payment pursued.

(5) Processing of traffic data, in accordance with paragraphs 1, 2, 3 and 4, must be restricted to persons acting under the authority of providers of the public communications networks and publicly available electronic communications services handling billing or traffic management, customer enquiries, fraud detection, marketing electronic communications services or providing a value added service, and must be restricted to what is necessary for the purposes of such activities.

In Probst v mr.nexnet GmbH (Case C‑119/12), the Third Chamber of the CJEU essentially had to decide whether, and in what circumstances, Articles 6(2) and (5) allow an ISP to pass traffic data to the assignee of its claims for payment such that the latter may process those data. Its starting point was that Articles 6(2) and (5) were exceptions to the general principle of confidentiality with respect to one’s internet traffic data. They therefore needed to be construed strictly.

As regards Article 6(2), Mr Probst had argued that nexnet was not in the business of ‘billing’, but in the business of debt collection. The referring court’s view was that, for data protection purposes, those activities were sufficiently closely connected to be treated identically. The Third Chamber agreed. It found that, by authorising traffic data processing ‘up to the end of the period during which the bill may lawfully be challenged or payment pursued’, Article 6(2) relates not only to data processing at the time of billing but also to the processing necessary for securing payment thereof.

As to Article 6(5), the Court held “that a person acts under the authority of another where the former acts on instructions and under the control of the latter”.

The next question was essentially: what does a data protection-compliant contract between an ISP and a third party (an agent, assignee or someone to whom an activity is outsourced) look like? Must the ISP actually be able to determine the use of the data by the third party, including on a case-by-case basis, throughout the duration of the data processing? Or is it sufficient that its contract with the third party contains general rules about the privacy of telecommunications and data protection and provides for data to be erased or returned on request?

The Court emphasised that outsourcing or assignment may not result in lower levels of protection for individuals’ personal data (paragraph 26). The contract must be sufficiently specific. It must, for example, provide for the immediate and irreversible erasure or return of data as soon as knowledge thereof is no longer necessary for the recovery of the claims concerned. The controller (here, the ISP) must be in a position to check and ensure compliance with the privacy and data protection measures agreed under the contract, and the contract must provide for the ISP to be able to request the return or erasure of the data.

The issue in the Probst case (how to balance privacy and legal rights to monies owed) has obvious parallels with measures to combat copyright infringement (how to balance privacy and legal rights to intellectual property). I have blogged on copyright and privacy issues here and here.

The Probst judgment is an important confirmation of general principles about privacy with respect to one’s internet data. The implications for all sorts of contracts involving such data are clear – cloud computing arrangements, for example (on which, see Panopticon’s post here).

It is increasingly important that those contracts provide for specific and enforceable safeguards against unlawful processing of personal data. The Data Protection Directive will change before too long, but these principles will not.

Robin Hopkins

Important developments in surveillance law: RIPA and CCTV

Important changes to the Regulation of Investigatory Powers Act 2000 come into force on 1 November 2012, thanks to the Protection of Freedoms Act 2012 (Commencement No. 2) Order 2012, passed last week. This is an extremely important development for local authorities.

Local authorities are empowered under RIPA to use three surveillance techniques: directed surveillance, the deployment of a Covert Human Intelligence Source (CHIS) and accessing communications data. Early in its term, the Coalition government indicated that it would impose additional safeguards on local authorities’ use of such powers, responding in part to concerns aired by Big Brother Watch and others (see our post here and the recent ‘Grim RIPA’ report here). Chapter 2 of Part 2 of the Protection of Freedoms Act 2012 amended RIPA so as to require local authorities to obtain the approval of a magistrate for any authorisation for the use of a covert investigatory technique.

The procedure for obtaining judicial approval may be much like that involved in obtaining search warrants. It remains to be seen how closely magistrates will scrutinise the reasoning and evidence supporting an authorisation so as to ensure that the conditions laid down by RIPA – in particular, necessity and proportionality – are satisfied. Ibrahim Hasan has discussed the changes in his Local Government Lawyer piece here.

Last week also saw a second important announcement on surveillance. The government has announced that it is busy with preparatory work on a new CCTV code of practice, with the aim of consulting on the draft code over the autumn and bringing the new code into force in April 2013. Authorities specified in s. 33(5) of the Protection of Freedoms Act 2012 have a duty to have regard to the code, and other system operators will be encouraged to adopt it on a voluntary basis.

The Home Office Minister, Jeremy Browne MP, told the House of Commons last week that the government is “committed to ensuring that any deployment in public places of surveillance cameras, including closed circuit television (CCTV) and automatic number plate recognition (ANPR), is appropriate, proportionate, transparent and effective in meeting its stated purpose”.

Oversight of – and independent recommendations about – the new code will fall to Andrew Rennison, who will remain in post as both surveillance camera commissioner and forensic science regulator until February 2014.

If one adds the Local Authorities (Executive Arrangements) (Meetings and Access to Information) (England) Regulations 2012, also passed last week (see my post here), this is clearly a time of great flux in terms of the information law landscape for local authorities in particular.

Robin Hopkins