Google and data protection: no such thing as the ‘right to be forgotten’

Chris Knight has blogged recently about enforcement action against Google by European Data Protection authorities (but not yet the UK’s ICO). I blogged last month about a German case (BGH, VI ZR 269/12 of 14th May 2013) concerning Google’s ‘autocomplete’ function, and earlier this year about the Google Spain case (Case C‑131/12). The latter arises out of complaints made to the Spanish data protection authority by a number of Spanish citizens whose names, when Googled, generated results linking them to allegedly false, inaccurate or out-of-date information (contrary to the data protection principles) – for example an old story mentioning a surgeon’s being charged with criminal negligence, without mentioning that he had been acquitted. The Spanish authority ordered Google to remove the offending entries. Google challenged this order, arguing that it was for the authors or publishers of those websites to remedy such matters. The case was referred to the CJEU by the Spanish courts.

Advocate General Jääskinen this week issued his opinion in this case.

The first point concerns territorial jurisdiction. Google claims that no processing of personal data relating to its search engine takes place in Spain. Google Spain acts merely as commercial representative of Google for its advertising functions. In this capacity it has taken responsibility for the processing of personal data relating to its Spanish advertising customers. The Advocate General has disagreed with Google on this point. His view is that national data protection legislation is applicable to a search engine provider when it sets up in a member state, for the promotion and sale of advertising space on the search engine, an office which orientates its activity towards the inhabitants of that state.

The second point is substantive, and is good news for Google. The Advocate General says that Google is not generally to be considered – either in law or in fact – as a ‘data controller’ of the personal data appearing on web pages it processes. It has no control over the content included on third party web pages and cannot even distinguish between personal data and other data on those pages.

Thirdly, the Advocate General tells us that there is no such thing as the so-called “right to be forgotten” (a favourite theme of debates on the work-in-progress new Data Protection Regulation) under the current Directive. The Directive offers safeguards as to accuracy and so on, but Google had not itself said anything inaccurate here. At paragraph 108 of his opinion, the Advocate General says this:

“… I consider that the Directive does not provide for a general right to be forgotten in the sense that a data subject is entitled to restrict or terminate dissemination of personal data that he considers to be harmful or contrary to his interests. The purpose of processing and the interests served by it, when compared to those of the data subject, are the criteria to be applied when data is processed without the subject’s consent, and not the subjective preferences of the latter. A subjective preference alone does not amount to a compelling legitimate ground within the meaning of Article 14(a) of the Directive.”

It remains to be seen of course whether the Court agrees with the Advocate General. The territorial issue and the ‘data controller’ question are of great significance to Google’s business model – and to those whose businesses face similar issues. The point about objectivity rather than subjectivity being the essential yardstick for compliance with data protection standards is potentially of even wider application.

“This is a good opinion for free expression,” Bill Echikson, a spokesman for Google, said in an e-mailed statement reported by Bloomberg.

Robin Hopkins

New CCTV Code of Practice: surveillance and the protection of freedoms

Surveillance of the covert and digital variety has been dominating the news of late. The legal contours of the practices leaked by Edward Snowden (the NSA’s obtaining of internet metadata) and covered by The Guardian (most recently, GCHQ’s monitoring of certain communications of ‘friendly’ foreign allies) may be matters of some debate.

In the meantime, the legal contours of a more overt and physical variety of surveillance – CCTV – have been somewhat clarified.

Panopticon indeed.

As its name suggests, the Protection of Freedoms Act 2012 expressed the incoming Coalition Government’s commitment to keeping in check the state’s surveillance of ordinary citizens. By that Act (sections 29-36), the Home Secretary was to present to Parliament a Code of Practice governing the use of surveillance camera systems including CCTV and Automatic Number Plate Recognition (ANPR). Following a consultation exercise – the response to which can be read here – the Home Secretary has now done so. The Code was laid before Parliament on 4 June 2013. A draft order (the Protection of Freedoms Act 2012 (Code of Practice for Surveillance Camera Systems and Specification of Relevant Authorities) Order 2013) is currently being considered by Parliament’s Joint Committee on Statutory Instruments.

Pending its coming into force, Panopticon summarises the key features of the new Code.

To whom does the Code apply?

The Code imposes duties on ‘relevant authorities’, which are those listed at section 33(5) of the Protection of Freedoms Act 2012 – in the main, local authorities and policing authorities.

The draft order proposes to add the following to the list of relevant authorities:

(a) The chief constable of the British Transport Police;

(b) The Serious Organised Crime Agency;

(c) The chief constable of the Civil Nuclear Constabulary; and

(d) The chief constable of the Ministry of Defence Police.

The Code recognises that concern about the use of surveillance cameras often extends beyond these sorts of full-blooded ‘public’ authorities. It recognises that the list of relevant authorities may need to be expanded in future to encompass shopping centres, sports grounds, schools, transport centres and the like.

For now, however, only those listed as ‘relevant authorities’ are subject to the duties imposed by the Code. Others who use such surveillance systems are ‘encouraged’ to abide by the Code.

What duty is imposed by the Code?

The Code imposes a ‘have regard to’ duty. In other words, relevant authorities are required to have regard to the Code when exercising any of the functions to which the Code relates. As regards its legal effects:

“A failure on the part of any person to act in accordance with any provision of this code does not of itself make that person liable to criminal or civil proceedings. This code is, however, admissible in evidence in criminal or civil proceedings, and a court or tribunal may take into account a failure by a relevant authority to have regard to the code in determining a question in any such proceedings” (paragraph 1.16).

It may well be that the Code will also weigh heavily with the ICO when it considers complaints that the use of surveillance cameras has breached the DPA 1998.

Remember that the Home Office Code sits alongside and does not replace the ICO’s CCTV Code of Practice.

What types of activity are covered by the new Code?

Relevant authorities must have regard to the Code ‘when exercising any of the functions to which the Code relates’. This encompasses the operation and use of, and the processing of data derived from, surveillance camera systems in public places in England and Wales, regardless of whether there is any live viewing or recording of images and associated data.

The Code does not apply to covert surveillance, as defined under the Regulation of Investigatory Powers Act 2000.

What about third party contractors?

Where a relevant authority instructs or authorises a third party to use surveillance cameras, that third party is not under the ‘have regard to’ duty imposed by the Code. That duty does, however, apply to the relevant authority’s arrangements.

By paragraph 1.11:

“The duty to have regard to this code also applies when a relevant authority uses a third party to discharge relevant functions covered by this code and where it enters into partnership arrangements. Contractual provisions agreed after this code comes into effect with such third party service providers or partners must ensure that contractors are obliged by the terms of the contract to have regard to the code when exercising functions to which the code relates.”

The approach

The guiding philosophy of the Code is one of surveillance by consent:

 “The government considers that wherever overt surveillance in public places is in pursuit of a legitimate aim and meets a pressing need, any such surveillance should be characterised as surveillance by consent, and such consent on the part of the community must be informed consent and not assumed by a system operator…. [legitimacy] in the eyes of the public is based upon a general consensus of support that follows from transparency about their powers, demonstrating integrity in exercising those powers and their accountability for doing so” (paragraph 1.5).

In a nutshell, the expectation is this:

“The decision to use any surveillance camera technology must, therefore, be consistent with a legitimate aim and a pressing need. Such a legitimate aim and pressing need must be articulated clearly and documented as the stated purpose for any deployment. The technical design solution for such a deployment should be proportionate to the stated purpose rather than driven by the availability of funding or technological innovation. Decisions over the most appropriate technology should always take into account its potential to meet the stated purpose without unnecessary interference with the right to privacy and family life. Furthermore, any deployment should not continue for longer than necessary” (paragraph 2.4).

The guiding principles

The Code then sets out 12 guiding principles which systems operators should follow:

(1) Use of a surveillance camera system must always be for a specified purpose which is in pursuit of a legitimate aim and necessary to meet an identified pressing need.

(2) The use of a surveillance camera system must take into account its effect on individuals and their privacy, with regular reviews to ensure its use remains justified.

(3) There must be as much transparency in the use of a surveillance camera system as possible, including a published contact point for access to information and complaints.

(4) There must be clear responsibility and accountability for all surveillance camera system activities including images and information collected, held and used.

(5) Clear rules, policies and procedures must be in place before a surveillance camera system is used, and these must be communicated to all who need to comply with them.

(6) No more images and information should be stored than that which is strictly required for the stated purpose of a surveillance camera system, and such images and information should be deleted once their purposes have been discharged.

(7) Access to retained images and information should be restricted and there must be clearly defined rules on who can gain access and for what purpose such access is granted; the disclosure of images and information should only take place when it is necessary for such a purpose or for law enforcement purposes.

(8) Surveillance camera system operators should consider any approved operational, technical and competency standards relevant to a system and its purpose and work to meet and maintain those standards.

(9) Surveillance camera system images and information should be subject to appropriate security measures to safeguard against unauthorised access and use.

(10) There should be effective review and audit mechanisms to ensure legal requirements, policies and standards are complied with in practice, and regular reports should be published.

(11) When the use of a surveillance camera system is in pursuit of a legitimate aim, and there is a pressing need for its use, it should then be used in the most effective way to support public safety and law enforcement with the aim of processing images and information of evidential value.

(12) Any information used to support a surveillance camera system which compares against a reference database for matching purposes should be accurate and kept up to date.

Points to note

The Code then fleshes out those guiding principles in more detail. Here are some notable points:

A surveillance camera system “should not be used for other purposes that would not have justified its establishment in the first place” (paragraph 3.1.3).

“People do, however, have varying and subjective expectations of privacy with one of the variables being situational. Deploying surveillance camera systems in public places where there is a particularly high expectation of privacy, such as toilets or changing rooms, should only be done to address a particularly serious problem that cannot be addressed by less intrusive means” (paragraph 3.2.1).

“Any proposed deployment that includes audio recording in a public place is likely to require a strong justification of necessity to establish its proportionality. There is a strong presumption that a surveillance camera system must not be used to record conversations as this is highly intrusive and unlikely to be justified” (paragraph 3.2.2).

“Any use of facial recognition or other biometric characteristic recognition systems needs to be clearly justified and proportionate in meeting the stated purpose, and be suitably validated. It should always involve human intervention before decisions are taken that affect an individual adversely” (paragraph 3.3.3).

“This [the requirement to publicise as much as possible about the use of a system] is not to imply that the exact location of surveillance cameras should always be disclosed if to do so would be contrary to the interests of law enforcement or national security” (paragraph 3.3.6).

“It is important that there are effective safeguards in place to ensure the forensic integrity of recorded images and information and its usefulness for the purpose for which it is intended to be used. Recorded material should be stored in a way that maintains the integrity of the image and information, with particular importance attached to ensuring that meta data (e.g. time, date and location) is recorded reliably, and compression of data does not reduce its quality” (paragraph 4.12.2).

Enforcement

The Surveillance Camera Commissioner is a statutory appointment made by the Home Secretary under section 34 of the Protection of Freedoms Act 2012. The Commissioner has no enforcement or inspection powers. However, in encouraging compliance with the Code, he “should consider how best to ensure that relevant authorities are aware of their duty to have regard for the Code and how best to encourage its voluntary adoption by other operators of surveillance camera systems” (paragraph 5.3). The Commissioner is to be assisted by a non-statutory Advisory Council with its own specialist subgroups.

Given the limited remit of the Surveillance Camera Commissioner, it may be that the Code shows its teeth more effectively in complaints to the ICO and/or the courts.

Robin Hopkins

Damages under section 13 DPA: Court of Appeal’s judgment in Halliday

I blogged a while ago about the ex tempore judgment from the Court of Appeal in a potentially groundbreaking case on damages under section 13 of the DPA, namely Halliday v Creation Consumer Finance [2013] EWCA Civ 333. The point of potential importance was that ‘nominal damages’ appeared to suffice for the purposes of section 13(1), thereby opening up section 13(2). In short, the point is that claimants under the DPA cannot be compensated for distress unless they have also suffered financial harm. A ‘nominal damages’ approach to the concept of financial harm threatened to make the DPA’s compensation regime dramatically more claimant-friendly.

The Court of Appeal’s full judgment is now available. As pointed out on Jon Baines’ blog, ground has not been broken: the ‘nominal damages’ point was a concession by the defendant rather than a determination by the Court. See paragraph 3 of the judgment of Lady Justice Arden:

“… this issue, which was the main issue of the proposed appeal to this court, is now academic as the respondent, CCF, concedes an award of nominal damages is “damage” for the purposes of the Directive and for the purposes of section 13(2) of the Data Protection Act 1998.”

Other potentially important points have also fallen somewhat flat. The question of whether UK law provided an adequate remedy for a breach of a right conferred by a European Directive fell away on the facts (“proof fell short in relation to the question of damage to reputation and credit”), while the provision for sanctions under Article 24 of Directive 95/46/EC was neither directly enforceable by Mr Halliday nor of assistance to him.

Still, the judgment is not without its notable points.

One is the recognition that compensation for harm suffered is a distinct matter from penalties for wrongdoing; the former is a matter for the courts in the DPA context, the latter a matter for the Information Commissioner and his monetary penalty powers. Such was the implication of paragraph 11:

“… it is not the function of the civil court, unless specifically provided for, to impose sanctions. That is done in other parts of the judicial system.”

Another point worth noting is Lady Justice Arden’s analysis of distress and the causation thereof. The distress must be caused by the breach, not by other factors such as (in this case) a failure to comply with a court order. See paragraph 20:

“Focusing on subsection (2), it is clear that the claimant has to be an individual, that he has to have suffered distress, and that the distress has to have been caused by contravention by a data controller of any of the requirements of the Act. In other words, this is a remedy which is not for distress at large but only for contravention of the data processing requirements. It also has to be distress suffered by the complainant and therefore would not include distress suffered by family members unless it was also suffered by him. When I say that it has to be caused by breach of the requirements of the Act, the distress which I accept Mr Halliday would have felt at the non-compliance of the order is not, at least directly, relevant because that is not distress by reason of the contravention by a data controller of the requirements of this Act. If the sole cause of the distress had been non-compliance with a court order, then that would have lain outside the Act unless it could be shown that it was in substance about the non-compliance with the Data Protection Act.”

The claimant had sought to draw an analogy with the guidelines and banding for discrimination awards set in Vento v Chief Constable of West Yorkshire Police [2003] ICR 318. The Court of Appeal was not attracted. See paragraph 26:

“In answer to that point, the field of discrimination is, it seems to me, not a helpful guide for the purposes of data protection. Discrimination is generally accompanied by loss of equality of opportunity with far-reaching effects and is liable to cause distinct and well-known distress to the complainant.”

Finally, Lady Justice Arden commented as follows concerning the level of the compensation to be awarded on the facts of this case: “in my judgment the sum to be awarded should be of a relatively modest nature since it is not the intention of the legislation to produce some kind of substantial award. It is intended to be compensation, and thus I would consider it sufficient to render an award in the sum of £750” (paragraph 36).

Lord Justice Lloyd (who, along with Mr Justice Ryder, agreed with Lady Justice Arden) did pause over the submission ‘if you were so distressed, why did you not complain immediately?’, but concluded as follows (paragraph 47):

“I confess that I was somewhat impressed at one point by Mr Capon’s submission that it was a surprise, if Mr Halliday was so distressed by this contravention, that he did not immediately protest upon discovering, in response to his first credit reference enquiry, the fact of the contravention, and indeed he did not protest until about a month after the second report had been obtained. But I bear in mind, in response to that, Mr Halliday’s comment that he had had such difficulty in getting any sensible response, or indeed any response, out of CCF at the earlier stage, that it is perhaps less surprising that he did not immediately protest. In any event, the period in question is not a very lengthy one between his discovery of the contravention by his first reference request and his taking action in July. Accordingly, it does not seem to me that that is a matter that should be taken to reduce my assessment of the degree of distress that he suffered.”

Robin Hopkins

Google: autocomplete and the frontiers of privacy

Unsurprisingly, the frontiers of privacy and data protection law are often explored and extended by reference to what Google does. Panopticon has, for example, covered disputes over Google Street View (on which a US lawsuit was settled in recent months), Google’s status as a ‘publisher’ of blogs containing allegedly defamatory material (see Tamiz v Google [2013] EWCA Civ 68) and its responsibility for search results directing users to allegedly inaccurate or out-of-date personal data (see Google Spain v Agencia Espanola de Proteccion de Datos (application C-131/12), in which judgment is due in the coming months).

A recent decision of a German appellate court appears to have extended the frontiers further. The case (BGH, VI ZR 269/12 of 14th May 2013) concerned Google’s ‘autocomplete’ function. When the complainants’ names were typed into Google’s search bar, the autocomplete function added the ensuing words “Scientology” and “fraud”. This was not because there was a great deal of content linking the complainants with those terms. Rather, it was because these were the terms other Google users had most frequently searched for in conjunction with their names – apparently prompted by rumours the truth or accuracy of which the complainants denied. They complained that the continuing association of their names with these terms infringed their rights to personality and reputation as protected by German law (Articles 823(1) and 1004 of the German Civil Code).

In the Google Spain case, Google has said that the responsibility lies with the generators of the content, not with the search engine which offers users that content. In the recent German case, Google has argued in a similar vein that the autocomplete suggestions are down to what other users have searched for, not what Google says or does.

In allowing the complainants’ appeals, the Federal Court of Justice in Karlsruhe has disagreed with Google. The result is that once Google has been alerted to the fact that an autocomplete suggestion links someone to libellous words, it must remove that suggestion. The case is well covered by Jeremy Phillips at IPKat and by Karin Matussek of Bloomberg in Berlin.

The case is important in terms of the frontiers of legal protection for personal integrity and how we allocate responsibility for harm. Google says that, in these contexts, it is a facilitator, not a generator. It says it should not be liable for what people write (see Tamiz and Google Spain), nor for what they search for (the recent German case). Not for the first time, courts in Europe have allocated responsibility differently.

Notably, this case was not brought under data protection law. In principle, it seems that such complaints could be expressed in data protection terms. Perhaps, if the EU’s final Data Protection Regulation retains the severe penalty provisions proposed in the draft version, data protection will move centre-stage in these sorts of cases.

Robin Hopkins

Data protection: trends, possibilities and FOI disclosures

At 11KBW’s information law seminar in May, one of the discussion topics was ‘the future of data protection’. Here are some further thoughts on some interesting trends and developments.

Progress at the EU level

A major issue on this front is of course progress on the draft EU Data Protection Regulation – on which see this blog post from the ICO’s David Smith for an overview of the issues currently attracting the most debate. While that negotiation process runs its course, the Article 29 Working Party continues to provide influential guidance for users and regulators on some of the thorniest data protection issues. Its most recent opinion addresses purpose limitation, i.e. the circumstances under which data obtained for one purpose can be put to another. A summary of its views is available here.

Subject access requests

Turning to domestic DPA litigation in the UK, practitioners should watch out for a number of other developments (actual or potential) over the coming months. On the subject access request front, for example, data controllers have tended to take comfort from two themes in recent judgments (such as Elliott and Abadir, both reported on Panopticon). In short, the courts in those cases have agreed (i) that data controllers need only carry out reasonable and proportionate searches, and (ii) that section 7(9) claims pursued for the collateral purpose of aiding other substantive litigation will be an abuse of process.

Data controllers should, however, note that neither of those points is free from doubt: there are plenty who doubt the legal soundness of the proportionality point, and the abuse of process point has so far arisen only in the context of section 7(9) claims before the court – it should not, in other words, be relied upon too readily as a basis for refusing the requests themselves.

Damages

Damages under section 13 of the DPA are another area of potentially important change. The Halliday v Creation Consumer Finance case (briefly reported by Panopticon) has been given further discussion in the Criminal Law & Justice Weekly here. Based on that information, perhaps the most interesting point is this: defendants have rightly taken comfort from the requirement under section 13 that compensation for distress can be awarded only where damage has also been suffered. In Halliday, however, nominal damages (of £1) were awarded, thereby apparently fulfilling the ‘damage’ requirement and opening the door to a ‘distress’ award (though note that Panopticon has not yet seen a full judgment from the Court of Appeal in this case, so do not take this as a definitive account). If that approach becomes standard practice, claimants may be in a much stronger position when seeking damages.

A further potential development on the damages front arises out of monetary penalty notices: data controllers who are subject to hefty penalties by the ICO may in some cases also find themselves facing section 13 claims from the affected data subjects themselves, presenting a worrying prospect of paying out twice for the same mistake.

Disclosure of personal data in the FOIA context

In general terms, requesters struggle to obtain the personal data of others through FOIA requests. A couple of very recent decisions have, however, gone the other way.

In White v IC and Carmarthenshire County Council (EA/2012/0238), the First-Tier Tribunal allowed the requester’s appeal and ordered disclosure of a list of licensed dog-breeders in the council’s area. In particular, it concluded that (paragraphs 21-23):

“…the Tribunal believes – on the facts of this case – that an important factor for any assessment in relation to the “fairness” of the disclosure of the personal data is best discovered from the context in which the personal data was provided to the Council in the first place.

22. The context, here, is to secure a commercial licence required by law to breed dogs. That license is necessary for the local authority to know who the licensed dog breeders in that area are, and so that the law can be enforced and welfare checks can be conducted as and when necessary in relation to the welfare of the dogs being bred commercially.

23. Licensing – in the ordinary course of things – is a public regulatory process. Indeed it was a public process in Carmarthenshire, in relation to the information that is at the core of this appeal, until the Council changed its policy in 2008.”

The Tribunal was unimpressed by the suggestive language of a survey of dog breeders which the council had carried out to support its case for non-disclosure. It also noted that a neighbouring council had disclosed such information.

The First-Tier Tribunal issued its decision in Dicker v IC (EA/2012/0250) today. It allowed the requester’s appeal and ordered disclosure of the salary of the chief executive of the NHS Surrey PCT over specified time periods, including total remuneration, expenses allowance, pension contributions and benefit details. As to legitimate interests in disclosure, the Tribunal said that (paragraph 13):

“In this case the arrangements (including secondment and recharge from another public authority at one stage) mean that the arrangements are not as transparent as might be wished and it is not entirely clear from the information published (as opposed to the assurances given) that the national pay guidance has been complied with. Mr Dicker asserted that the CEO was paid in excess of the national framework. The Tribunal was satisfied that there was a legitimate public interest in demonstrating that the national framework had been complied with and that the published information did not properly establish this”.

On the questions of distress and privacy infringements, the Tribunal took this view (paragraph 14):

“The CEO is a prominent public servant discharging heavy responsibilities who must expect to be scrutinised. Individuals in such circumstances are rational, efficient, hard-working and robust. They are fully entitled to a high degree of respect for their private lives. However the protection of personal information about their families and their health is a very different matter from having in the public domain information about income… The Tribunal simply cannot accept that anyone in such a role would feel the slightest distress, or consider that there has been any intrusion or that they would be prejudiced in any way by such information. From the perspective of the individual such information is essentially trivial; indeed, in other European societies, such information would be routinely available.”

If this approach were to become standard, the implications for public authorities would be significant.

Further, there are two very important personal data FOIA cases to look out for in the coming months. Following its decision in the Edem case late in 2012, the Upper Tribunal’s next consideration of personal data in the FOIA context is the appeal in Morley v IC & Surrey Heath Borough Council (EA/2011/0173), in which the First-Tier Tribunal – in a majority decision in which Facebook disclosures played a significant part – ordered the disclosure of the names of certain youth councillors.

More importantly, the Supreme Court will hear an appeal from the Scottish Court of Session in July about a FOISA request for the number of individuals employed by a council at specific points in its pay structure. The council relied on the personal data exemption (contending that individuals could be identified from the requested information), but the Scottish Information Commissioner ordered disclosure and succeeded before Scotland’s highest court. The Supreme Court will consider issues including the approach to ‘legitimate interests’ under condition 6(1) of schedule 2 to the DPA (the condition most often relied upon in support of disclosing personal data to the public). The case is likely to have far-reaching implications. For more detail, see Alistair Sloan’s blog.

Panopticon will, as ever, keep its eye on these and other related developments.

Robin Hopkins

Privacy and data protection developments in 2013: Google, Facebook, Leveson and more

Data protection law was designed to be a fundamental and concrete dimension of the individual’s right to privacy, the primary safeguard against misuse of personal information. Given those ambitions, it is surprisingly rarely litigated in the UK. It also attracts criticism as imposing burdensome bureaucracy but delivering little in the way of tangible protection in a digital age. Arguably then, data protection law has tended to punch below its weight. There are a number of reasons for this.

One is that Directive 95/46/EC, the bedrock of data protection laws in the European Union, is the product of a largely pre-digital world; its drafters can scarcely have imagined the ubiquity of Google, Twitter, Facebook and the like.

Another is that in the UK, the evolution of Article 8 ECHR and common law privacy and breach of confidence actions has tended to deprive the Data Protection Act 1998 of the oxygen of litigation – before the House of Lords in Campbell v MGN [2004] UKHL 22, for example, it was agreed that the DPA cause of action “added nothing” to the supermodel’s breach of confidence claim (para. 130).

A further factor is that the DPA 1998 has historically lacked teeth: a court’s discretion to enforce subject access rights under s. 7(9) is “general and untrammelled” (Durant v FSA [2003] EWCA Civ 1746 at para. 74); damages under s. 13 can only be awarded if financial loss has been incurred, and the Information Commissioner has, until recently, lacked robust enforcement powers.

This landscape is, however, undergoing very significant changes which (one hopes) will improve data protection’s fitness for purpose and amplify its contribution to privacy law. Here is an overview of some of the more notable developments so far in 2013.

The draft Data Protection Regulation

The most fundamental feature of this landscape is of course EU law. The draft DP Regulation, paired with a draft Directive tailored to the crime and security contexts, was leaked in December 2011 and published in January 2012 (see Panopticon’s analysis here). The draft Regulation, unlike its predecessor, would be directly effective and therefore not dependent on implementation through member states’ domestic legislation. Its overarching aim is harmonisation of data protection standards across the EU: it includes a mechanism for achieving consistency, and a ‘one-stop shop’ regulatory approach (i.e. multinationals are answerable only to their ‘home’ data protection authority). It also tweaks the law on international data transfers, proposes that most organisations have designated data protection officers, offers individuals a ‘right to be forgotten’ and proposes eye-watering monetary penalties for data protection breaches.

Negotiations on that draft Regulation are in full swing: the European Parliament and the Council of the European Union’s DAPIX (Data Protection and Information Exchange) subgroup are working on their recommendations separately before coming together to approve the final text (for more detail on the process, see the ICO’s outline here).

What changes, if any, should be made to the draft before it is finalised? That rather depends on who you ask.

In January 2013, the UK government set out its views on the draft Regulation. It did so in the form of its response to the recommendations of the Justice Select Committee following the latter’s examination of the draft Regulation. This is effectively the government’s current negotiation stance at the EU table. It opposes direct effect (i.e. it wants a directive rather than a regulation), thinks the ‘right to be forgotten’ as drafted is misconceived, favours charging for subject access requests and opposes the mandatory data protection officer requirement. The government considers that promoters of the draft have substantially overestimated the savings which the draft would deliver to business. The government also “believes that the supervisory authorities should have more discretion in the imposition of fines and that the proposed removal of discretion, combined with the higher levels of fines, could create an overly risk-averse environment for data controllers”. For more on its stance, see here.

The ICO also has significant concerns. It opposes the two-stream approach (a mainstream Regulation and a crime-focused Directive) and seeks clarity on pseudonymised data and non-obvious identifiers such as logs of IP addresses. It thinks the EU needs to be realistic about a ‘right to be forgotten’ and about its power over non-EU data controllers. It considers the current proposal to be “too prescriptive in terms of its administrative detail” and unduly burdensome for small and medium-sized enterprises in particular.

Interestingly, while the ICO favours consistency in terms of sanctions, it cautions against total harmonisation on all fronts: “Different Member States have different legal traditions. What is allowed by law is not spelled out in the UK in the way that it is in some other countries’ legal systems. The proposed legislation needs to reflect this, particularly in relation to the concept of ‘legitimate interests’.” For more on the ICO’s current thinking, see here.

Those then are the most influential UK perspectives. At an EU level, the European Parliament’s report on the draft Regulation is more wholeheartedly supportive. The European Parliament’s Industry Committee is somewhat more business-friendly in its focus, emphasising the importance of EU-wide consistency and a ‘one-stop shop’. Its message is clear: business needs certainty on data protection requirements. It also urges further exemptions from data protection duties for small and medium-sized enterprises “which are the backbone of Europe’s economy”. The Industry Committee’s views are available here.

Negotiations continue, the aim being to finalise the text by mid-2013. The European Parliament is likely to press for the final text to resemble the draft very closely. On the other hand, Ireland holds the Presidency of the Council of the European Union – and thus chairs DAPIX – until mid-2013. Its perspective is probably closer to the UK ICO’s in tenor. There are good prospects of at least some of these views being reflected in the final draft.

A number of the themes of the draft Regulation and the current negotiations are already surfacing in litigation, as explained below.

The Leveson Report

Data protection legislation in the UK will be affected not only by EU developments but by domestic ones too.

In recent weeks, debate about Leveson LJ’s report on the culture, practices and ethics of the press has tended to focus on the Defamation Bill which is currently scraping its way through Parliament. In particular, the debate concerns the merits of an apparently-Leveson inspired amendment tabled by Lord Puttnam which, some argue, threatens to derail this legislative overhaul of libel law in the UK (for one angle on this issue, see David Allen Green’s piece in the New Statesman here).

The Leveson Report also included a number of recommendations for changes to the DPA 1998 (see Panopticon’s posts here and here). These included overhauling and expanding the reach of the ICO, and allowing courts to award damages even where no financial loss has been suffered (arguably a fitting change to a regime concerned at heart with personal privacy).

The thorniest of Leveson LJ’s DPA recommendations, however, concerned the wide-ranging ‘journalism exemption’ provided by s. 32. The ICO has begun work on a code of practice on the scope and meaning of this exemption. It has conducted a ‘framework consultation’, i.e. one seeking views on the questions to be addressed by the code, rather than the answers at this stage (further consultation will happen once a code has been drafted).

There is potential for this code to exert great influence: s. 32(3) says that in considering whether “the belief of a data controller that publication would be in the public interest was or is a reasonable one, regard may be had to his compliance with” any relevant code of practice – if it has been designated by order of the Secretary of State for this purpose. There is as yet no indication of an appetite for such designation, but it is hoped that, the wiser the code, the stronger the impetus to designate it.

The ICO’s framework consultation closes on 15 March. Watch out for (and respond to) the full consultation in due course.

Google – confidentiality, informed consent and data-sharing

Google (the closest current thing to a real ‘panopticon’?) has been the subject of a flurry of important recent developments.

First, certain EU data protection bodies intend to take “repressive action” against some of Google’s personal data practices. These bodies include the French authority, CNIL (the Commission nationale de l’informatique et des libertés) and the Article 29 Working Party (an advisory body made of data protection representatives from member states). In October 2012, following an investigation led by CNIL, the Working Party raised what it saw as deficiencies in Google’s confidentiality rules. It recommended, for example, that Google provide users with clearer information on issues such as how personal data is shared across Google’s services, and on Google’s retention periods for personal data. Google was asked to respond within four months. CNIL has reported in recent weeks that Google did not respond. The next step is for the Working Party “to set up a working group, led by the CNIL, in order to coordinate their repressive action which should take place before summer”. It is not clear what type of “repressive action” is envisaged.

Google and the ‘right to be forgotten’

Second, Google is currently involved in litigation against the Spanish data protection authority in the Court of Justice of the EU. The case arises out of complaints made to that authority by a number of Spanish citizens whose names, when Googled, generated results linking them to false, inaccurate or out-of-date information (contrary to the data protection principles) – for example an old story mentioning a surgeon’s being charged with criminal negligence, without mentioning that he had been acquitted. The Spanish authority ordered Google to remove the offending entries. Google challenged this order, arguing that it was for the authors or publishers of those websites to remedy such matters. The case was referred to the CJEU by the Spanish courts. The questions referred are available here.

The CJEU considered the case at the end of February, with judgment expected in mid-2013. The case is obviously of enormous relevance to Google’s business model (at least as regards the EU). Also, while much has been made about the ‘right to be forgotten’ codified in the draft EU Regulation (see above), this Google case is effectively about whether that right exists under the current law. For a Google perspective on these issues, see this blog post.

Another development closer to home touches on similar issues. The Court of Appeal gave judgment last month in Tamiz v Google [2013] EWCA Civ 68. Mr Tamiz complained to Google about comments on the ‘London Muslim’ blog (hosted by Google) which he contended were defamatory in nature. He asked Google to remove that blog. He also sought permission to serve proceedings on Google in California for defamation occurring between his request to Google and the taking down of the offending blog. Agreeing with Google, the Court of Appeal declined jurisdiction and permission to serve on Google in California.

Mr Tamiz’s case failed on the facts: given the small number of people who would have viewed this blog post in the relevant period, the extra-territorial proceedings ‘would not be worth the candle’.

The important points for present purposes, however, are these: the Court of Appeal held that there was an arguable case that Google was the ‘publisher’ of those statements for defamation purposes, and that it would not have an unassailable defence under s. 1 of the Defamation Act 1996. Google provided the blogging platform subject to conditions and had the power to block or remove content published in breach of those conditions. Following Mr Tamiz’s complaint, Google knew or ought to have known that it was causing or contributing to the ongoing publication of the offending material.

A ‘publisher’ for defamation purposes is not co-extensive with a ‘data controller’ for DPA purposes. Nonetheless, these issues in Tamiz resonate with those in the Google Spain case, and not just because of their ‘right to be forgotten’ subtext. Both cases raise this question: is it right to hold Google to account for its role in making false, inaccurate or misleading personal information available to members of the public? If it is, another question might also arise in due course: to what extent would Leveson-inspired amendments to the s. 32 DPA 1998 exemption (on which the ICO is consulting) affect service providers like Google?

Facebook, Google and jurisdiction

The Google Spain case also involves an important jurisdictional argument. Google’s headquarters are in California. It argued before the CJEU that Google Spain only sells advertising to the parent company, and that these complaints should therefore be considered under US data protection legislation. In other words, it argues, this is not a matter for EU data protection law at all. The Spanish authority argues that Google Spain’s ‘centre of gravity’ is in Spain: it links to Spanish websites, has a Spanish domain name and processes personal data about Spanish citizens and residents.

Victory for Google on this point would significantly curtail the data protection rights of EU citizens in this context.

Also on jurisdictional matters, Facebook has won an important recent victory in Germany. Schleswig-Holstein’s Data Protection Commissioner had ruled that Facebook’s ‘real names policy’ (i.e. its policy against accounts in pseudonymous names only) was unfair and unlawful. The German administrative court granted Facebook’s application for the suspension of that order on the grounds that the issue should instead be considered by the Irish Data Protection Authority, since Facebook is Dublin-based.

Here then, is an example of ‘one-stop shop’ arguments surfacing under current EU law. The ‘one-stop shop’ principle is clearly very important to businesses. In the Facebook case, it would no doubt say that its ‘home’ regulator understands its business much better and is therefore best equipped to assess the lawfulness of its practices. The future of EU law, however, is as much about consistency across member states as about offering a ‘one-stop shop’. The tension between ‘home ground advantage’ and EU-wide consistency is one of the more interesting practical issues in the current data protection debate.

Enforcement and penalties issued by the ICO

One of the most striking developments in UK data protection law in recent years has been the ICO’s use of its enforcement and (relatively new) monetary penalty powers.

On the enforcement front, the Tribunal has upheld the ICO’s groundbreaking notice issued against Southampton City Council for imposing audio recording requirements in taxis (see Panopticon’s post here).

The issuing of monetary penalties has continued apace, with the ICO having issued in the region of 30 notices in the last two years. In 2013, two have been issued.

One (£150,000) was imposed on the Nursing and Midwifery Council, for losing three unencrypted DVDs relating to a nurse’s misconduct hearing, which included evidence from two vulnerable children. The second (£250,000) was imposed on a private sector firm, Sony Computer Entertainment Europe Limited, following the hacking of Sony’s PlayStation Network Platform in April 2011, which the ICO considered “compromis[ed] the personal information of millions of customers, including their names, addresses, email addresses, dates of birth and account passwords. Customers’ payment card details were also at risk.”

In the only decision of its kind to date, the First-Tier Tribunal upheld a monetary penalty notice issued against Central London Community Healthcare NHS Trust for faxing patient details to the wrong number (see Panopticon’s post here). The First-Tier Tribunal refused the Trust permission to appeal against that decision.

Other penalty notices are being appealed to the Tribunal – these include the Scottish Borders notice (which the Tribunal will consider next week) and the Tetrus Telecoms notice, the first to be issued under the Privacy and Electronic Communications Regulations 2003.

It is only a matter of time before the Upper Tribunal or a higher court considers a monetary penalty notice case. At present, however, there is no binding case law. To that extent, the monetary penalty system is a somewhat uncertain business.

The question of EU-wide consistency raises more fundamental uncertainty, especially when one considers the mandatory fining regime proposed in the draft EU Regulation, with fines of up to €1,000,000 or 2% of the data controller’s global annual turnover.

By way of contrast, 13 administrative sanctions for data protection breaches were issued in France in 2012, the highest fine being €20,000. Enforcement in Germany happens at a regional level, with Schleswig-Holstein regarded as on the stricter end; overall however, few fines are issued in Germany. How the ‘one-stop shop’ principle, the consistency mechanism and the proposed new fining regime will be reconciled is at present anyone’s guess.

From a UK perspective, however, the only point of certainty as regards monetary penalty notices is that there will be no slowing down in the ICO’s consideration of such cases in the short- to medium-term.

It is of course too early to say whether the developments outlined above will elevate data protection law from a supporting to a leading role in protecting privacy. It is clear, however, that – love them or hate them – data protection duties are increasingly relevant and demanding.

Robin Hopkins