Data protection developments: fines, group actions and right to be forgotten

September 12th, 2017

The GDPR is still eight months away from coming into force, but – as with any such sea-change – it is informing much of our data protection thinking already. In its recent judgment in the Barbulescu case about monitoring employee communications, for example, the European Court of Human Rights cited provisions of the GDPR. Here are some substantive recent developments illustrating the direction of travel in contentious data protection.

Monetary penalties

Facebook was fined €1.2 million by Spain’s data protection authority yesterday over its practices in using personal data for targeted advertising purposes. This follows a similar action (€150,000) by the French authority earlier this year. I haven’t seen much by way of detailed explanation in English yet, but in her article in TechCrunch, Natasha Lomas says this:

“The regulator found Facebook collects data on ideology, sex, religious beliefs, personal tastes and navigation — either directly, through users’ use of its services or from third party pages — without, in its judgement, “clearly informing the user about the use and purpose”.”

The case is a good illustration of the importance of privacy policies being transparent about purposes, especially when it comes to sensitive information and profiling. Another strand involves retention and deletion: Facebook allegedly retained browsing information impermissibly, even when faced with deletion requests.

It is also a useful illustration of how a regulator might quantify data protection fines: here, the Spanish DPA found two ‘serious’ contraventions at €300,000 each and one ‘very serious’ contravention at €600,000, making up the €1.2 million total.

Group litigation

There are many scaremongering headlines suggesting that any data breach will automatically result in a seven-figure regulatory fine. In reality, most cases probably won’t result in Facebook-level fines, at least not in every EU member state. But should data controllers be equally worried about private law actions? In particular, will group litigation on the back of data breaches gather momentum?

In the UK, the first major group claim (against Morrisons) goes to trial in October. An interesting recent development on the US side has emerged in the wake of the Equifax breach: a chatbot has been launched, offering affected individuals an automated route to seeking small-claims compensation (up to $25,000, depending on the state) from Equifax.

This is not a new litigation route per se. It does, however, illustrate an important trend. Historically, large-scale private actions have been rare in UK data protection, in part because individuals do not know what to do or how to do it: commencing a claim can be bewildering and burdensome. The chatbot development suggests that it will become progressively easier for individuals to seek compensation following data breaches.

The right to be forgotten

On Friday, Lord Justice Stephens handed down an un-anonymised version of his judgment in the Callum Townshend case, brought in Northern Ireland. Mr Townshend sought the Court’s permission to serve proceedings on Google Inc. over its refusal of his de-listing/right to be forgotten requests. He wanted Google to de-list certain URLs to articles that discussed, inter alia, his convictions for sexual offences, which are only due to become spent in 2023. He sought to bring claims for breach of confidence, misuse of private information and DPA contraventions.

His application to serve proceedings on Google Inc. was dismissed on the grounds that there were no serious issues to be tried. Here is the nub of the issue, in Lord Justice Stephens’ judgment:

“Usually there can be no expectation of privacy in such offences which are not spent. The reason why the sexual offences are not “spent” is that the statutory rehabilitation period has not expired and it has been extended as the plaintiff is a notorious recidivist. In relation to the question as to why this case should be taken out of the usual it was submitted that the sexual offences involved particularly intrusive personal material relating to the plaintiff’s sexual orientation with a disproportionate effect on the plaintiff. The fact that an offence reveals a particular sexual orientation on the part of an offender, whatever that orientation may be, is not a most compelling circumstance so as to require open justice to be outweighed by the protection of the offender. The public interest in disclosure is also demonstrated by the fact that the plaintiff has sought to associate his name with a children’s charity. That charity and others like it should have access at the very least to information about his unspent sexual convictions. Also the public interest in disclosure is demonstrated by the fact that the plaintiff’s recidivism has been raised as an issue of public concern in the Assembly. I do not consider that there is any disproportionate effect on the plaintiff. There is no arguable case as to an expectation of privacy in relation to the convictions which are not “spent”.”

As regards the DPA, there could be no serious issue about Google’s ability to rely on condition 6(1) from Schedule 2 and condition 5 from Schedule 3. That last point is interesting, and may raise a few eyebrows: the rationale was that “as a consequence of the open justice principle by committing an offence the offender is deliberately taking steps to make the information public”.

The outcome of the Townshend case is no surprise, but it will be interesting to see whether the GDPR will affect the prevailing general rule that the RTBF can apply to unspent convictions only in exceptional cases.

Robin Hopkins @hopkinsrobin
