Mean Ms Mustard (Or: covert recordings as admissible evidence)

October 15th, 2019 by Robin Hopkins

Ms Mustard was injured in a road traffic accident, for which she claims compensation. She was examined by medical experts appointed by the insurer. She deliberately made covert recordings of two of those consultations, and accidentally recorded a third. She wants to deploy those recordings as evidence in support of her claim. The insurer objected, arguing that the recordings constituted unlawful processing contrary to the GDPR and the DPA 2018.


(Thumb)nail in the coffin for the prohibition on monitoring?

October 3rd, 2019 by Julian Blake

Article 15(1) of the E-Commerce Directive (2000/31/EC) has long been a useful weapon in the armoury of social media platforms and search engines, because it prohibits the imposition of a “general monitoring obligation”. This, they argue, means that they can only be required to remove specific unlawful content identified by the complainant or the court, but no more. The problem with this is that unlawful content can easily be spread far and wide, leaving the complainant to play whac-a-mole, identifying every repetition and variation of that content.

In today’s judgment in Eva Glawischnig-Piesczek v Facebook Ireland Limited (Case C-18/18), the CJEU has given important guidance in relation to the removal of content which contains identical wording to the original unlawful content or which has “equivalent” content.



Google: forget the right to be forgotten – here come class actions

October 2nd, 2019 by Robin Hopkins

Google got a good result from the CJEU last week on the right to be forgotten front: in Google LLC v CNIL (Case C‑507/17), the French DP regulator’s rather ambitious demand for global delisting on right to be forgotten grounds was overturned. In a nutshell:

Google has acknowledged that the EU’s RTBF rights are undermined if an internet user can simply switch to a non-EU version of Google and see the offending search results. So it implemented geo-blocking measures, whereby an EU user is automatically routed to an EU version of Google (one that doesn’t deliver the offending references), regardless of whether they type in a non-EU Google domain name.

Not good enough, said the CNIL, slapping Google with a €100k fine: Google must de-reference the offending links from search results delivered through any Google domain in the world.


Facial recognition: a GDPR fine and some further regulation?

September 5th, 2019 by Robin Hopkins

Facial recognition is certainly a hot topic just now. I blogged yesterday about the judgment in Bridges, in which the Divisional Court dismissed challenges – principally on privacy and data protection grounds – to the use of automated facial recognition technology in a policing context. It would be a mistake, however, for data controllers to assume that the legal and regulatory environment is generally relaxed and permissive about facial recognition. Here are two interesting recent developments to bear in mind alongside the Bridges judgment.


Fashionably late

September 5th, 2019 by Robin Hopkins

With Panopticon having been prorogued for much of the summer, we didn’t get round to a timely blog post on the CJEU’s judgment from the end of July in Fashion ID GmbH & Co. KG v Verbraucherzentrale NRW eV (Case C-40/17). In case you were likewise lounging around Rees-Mogg style and failed to keep up with data protection judgments, here is a brief summary to help you disguise that from your boss or clients.


Police use of automated facial recognition: a justified privacy intrusion

September 4th, 2019 by Robin Hopkins

The opening sentence of today’s judgment in R (Bridges) v Chief Constable of South Wales Police and Others [2019] EWHC 2341 (Admin) is right up Panopticon’s alley: “The algorithms of the law must keep pace with new and emerging technologies”. In precisely that spirit, the Divisional Court (Haddon-Cave LJ and Swift J), in dismissing the challenge to South Wales Police’s use of Automated Facial Recognition technology (“AFR”), offers very significant lessons in how to apply privacy and data protection law to beneficial but intrusive technology.