Key points from the Bridges facial recognition appeal

September 3rd, 2020

September: Panopticon is scraping itself off furlough and bounding back to school. Here are two information rights developments from August that are worth noting, both anchored in ECHR rights.

First, readers will recall the high-profile case of R (Bridges) v Chief Constable of South Wales Police and Others. Bridges concerned a challenge on (among other grounds) Article 8 ECHR and DP grounds to the police force’s use of automated facial recognition (AFR) as part of a pilot project aimed at spotting the faces of suspects on wanted lists among the crowds.

At first instance – see my post here – the challenge failed. The Divisional Court found that, while Mr Bridges’ Article 8 rights were interfered with by his being subjected to AFR scanning, that interference was justified under Article 8(2) ECHR: the police’s actions were ‘in accordance with the law’ and they were proportionate. The DP challenge also failed: the police force had complied with the provisions of the DPA 2018 implementing the Law Enforcement Directive, and its documentation (DPIA; appropriate policy document) just about passed muster.

The Court of Appeal ([2020] EWCA Civ 1058) allowed Mr Bridges’ appeal, but to a limited extent. It certainly did not conclude that the use of AFR by police forces is unlawful on privacy or DP grounds. Instead, it concluded that the police force’s historic use of AFR over the period complained of by Mr Bridges was unlawful. Here are the key points:

The police’s use of AFR had not been ‘in accordance with the law’ (as required to justify a privacy interference under Article 8(2) ECHR). A ‘relativist approach’ was required: the more intrusive the act complained of, the more precise and specific must be the law said to justify it. This use of AFR was not as intrusive as the retention of fingerprint and DNA samples, but it was more intrusive than observation and the taking of photographs, as AFR involves novel technology, automated processing and large numbers of affected individuals. So there needs to be a sufficiently specific legal framework governing its use. The DPA 2018, the Surveillance Camera Code of Practice and SWP’s local policies were valid and important parts of that governance framework, but they were too broad and vague as to where AFR could be used, who would be put on watch lists, and how the police could exercise their discretion. If guidance documents are tightened up, there is no barrier in principle to the use of AFR by police forces for such purposes.

The Court of Appeal agreed with the Divisional Court that the police’s use of AFR had been proportionate. A key point here was transparency: the police “did all that could reasonably be done to bring to the public’s attention that AFR Locate was being deployed at a particular place at a particular time”. Further, the impact on Mr Bridges was very limited, and the overall impact was not ramped up by aggregating the limited impact on lots of other people: “an impact that has very little weight cannot become weightier simply because other people were also affected. It is not a question of simple multiplication. The balancing exercise which the principle of proportionality requires is not a mathematical one; it is an exercise which calls for judgement” (para 143).

On the DP front, the Court of Appeal agreed with Mr Bridges that the police’s DP impact assessment was deficient – but only in that it failed to recognise the ‘in accordance with the law’ problem discussed above. Otherwise, it passed muster. We get some useful insight here into what a robust DPIA for intrusive activities might contain (para 151):

“The DPIA specifically acknowledged that AFR might be perceived as being privacy intrusive in the use of biometrics and facial recognition and that Article 8 of the Convention was relevant. It sought to explain that AFR would only avoid being in breach of Article 8 if it was necessary, proportionate, in pursuit of a legitimate aim and in accordance with the law but that all those requirements would be satisfied if AFR Locate was used in the manner set out. The DPIA explained how AFR Locate operates. It is obvious from that explanation that large numbers of the public would be caught through CCTV cameras used in the deployment. It specifically stated that: “It is the intention during each deployment to allow the AFR application to enrol and therefore process as many individuals as possible”. That the public at large was potentially affected was reflected in the statement that: “in order to ensure that the public are engaged in the use of the technology every opportunity has been taken to demonstrate its use, to include during Automated Facial Recognition deployments”.”

The appeal also succeeded as regards the public sector equality duty (PSED): “the reason why the PSED is so important is that it requires a public authority to give thought to the potential impact of a new policy which may appear to it to be neutral but which may turn out in fact to have a disproportionate impact on certain sections of the population”.

So: a degree of vindication for both sides, no in-principle obstacles to police using AFR, and some useful insights into assessing the proportionality of privacy-intrusive processing.

11KBW’s Andrew Sharland QC and Stephen Kosmin appeared for the Surveillance Camera Commissioner, intervening.

Robin Hopkins
