The Gerrard litigation:  the death-knell for litigation surveillance?

The recent decision of the High Court (Richard Spearman QC, sitting as a Judge of the Queen’s Bench Division) in David Neil Gerrard and Elizabeth Ann Gerrard v Eurasian Natural Resources Corporation Limited and Diligence International LLC [2020] EWHC 3241 (QB) relates to one aspect of the complex litigation between Mr. Gerrard (currently a partner at Dechert LLP, a law firm) and ENRC (his former client).  The decision deals with various interlocutory applications in a claim that is itself ancillary to the main proceedings.  Nevertheless, even though it relates to a skirmish in a much more extensive battle, the decision is of considerable interest in its own right, in particular as to the use of covert surveillance in the context of litigation.

Mr. Gerrard was ENRC’s solicitor between December 2010 and March 2013, acting for ENRC in relation to an SFO investigation.  In 2017, ENRC brought proceedings against Mr. Gerrard in the Commercial Court alleging that Mr. Gerrard had acted negligently and in breach of fiduciary duty by seeking to extend the scope of the SFO’s investigation into ENRC, and by leaking information about ENRC to the media and the SFO.  In 2019, ENRC brought further proceedings in the Chancery Division against the Director of the SFO, for (among other matters) inducing Dechert LLP and/or Mr. Gerrard to breach their fiduciary duty to ENRC.

Coronavirus and Information Law

This week has brought unprecedented disruption to the legal system, and the whole economy.  The Panopticon team, and all of us at 11KBW, are working hard to ensure that we can continue to provide you with the level of service that you have come to expect.  Meanwhile, here are some initial responses to the Coronavirus pandemic from an information law perspective.

Of Tweeting and Transgender Rights

Over the years, Panopticon has discussed a number of cases about the powers of the police to record, retain, and disseminate information about individuals.  The judgment of Mr. Justice Julian Knowles in R (ota Harry Miller) v (1) The College of Policing, and (2) The Chief Constable of Humberside [2020] EWHC 225 (Admin) is a significant contribution to the law in this area.  In Panopticon terms the case is unusual, in that the issues are discussed by reference to the right to freedom of expression under Article 10 of the European Convention on Human Rights (“ECHR”), rather than by reference to Article 8 or data protection legislation.

An important part of the context for the case is the current political controversy regarding the status of transgender people, including proposals to reform the Gender Recognition Act 2004 so as to replace the current requirements for obtaining a Gender Recognition Certificate (GRC) with an approach that places greater emphasis on an individual’s self-identification of their gender.  Reforms along these lines were the subject of a Government consultation in 2018.  In this respect also, the case takes Panopticon into hitherto uncharted waters.

Data Breach, Group Actions, and the criminal insider: the Morrisons case

A spectre is haunting data controllers – the spectre of group liability for data breach.

In Vidal-Hall v Google [2015] EWCA Civ 311 the Court of Appeal held that damages claims under section 13 of the Data Protection Act 1998 (DPA) can be brought on the basis of distress alone, without monetary loss.  Since that decision there has been much speculation that a major data breach could lead to distress-based claims against the data controller by a large class of individuals.  Even if each individual claim were modest (in the hundreds or low thousands of pounds), the aggregate liability could be substantial.

Cases of this nature may give rise to important questions of public policy.  Often the data controller will themselves be the victim of malicious or criminal conduct, involving a hack by outsiders or a data leak by insiders. In such situations, should the data controller be required to compensate data subjects?  What if the very purpose of the hack or leak was to damage the data controller, so that by imposing civil liability on the controller the Courts would help further that purpose?

The recent decision of the High Court in Various Claimants v Wm Morrisons Supermarket PLC [2017] EWHC 3113 is the first significant case to grapple with these issues post-Vidal-Hall.  The case involves a group claim brought by some 5,500 Morrisons employees in connection with the criminal misuse of a significant quantity of payroll data by a rogue employee.  In a lengthy judgment handed down on 1st December 2017, Langstaff J found that Morrisons were not directly liable to the claimants in respect of the criminal misuse of the data, whether under the DPA or at common law, but that they were nevertheless vicariously liable.  The trial dealt only with liability: quantum remains to be determined.

11KBW’s Anya Proops QC and Rupert Paines acted for Morrisons.

Safe Harbour and the European regulators

On 6th October 2015 the CJEU declared the Commission’s Safe Harbour Decision invalid, in Case C-362/14 Schrems.  Since then, data protection specialists have discussed little else; and Panopticon has hosted comments by Chris Knight, Anya Proops, and Robin Hopkins.

How have EU data protection regulators responded to the judgment?

The ICO’s immediate response came in a statement from Deputy Commissioner David Smith.  This struck a careful and measured tone, emphasising that the Safe Harbour is not the only basis on which transfers to the US can be made, and referring to the ICO’s earlier guidance on the range of other mechanisms available for making overseas transfers.

On 16th October the Article 29 Working Party issued a statement taking a rather more combative line.  Here are the main points.

  1. The question of massive and indiscriminate surveillance (i.e. in the US) was a key element of the CJEU’s analysis. The Court’s judgment required that any adequacy assessment involve a broad analysis of the third country’s domestic laws and international commitments.
  2. The Working Party urgently called on Member States and European institutions to open discussions with the US authorities to find suitable solutions. The current negotiations around a new Safe Harbour could be part of the solution.
  3. Meanwhile the Working Party would continue its analysis of how the CJEU judgment affected other transfer tools. During this period Standard Contractual Clauses and Binding Corporate Rules could still be used.  If by the end of January 2016 no appropriate solution with the US had been found, the EU regulators would take “appropriate actions”.
  4. Transfers still taking place based on the Safe Harbour decision were unlawful.

There are a couple of key messages here.  One is that it seems doubtful that the Article 29 Working Party would regard an adequacy assessment by a data controller as being a proper basis for transfer to the US:  see point 1.  A second is that there is a hint that even standard clauses and BCRs might not be regarded as a safe basis for transfer (see point 3): the answer will depend on the outcome of the Working Party’s further analysis of the implications of Schrems.

The rise of the Ubermensch

In May 2012, Transport for London licensed Uber London Limited as an operator of private hire vehicles in London.

Uber is controversial.  It’s a good example of how new technology can disrupt existing business models in unexpected ways.  One controversy is addressed by Ouseley J in Transport for London v Uber London Limited and others [2015] EWHC 2918 (Admin):  whether the way in which the Uber fare is calculated infringes the criminal prohibition on the use of a taximeter in a London private hire vehicle. Answer – it doesn’t.

What does any of this have to do with Panopticon?  Our usual concerns, broadly speaking, are with access to public sector information, and with information privacy (including its interaction with freedom of expression).  But these fields are fundamentally shaped by developments in the technology that is used for collecting, sharing and using information.  A wider understanding of the legal issues to which those developments can give rise is valuable, even if it takes us a little outside the usual ambit of this blog.

So:  in London there are black cabs, and there are private hire vehicles (PHVs).  PHVs are subject to three-fold licensing:  the operator, the vehicle, and the driver must all be licensed.  One of the restrictions under which PHVs operate is that it is a criminal offence for the vehicle to be equipped with a taximeter: see section 11(1) of the Private Hire Vehicles (London) Act 1998.  A taximeter is defined by section 11(3) as “a device for calculating the fare to be charged in respect of any journey by reference to the distance travelled or time elapsed since the start of the journey (or a combination of both)”.

Uber operates in London as a licensed PHV operator (though the vehicles in its network include both PHVs and black cabs).  It uses technology that – as Ouseley J points out – was not envisaged when the relevant legislation was introduced in 1998.  “As was agreed, the changes brought about by the arrival of Google, the Smartphone equipped with accurate civilian use GPS, mobile internet access and in-car navigation systems, would not have been within the contemplation of Parliament in 1998.” (Google was in fact incorporated in 1998, and what it has to do with the case is obscure, but let that pass).

In order for the Uber system to operate, both the driver and the customer must have a smartphone, and must download the Uber Driver App and Customer App respectively.  The customer makes a booking using the Customer App.  The booking is transmitted to Uber’s servers in the US, and thence to the smartphone of the driver of the nearest vehicle in London – if that driver does not accept the booking, it is sent to the next nearest vehicle.  When the driver picks up the customer, the driver presses the “begin trip” icon on the Driver App.  At the end of the journey he presses “end trip”.  Signals are then sent from the driver’s smartphone to Uber’s servers in the US, providing them with GPS data and time details.  One of the servers (“Server 2”) obtains information from another server about the relevant fare structure, and then calculates the fare and transmits information to the Driver App and the Customer App about the amount charged.  The customer’s credit or debit card is charged for the journey.
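For readers who want to see the division of labour concretely, here is a minimal, purely illustrative sketch in Python of a server-side fare calculation of the kind described above.  All of the names and fare values are hypothetical (this is not Uber’s actual system or pricing): the point is simply that the smartphone only uploads GPS fixes and timestamps, while the fare itself is computed remotely by reference to distance travelled and time elapsed.

```python
# Illustrative sketch only: hypothetical names and fare values, not Uber's actual system.
# The smartphone's role is limited to uploading GPS fixes and timestamps ("begin trip" /
# "end trip"); the fare itself is calculated remotely, as "Server 2" does in the judgment.

from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt


@dataclass
class FareStructure:
    # Hypothetical fare values, standing in for the data "Server 2" obtains
    # from another server about the relevant fare structure.
    base: float = 2.50
    per_km: float = 1.25
    per_minute: float = 0.15


def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two GPS fixes."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))


def calculate_fare(gps_fixes, start_ts, end_ts, fares):
    """Remote (server-side) calculation from data uploaded by the smartphone.

    gps_fixes: list of (lat, lon) tuples recorded between "begin trip" and "end trip".
    start_ts, end_ts: Unix timestamps for "begin trip" and "end trip".
    """
    distance_km = sum(
        haversine_km(*gps_fixes[i], *gps_fixes[i + 1])
        for i in range(len(gps_fixes) - 1)
    )
    minutes = (end_ts - start_ts) / 60.0
    return round(fares.base + fares.per_km * distance_km + fares.per_minute * minutes, 2)


# The resulting figure would then be sent back to the Driver App and Customer App.
fare = calculate_fare(
    gps_fixes=[(51.5033, -0.1196), (51.5155, -0.0922)],  # two illustrative London fixes
    start_ts=1_600_000_000,
    end_ts=1_600_000_900,  # a 15-minute journey
    fares=FareStructure(),
)
print(f"Fare charged to the customer's card: £{fare:.2f}")
```

On this (hypothetical) model the device in the vehicle never runs the fare logic at all: it merely collects and uploads the inputs and displays the output, which is why the location of the calculation mattered so much in the proceedings discussed below.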

Does all this mean that the vehicle is equipped with a taximeter?

No, said Ouseley J, in proceedings brought by Transport for London seeking a declaration that PHVs in the Uber network are not equipped with a taximeter.

The argument before Ouseley J was that the driver’s smartphone, operating using the Driver App, was a taximeter.  But the fatal objection to this argument was that the fare was calculated by Server 2, not by the smartphone, and hence the calculation was done remotely and not in the vehicle itself.  To contravene section 11, it was not sufficient that the calculation was done using information uploaded from the smartphone, and that the calculation was then transmitted to and received on the smartphone.  Hence the smartphone was not a device falling within section 11(3).  Moreover, even if the smartphone was a relevant device, the vehicle was not equipped with it; it was the driver who was equipped, and so the prohibition in section 11(1) was not infringed in any event.

Ouseley J considered the case-law about the need to adopt an updating or “always speaking” construction of legislation, to take account of technological or scientific developments: see R (Quintavalle) v Secretary of State for Health [2003] UKHL 13, [2003] 2 AC 687.  This case law had no bearing, since section 11 was in general terms and entirely capable of being applied to modern technology; there was no need to adopt any updating construction of the section.

The Uber case is a useful reminder that controversies about the implications of developments such as big data, cloud computing, and mobile internet access, are not just about privacy and data protection.  Rather, the issues are pervasive and can be expected to affect every corner of the law (and of politics, the economy, and society).

The mobile data devices that we use are constantly interacting with other devices and information storage facilities, including servers.  For the purpose of our daily lives, usually all we are interested in is specific transactions (like booking and paying for a PHV): we do not need to think about the different stages of information processing that underpin the transaction.  But for regulatory purposes, breaking down a transaction into those stages, and understanding when and how each stage takes place, can be essential.  Uber drivers and customers don’t need to think about Server 2:  but if you want to know whether Uber breaks the law, Server 2 is crucial.