Bridges v CCSWP: A landmark case in the era of automated facial recognition

On 4 September 2019, the High Court of England and Wales handed down the world’s first judicial decision on the privacy implications of law enforcement’s use of automated facial recognition (AFR), in R (Bridges) v Chief Constable of South Wales Police. [1]

THE FACTS 

Edward Bridges, a civil liberties activist, brought a claim for judicial review in response to the trialled use of AFR by South Wales Police (SWP). Bridges alleged that SWP’s use of AFR on two occasions violated the right to respect for “private and family life” under Article 8 of the European Convention on Human Rights (ECHR), failed to comply with the provisions of the Data Protection Act 2018 (DPA), and risked indirect discrimination through ill-considered deployment, in breach of the Equality Act 2010.

Bridges’ claim rested on two occasions on which SWP deployed AFR: the first in Cardiff city centre, the second at a public event at the Motorpoint Arena. On each occasion, a single clearly marked van containing AFR equipment was deployed for a limited four-hour period, and notice of the deployment was published on social media. The constabulary stated that the purpose of the deployments was to locate and detain “persons on a watchlist”.

COURT DECISION

In deciding the ECHR Article 8 claim, the court held that the operation of AFR fell within SWP’s authority. This conclusion rested on the absence of an exhaustive definition of the right to privacy under the ECHR and on Article 8(2)’s qualification that interference is permitted where it is “in accordance with the law”. The court determined that SWP possessed “amply sufficient” powers at common law, and that there was a “clear and sufficient legal framework for governing whether, when and how AFR Locate may be used,” thus satisfying the “in accordance with the law” requirement.

On the second claim, that SWP’s use of AFR failed to comply with the Data Protection Act 2018, the court found that the use of AFR “was done in an open and transparent way, with significant public engagement [and] was undertaken for a limited time, covering a limited footprint,” thereby satisfying the first data protection principle. The court accordingly held that the processing of personal data was consistent with the DPA.

The final claim, that ill-considered deployment of AFR could breach the provisions of the Equality Act, was similarly dismissed on the ground that SWP’s Equality Impact Assessment “demonstrated that due regard was had by SWP” to the provisions of the Act.

The court accordingly dismissed the claim for judicial review, holding that the “current legal regime is adequate to ensure the appropriate and non-arbitrary use of AFR”. On 20 November 2019, the Court of Appeal agreed to hear the case and, at the time of writing, is deliberating.

The judgment of the High Court may come as a blow to privacy activists. The case is nonetheless the first of its kind, and is of significant political and legal importance in shaping the future use of automated facial recognition by police in the UK.

[1] R (Bridges) v Chief Constable of South Wales Police [2019] EWHC 2341 (Admin)

Finley is a Politics and International Relations Graduate from Queen Mary University of London and currently an MSc Candidate in International Planning at UCL. He has a keen interest in the intersection of urbanity, human rights and justice. 