Looking at Facial Recognition Technology
Julie Elizabeth Boyd - Liverpool John Moores University
One of the areas of law I have been interested in is Human Rights and Civil Liberties. My recent paper, entitled ‘Facial Recognition Technology: Private Lives and Public Spaces,’ which was presented to the CUHK Machine Lawyering Conference 2021, considered the use of Facial Recognition Technology in the United Kingdom.
Security and law enforcement agencies face increasing demands to combat terrorism and crime. Amongst the methods in their armoury is the growing use of digital surveillance, which includes Facial Recognition Technology (FRT). FRT is a combination of applications that use a human face to verify or identify an individual. FRT is able to capture the facial biometrics of members of the public who pass within the range of surveillance cameras deployed in a specific area. FRT then compares this data to the facial biometrics of individuals who are on what are known as police “watchlists”. Simply put, it is an automated system that uses technology to scan faces in crowds.
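The comparison step described above can be sketched in a few lines of code. This is a minimal illustration only, not any deployed system's method: it assumes faces have already been converted into numeric "embedding" vectors by some upstream model, and all names, vectors and the similarity threshold below are hypothetical.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length numeric vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_against_watchlist(probe, watchlist, threshold=0.8):
    """Return the best-matching watchlist identity scoring above the
    threshold, or None if no stored face is similar enough."""
    best_name, best_score = None, threshold
    for name, reference in watchlist.items():
        score = cosine_similarity(probe, reference)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Toy 3-dimensional vectors (real face embeddings have hundreds of dimensions).
watchlist = {"person_a": [1.0, 0.0, 0.0], "person_b": [0.0, 1.0, 0.0]}
probe = [0.95, 0.05, 0.0]
print(match_against_watchlist(probe, watchlist))  # prints "person_a"
```

Even this toy makes the legally significant point visible: a "match" is a similarity score crossing a tunable threshold, not a certain identification, which is one reason accuracy and oversight concerns arise.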
FRT systems for policing can be used to detect and prevent crime. They can be used to identify wanted, suspected or potential criminals, to identify protestors, and to track the movement of persons in any given space where the technology is deployed. They can also be used to help find missing persons, identify and locate exploited children, track criminals, and support and accelerate investigations. However, there have been growing social and legal concerns due to the specific problems caused by its deployment and use.
Unrestricted use of FRT has been strongly criticised by civil liberties groups and the Equality and Human Rights Commission (EHRC), which argue that it is a disproportionate invasion of privacy, with further concerns around a lack of transparency and oversight. FRT involves widespread monitoring, collection and storage of sensitive personal data without any individualized reasonable suspicion of criminal wrongdoing, which may be considered a form of indiscriminate mass surveillance. As such, democratic values and fundamental rights may be undermined without legal safeguards.
In October 2019, a Police Report revealed that FRT had been in use at King’s Cross since 2016. The King’s Cross estate was one of the first landowners to admit that it had used FRT, deploying facial identification technology in its CCTV cameras, which prompted concerns over ethics and legality. Individuals were being tracked without their knowledge and hence without their consent. As such, there were obvious concerns about a lack of appropriate oversight, safeguards and procedures.
The Metropolitan Police issued an apology for failing to disclose earlier that the scheme had existed. In September 2019, King’s Cross made the decision to abandon any future plans to use FRT.
In April 2018, Greater Manchester Police began a six-month deployment of FRT in the large shopping centre complex known as the Trafford Centre, in what was considered the largest FRT pilot trial in the UK. The trial was halted by the Surveillance Camera Commissioner, the Government’s independent regulator, amid concerns that it had not been authorised by senior police officers, and the pilot was suspended in September 2018.
For the first time anywhere in the world, the use of automated facial recognition (AFR) was brought before the courts, in R (Bridges) v Chief Constable of South Wales Police and Secretary of State for the Home Department [2019] EWHC 2341 (Admin). The case was later heard in the Court of Appeal: R (Bridges) v Chief Constable of South Wales Police & Information Commissioner [2020] EWCA Civ 1058. The Court of Appeal ruled that the use of AFR by South Wales Police was unlawful, upholding three of the five grounds raised in the appeal.
The public do not have a right in law to appeal against a decision to install or use FRT. The only authority that may be referred to or relied upon is the Biometrics Commissioner, an office created by the Protection of Freedoms Act 2012, which has established a set of general ethical principles that should be applied to any FRT trial.
The public should be informed and consulted when an FRT trial is to be conducted, and the purpose and general approach to evaluation must be explained. The public should also be informed of the location where the FRT trial is in progress and provided with details of whom to contact for further information.
If a member of the public believes that FRT is being used for unlawful recording, they can challenge the police on the ground that their privacy is being compromised. However, if the police can show that they have a legitimate aim and meet the oversight and regulation framework (the Surveillance Camera Commissioner, the Biometrics Commissioner and the Information Commissioner’s Office), then their use of FRT in that instance is likely to be legally justified.
FRT is being utilized in various other public spaces in countries such as the United States of America and Australia, as well as throughout the UK. These places include factories, cafes, airports, shopping areas, government buildings and even schools. As pointed out, FRT is initially seen as a technology for identifying criminals or missing people. Yet its increasing use in other areas of public life may need to be more transparent, more closely scrutinized and more often challenged.
It is clear that as technology advances, the law will need to keep pace to ensure that the use of such technology does not breach human rights and civil liberties, and that if it does, it is answerable to the rule of law.