London is buying heaps of facial recognition tech

The Metropolitan Police is buying a new facial recognition system that will supercharge its surveillance technology capabilities

The UK’s biggest police force is set to significantly expand its facial recognition capabilities before the end of this year. New technology will enable London’s Metropolitan Police to process historic images from CCTV feeds, social media and other sources in a bid to track down suspects. But critics warn the technology has “eye-watering possibilities for abuse” and may entrench discriminatory policing.

In a little-publicised decision made at the end of August, the Mayor of London’s office approved a proposal allowing the Met to boost its surveillance technology. The proposal says that in the coming months the Met will start using Retrospective Facial Recognition (RFR) as part of a £3 million, four-year deal with Japanese tech firm NEC Corporation. The system examines images of people obtained by the police, then compares them against the force’s internal image database to try to find a match.

“Those deploying it can in effect turn back the clock to see who you are, where you've been, what you have done and with whom, over many months or even years,” says Ella Jakubowska, policy advisor at European Digital Rights, an advocacy group. Jakubowska says the technology can “suppress people's free expression, assembly and ability to live without fear”.

The purchase of the system is one of the first times the Met’s use of RFR has been publicly acknowledged. Previous versions of its facial recognition web page on the Wayback Machine show that references to RFR were added at some stage between November 27, 2020, and February 22, 2021. The technology is currently used by six police forces in England and Wales, according to a report published in March. “The purchase of a modern, high-performing facial recognition search capability reflects an upgrade to capabilities long used by the Met as well as a number of other police forces,” a spokesperson for the Met says.

Critics argue that the use of RFR encroaches on people’s privacy, is unreliable and could exacerbate racial discrimination. “In the US, we have seen people being wrongly jailed thanks to RFR,” says Silkie Carlo, director of civil liberties group Big Brother Watch. “A wider public conversation and strict safeguards are vital before even contemplating an extreme technology like this, but the Mayor of London has continued to support expensive, pointless and rights-abusive police technologies.”

A spokesperson for the Mayor of London defended the use of the technology, saying it will shorten the time it takes to identify suspects and help reduce crime in the capital. “Whilst this is clearly an important policing tool, it’s equally important that the Met Police are proportionate and transparent in the way it is used to retain the trust of all Londoners,” the spokesperson says. The London Policing Ethics Panel, an independent scrutiny group set up by the Mayor’s office, has been tasked with reviewing and advising the Met on its use of RFR, although this process did not happen before the purchase of the technology was approved. The Ethics Panel did not respond to a request for comment.

Political support for the use of facial recognition remains contested in the UK, with MPs from Labour, the Liberal Democrats and the Green Party all calling for regulations on the use of the technology. “I’m disappointed to see this latest development in the Met’s use of Retrospective Facial Recognition software,” says Sarah Olney, Liberal Democrat MP for Richmond Park. “It comes despite the widespread concerns as to its accuracy, along with its clear implications on human rights. Better policing ought to start from a foundation of community trust. It’s difficult to see how RFR achieves this.”

The expansion of the Met’s facial recognition technology, which also includes Live Facial Recognition (LFR) systems used in public places, comes at a time when the legality of such systems remains in question and serious concerns are being raised about their deployment. Lawmakers around the world are considering how to regulate facial recognition systems and multiple cities have banned the use of the technology.

The UK’s data regulator, the Information Commissioner’s Office, has not published official guidance on the use of RFR. “Police forces wishing to use RFR technology must comply with data protection law before, during and after its use,” an ICO spokesperson says, adding that organisations must put in place robust policies and complete a Data Protection Impact Assessment (DPIA) prior to processing people’s data. “These are crucial steps to take so public trust is not lost,” the spokesperson says.

The Met’s approved proposal says the technology will “ensure a privacy by design approach”. However, when it was approved by the Deputy Mayor for Policing and Crime last month, a DPIA had not yet been completed. A spokesperson for the Met says it was first necessary to appoint a supplier and that the document will be completed and published before any data processing begins. The spokesperson also says “the use of any image will be subject [to] a carefully implemented framework which reflects and responds to the expectations of privacy that attach to any image used”. Details of the framework were not provided.

Publicly available information on the way police forces deploy RFR is sparse. A briefing note published by South Wales Police last year shows that it used its RFR system to process 8,501 images between 2017 and 2019 and identified 1,921 potential offenders in the process. The technology can also be used to identify missing or deceased people. Despite being used by various police forces in the UK, the technology has largely evaded the intense public and legal scrutiny that has accompanied the use of LFR technology. LFR scans the faces of people walking past a camera and compares them against a watchlist in real-time.

In July 2019, the House of Commons Science and Technology Committee recommended that LFR should not be used until concerns regarding the technology’s bias and efficacy are resolved. In August 2020, the UK’s Court of Appeal found that South Wales Police’s use of LFR was unlawful and this month, the United Nations High Commissioner for Human Rights called for a moratorium on the use of LFR. But the Met says it will continue to use its LFR technology, alongside the new RFR system, when it deems it appropriate. “Each police force is responsible for their own use of Live Facial Recognition (LFR) technologies,” a Met spokesperson says. “The Met is committed to give prior notification for its overt use of LFR to locate people on a watchlist. We will continue to do this where a policing purpose to deploy [it] justifies the use of LFR.”

The Met’s proposal states that “the RFR use case is very different to LFR and seeks to help officers identify persons from media of events that have already happened”. But experts warn that differentiating the two types of technology is not straightforward. “Both have the potential to be massively invasive tools, capable of creating a record of individuals’ movements across our cities,” says Daragh Murray, a senior lecturer at the University of Essex Human Rights Centre and School of Law who previously reviewed the Met’s facial recognition systems. “Depending on how it’s deployed, RFR can, in fact, be strikingly similar to LFR.” The RFR product set to be bought by the Met is also made by NEC, the same company that makes its LFR system. A spokesperson for the company said they were unable to comment on specific customer projects.

As well as LFR and RFR, the Met also has access to the Police National Database (PND) facial search facility. The PND is a centralised database of custody images uploaded by police forces around the country and managed by the Home Office. Since 2014, the database has had a facial search function. A spokesperson from the Met confirms it often relies on images that are not on the PND, saying that it may have, for example, other images provided by the public as a result of an appeal. “Understanding who is in that imagery is important to solving crime,” the spokesperson adds. “Equally important can be establishing where a person has appeared across a number of images. This can help the Met link events together, develop investigative leads and use other image holdings to solve crime.”

Murray says debates about the use of facial recognition “should go beyond privacy” and also consider the legality and necessity of these tools. “We really need to think what the impact might be on democratic rights, like freedom of expression, association, assembly and religion,” he says. “There is no one size fits all approach to LFR or RFR. The circumstances of deployment will significantly affect the human rights impact. This should be set out in legislation, with appropriate safeguards.”



This article was originally published by WIRED UK