Police Use of Face Recognition Is Sweeping the UK

Face recognition technology has been controversial for years. Cops in the UK are drastically increasing their use of it.
People gathered outside with green squares overlaid on their faces. Photograph: martinwimmer/Getty Images

A Beyoncé gig, the coronation of King Charles, and the British Formula One Grand Prix all have one thing in common: Thousands of people at the events, which all took place earlier this year, had their faces scanned by police-operated face recognition tech.

Backed by the Conservative government, police forces across England and Wales are being told to rapidly expand their use of the highly controversial technology, which globally has led to false arrests, misidentifications, and derailed lives. Police have been told to double the number of face searches they run against databases by May 2024, a change that could open up 45 million passport photos to searches, and police are increasingly working with stores to try to identify shoplifters. Simultaneously, more regional police forces are testing real-time systems in public places.

The rapid expansion of face recognition comes at a time when trust in policing is at record lows, following a series of high-profile scandals. Civil liberties groups, experts, and some lawmakers have called for bans on the use of face recognition technology, particularly in public places, saying it infringes on people’s privacy and human rights, and isn’t a “proportionate” way to find people suspected of committing crimes.

“In the democratic world, we are an outlier at the moment,” says Madeleine Stone, a senior advocacy officer with Big Brother Watch, a privacy-focused group that has called for a ban on live face recognition and an “immediate stop” to its use, a call backed by 65 UK lawmakers. The EU, which the UK voted to leave in 2016, may ban the real-time use of face recognition systems, and one of its highest courts has called the technology “highly intrusive.” Various US states have banned police from using the technology.

Cops in England and Wales can hunt for potential criminals using two main kinds of face recognition. First, there are live face recognition (LFR) systems: These usually involve cameras mounted on police vans that scan people’s faces as they walk by and check them against a “watchlist” of wanted people. LFR is deployed at some big events and announced in advance by the police. Second, there’s retrospective face recognition (RFR), where images from CCTV, smartphones, and doorbell cameras are fed into a system that tries to identify the person by comparing them against millions of existing photos. Police use of both systems is increasing.
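At its core, a live face recognition match is a similarity comparison: the system converts each scanned face into a numerical “embedding” and raises an alert only if its similarity to a watchlist image clears a set threshold. The sketch below is a minimal illustration of that logic in Python; the function names, the use of cosine similarity, and the data structures are assumptions for illustration, not details of any police system.

```python
# Minimal, illustrative sketch of watchlist matching, assuming faces
# have already been converted to embedding vectors by some model.
# Nothing here reflects the internals of any real police deployment.
import numpy as np

MATCH_THRESHOLD = 0.64  # minimum similarity score needed to raise an alert


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, ranging from -1 to 1."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def check_against_watchlist(face: np.ndarray,
                            watchlist: dict[str, np.ndarray]) -> str | None:
    """Return the ID of the best watchlist match above the threshold,
    or None. In a live system, a None result would mean the scanned
    person's biometric data is discarded rather than stored."""
    best_id, best_score = None, MATCH_THRESHOLD
    for person_id, stored_embedding in watchlist.items():
        score = cosine_similarity(face, stored_embedding)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id
```

The retrospective (RFR) case runs essentially the same comparison, but against a database of millions of stored photos rather than a short event watchlist.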

Two police forces in England and Wales—London’s Metropolitan Police and South Wales Police—have embraced LFR, using the technology for multiple years. (Police in Scotland, where policing is overseen separately, don’t use live systems but are reportedly increasing their use of RFR.) So far this year, the Met and South Wales Police have used LFR on 22 separate occasions, according to statistics published on their websites.

Across 13 deployments, an estimated 247,764 people have passed the Met’s cameras this year, including at the King’s coronation and a soccer match between Arsenal and Tottenham. So far in 2023, the Met’s system has generated 18 alerts flagging a person as a possible match with someone on the predetermined watchlist. Twelve of those people have been arrested. South Wales Police has used LFR nine times this year, scanning an estimated 705,290 faces and arresting two people.

Researchers and academics have long shown face recognition technologies to be biased or less accurate for people of color, particularly Black people. In all of the LFR deployments by the Met and South Wales police forces this year, their figures show, there have been no false alerts or misidentifications. In short, every time the system made a match, it was correct, according to the departments’ data. The data published by the police forces show their LFR systems are set to a match threshold of either 0.6 or 0.64, the minimum similarity score between a scanned face and a watchlist image needed to trigger an alert. A study conducted by the National Physical Laboratory, a UK public sector research organization, recommended the 0.64 setting as a way to reduce biased and incorrect matches.

Pete Fussey, a professor at the University of Essex who previously audited the Met’s face recognition, says the report shows the algorithms may still be discriminatory and biased; setting a high threshold simply suppresses matches across the board rather than fixing the underlying bias. “If you desensitize the system, then you'll get fewer matches and fewer of those will be wrong,” Fussey says. Big Brother Watch’s Stone says it “brings into question the necessity and proportionality of the whole endeavor if you’re having dozens of police officers standing around the street looking at iPads just to make one arrest.” Stone argues “it’s not a good use of time.”
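A toy simulation makes Fussey’s trade-off concrete: as the threshold rises, both true and false alerts fall, and the shrinking pool of alerts contains a smaller share of errors. The score distributions below are invented purely for illustration and don’t reflect any real deployment’s performance.

```python
# Toy illustration of the threshold trade-off Fussey describes.
# All score distributions here are invented assumptions; they do
# not reflect any real police system's accuracy.
import numpy as np

rng = np.random.default_rng(0)
genuine = rng.normal(0.78, 0.07, 1_000)     # simulated scores for people actually on a watchlist
impostor = rng.normal(0.40, 0.08, 100_000)  # simulated scores for everyone else passing the camera

for threshold in (0.55, 0.60, 0.64):
    true_alerts = int((genuine > threshold).sum())
    false_alerts = int((impostor > threshold).sum())
    total = true_alerts + false_alerts
    print(f"threshold {threshold:.2f}: {total:>5} alerts, "
          f"{false_alerts:>5} false ({100 * false_alerts / max(total, 1):.1f}% of alerts)")
```

A higher threshold therefore lets a deployment report few or no false matches while also flagging fewer of the people it is nominally looking for.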

A spokesperson for the Home Office, the government department responsible for policing in England and Wales, says it is “committed” to giving police the technology they need to help solve crimes. “Technology such as facial recognition helps the police quickly and accurately identify those wanted for serious crimes, as well as missing or vulnerable people,” the spokesperson says, claiming it “frees up police time and resources.” They point to a blog post about police use of face recognition technologies.

“Live facial recognition isn't just deployed on its own. It's in support of wider policing operations,” says Detective Chief Inspector Jamie Townsend, the operational lead for the Met's face recognition. A policing operation may make 30 arrests in an area, with face recognition contributing two or three of them, he says.

Townsend says that people’s biometric data isn’t stored when they are scanned by LFR cameras, and he considers the system to be “precision policing” as it allows for the identification of specific individuals. The Home Office’s blog post claims LFR has led to the arrest of sex offenders and someone wanted for possessing a knife. South Wales Police had not responded to WIRED’s request for comment at the time of writing.

Two other UK police forces have tried LFR in recent months, raising concerns about how the technology will be used going forward and who ends up on the watchlists the systems scan against. More than 30 police forces have not used LFR, although the government has said it is “very supportive” of more forces using it.

Around 380,000 people had their faces scanned by Northamptonshire Police over three days at the British Grand Prix in July, according to public records requests. No arrests were made. A Freedom of Information request from Big Brother Watch revealed the force had placed 790 people on the watchlist for the event, even though only 234 of them were wanted for arrest. “It was largely individuals who were not wanted for any criminal reasons, which leads us to believe that these were protesters who were being put on watchlists,” Stone says. The previous year’s British Grand Prix was disrupted by protesters.

A statement from Northamptonshire Police says the watchlist “included a range of offences from organized crime to aggravated trespass, which had the potential of leading to the death or serious injury of the public, racing drivers and staff, especially if there was a repeat of the 2022 track incursion. We were not targeting those taking part in lawful, peaceful protests.”

Police forces are also being encouraged to ramp up their use of after-the-fact, retrospective face recognition. All police forces across the UK can run face searches against the Police National Database, which holds more than 16 million photos, including millions of images that should have been deleted years ago. In 2022, according to data from freedom of information requests, there were 85,158 face searches, up 330 percent on the previous year. Policing minister Chris Philp said in October that he wants the number of searches to double by May 2024.

Fussey, the University of Essex professor, says retrospective face recognition is often thought to be more “benign” than live face recognition, but he doesn’t believe that is the case. “The issues and harms and human rights implications are the same, whether it's live or retrospective,” he says, adding that there is a lot of ambiguity around how the technology is being used.

In August 2020, the UK’s Court of Appeal ruled that South Wales Police’s use of LFR was unlawful. Since then, police forces using the technology say they have changed their procedures in response to the court decision, and the Home Office spokesperson says there is a “comprehensive legal framework in the UK” that requires police to use the technology only when it is “necessary, proportionate, and fair.”

Many disagree. A wide-ranging review from the Ada Lovelace Institute, a nonprofit, says there is “legal uncertainty” about the use of LFR. Another report by University of Cambridge academics, from the Minderoo Centre for Technology and Democracy, concluded that three examined police deployments of face recognition “failed to meet the minimum ethical and legal standards.”

“It’s wholly legitimate for police to use technology to keep the public safe,” says Fussey, who recently completed a report on proposals to change the oversight of biometrics in the UK. “The question is about how lawful and necessary it is,” he says. “At the moment, we’re in a situation where the legal basis isn’t clear. There’s no external oversight of how it’s used, how it’s authorized, who’s on the watchlist.”