Police tweak 'biased' facial recognition software

Stuart Woodward, Essex
Two police officers stand beside a police van with a live facial recognition camera mounted on its roof (PA Media)

A police force has paused the use of live facial recognition (LFR) cameras after a study found it was statistically more likely to identify black people than other ethnic groups.

Essex Police has used the technology since summer 2024, but the study identified "a potential bias in the positive identification rate" of black people over white people on its watchlist.

The force said that following updates to its algorithm and software, it was confident that LFR cameras could be deployed again.

But campaign group Big Brother Watch said the technology was "authoritarian, inaccurate and ineffective in equal measure".

Essex Police said it commissioned two independent studies into its use of LFR – carried out by the University of Cambridge – with one of them indicating the potential bias.

In that study, 188 volunteers acted as members of the public in a controlled field experiment during a real police deployment, with the system correctly identifying about half the people on the watchlist who passed the cameras.

But the report said the system was more likely to correctly identify men than women, and "it was statistically significantly more likely to correctly identify black participants than participants from other ethnic groups".

The second study – which analysed more than 40 deployments of LFR technology between August 2024 and February 2025 – found it had scanned approximately 1.3 million faces in public spaces.

It said that officers intervened 123 times to speak to people and carried out 48 arrests – roughly one for every 27,000 faces scanned – and there was one confirmed mistaken intervention.

A police officer monitors a live facial recognition feed at a computer terminal (PA Media)
Essex Police scanned approximately 1.3 million faces in public spaces between August 2024 and February 2025

A spokesperson for Essex Police said it "decided to pause deployments while we worked with the algorithm software provider to review the results and seek to update the software".

They added: "We have revised our policies and procedures and are now confident that we can start deploying this important technology as part of policing operations to trace and arrest wanted criminals.

"We will continue to monitor all results to ensure there is no risk of bias against any one section of the community."

The BBC understands that Essex Police procured a different LFR system from those used by other forces, and that it will resume using the technology in the coming weeks.

The Home Office declined to comment.

The government announced in January it would increase the number of LFR vans across the country from 10 to 50, with the home secretary saying she made "no apology" for rolling the technology out to all police forces.

But Jake Hurfurt, from Big Brother Watch, said it had warned that the use of LFR could "put the rights of thousands of people at risk".

"LFR as a tool of general mass surveillance has no place in a democracy like Britain, but if police are going to use it the very least the public can expect is that it doesn't racially discriminate against people," he said.
