Policing Minister Chris Philp has said police forces should double the number of searches they make using retrospective facial recognition technology by May 2024.
He has published a letter to police chiefs setting out the importance of the technology and urging them to exceed 200,000 searches of still images against the Police National Database over the next few months.
Philp also encouraged wider use of the technology to capture live footage of crowds in places where offences could be committed, saying it could have a strong deterrent effect.
His letter notes a significant increase in the use of retrospective facial recognition since 2021, says he expects all police forces to “use it to its full potential”, and adds that he is reviewing progress regularly with National Police Chiefs’ Council leads.
It also expresses support for the use of live facial recognition, and says the Government has been working with industry to develop a capability roadmap for the technology.
Powerful tool
Philp said in a public statement: “AI technology is a powerful tool for good, with huge opportunities to advance policing and cut crime. We are committed to making sure police have the systems they need to solve and prevent crimes, bring offenders to justice, and protect the public.
“Facial recognition, including live facial recognition, has a sound legal basis that has been confirmed by the courts and has already enabled a large number of serious criminals to be caught, including for murder and sexual offences.
“This is not about acquiring new kit and deploying new tech for the sake of it; it is about staying one step ahead of criminals; delivering smarter, more effective policing and ultimately making our streets safer.”
The Home Office highlighted the use of facial recognition to identify a wanted sex offender at the King’s Coronation earlier this year, and at the recent Arsenal v Tottenham match, at which police caught three suspects, including one wanted for sexual offences.
It said that, to ensure transparency, the police will put up notices in areas where they will be using live facial recognition.
If the system does not make a match against a watch list, a person’s data is deleted immediately and automatically.
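For illustration, the match-or-delete behaviour described above can be sketched in a few lines of Python. This is a minimal, hypothetical sketch only: the names, threshold value and data structures are assumptions for the example and do not reflect the actual software used by police forces.

```python
# Hypothetical sketch of a live facial recognition match-or-delete loop.
# Nothing here is taken from a real police system; names and the threshold
# are illustrative assumptions.

from dataclasses import dataclass

THRESHOLD = 0.8  # assumed similarity threshold for raising an alert


@dataclass
class WatchlistEntry:
    name: str
    embedding: list[float]  # reference face embedding for a wanted person


def similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two face embeddings."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0


def process_frame(face_embeddings: list[list[float]],
                  watchlist: list[WatchlistEntry]) -> list[str]:
    """Compare each detected face against the watch list.

    Returns the names of any matches. Embeddings that do not match are
    never stored, mirroring the immediate, automatic deletion described
    in the article.
    """
    alerts = []
    for embedding in face_embeddings:
        best = max(watchlist,
                   key=lambda e: similarity(embedding, e.embedding),
                   default=None)
        if best and similarity(embedding, best.embedding) >= THRESHOLD:
            alerts.append(best.name)
        # No match: the embedding simply goes out of scope and is discarded.
    return alerts
```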
Controversial technology
Facial recognition has been a source of controversy for police forces in recent years. South Wales Police ran into a legal challenge over its deployment, and the Metropolitan Police has attracted criticism from civil liberties groups for its trial deployments.
In addition, the Scottish Police Authority recently published a digital strategy that includes a more cautious approach to the adoption of the technology.
But the Home Office has been a strong advocate for the technology, and last year the College of Policing published a code of practice for its use.
The Home Office has also highlighted an independent study by the National Physical Laboratory of the algorithm the Met and South Wales Police use. It found that the technology was 100% accurate when used on still images and produced a false alert rate of only one in 6,000 when used on live images. The police have had no false alerts over 25 deployments this year.
The study also found no statistically significant differences in performance by gender or ethnicity at the settings the police use.
The home secretary is also convening an event today that brings together government, law enforcement and the tech industry to discuss how best to tackle child sexual abuse images created using AI.