London’s police chief has defended the use of facial recognition technology, labelling critics “ill-informed”.
Dame Cressida Dick said eight criminals had been caught using the controversial live facial recognition cameras.
She said “inaccurate” critics should “justify to the victims of those crimes why police should not be allowed to use tech… to catch criminals”.
Privacy campaigners say the systems flag up innocent people as wanted suspects.
The Metropolitan Police Commissioner was responding to a report calling for tighter rules on police use of technology.
The report, from the Royal United Services Institute, looked at the use of data and algorithms by police in England and Wales. Among its recommendations were that police should issue new national guidelines in this area.
But Dame Cressida used her speech at the report’s launch to issue a strong defence of the use of data analytics by her officers – including the controversial deployment of facial recognition cameras.
The roaming cameras, set up in areas of London for hours at a time, scan people’s faces and compare them to a list of wanted suspects. But an independent review showed that most matches are false alarms – only 19% were accurate.
“If an algorithm can help identify, in our criminal intelligence systems material, a potential serial rapist or killer… then I think almost all citizens would want us to use it,” she said.
“The only people who benefit from us not using [it] lawfully and proportionately are the criminals, the rapists, the terrorists and all those who want to harm you, your family and friends.”
The Met says its tests show cameras can identify 70% of suspects who walk past them.
But privacy campaigner Big Brother Watch says it is “a highly controversial mass surveillance tool with an unprecedented failure rate [of] 93%”.
Even a technically brilliant test will mainly produce false alarms when the thing it is looking for is rare – like being on a police watchlist – because it has far more opportunities to generate false alerts than true ones.
Say you’re looking to identify the players at the FA Cup Final, based solely on these facial recognition cameras, and you scan the entire stadium for them.
A test that correctly identifies 70% of targets should flag about 15 of the 22 players.
But since it’s also being run on a 90,000-capacity crowd, if it generates a false alert for one in every 1,000 people scanned, you’ll get 90 false matches.
So only 15 of the 105 matches – roughly one in seven – would actually be correct.
False alarms don’t mean “bin the test”; they mean “don’t treat every ‘match’ as a criminal”.
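The arithmetic above can be sketched in a few lines of Python. The 70% hit rate, the 1-in-1,000 false-alert rate, the 22 players and the 90,000 crowd are the figures from the example; the rest is illustrative:

```python
# Base-rate sketch of the FA Cup example: a camera that spots 70% of
# the 22 players on its watchlist, while wrongly flagging 1 in every
# 1,000 of a 90,000-strong crowd.
players = 22                  # people actually on the watchlist
crowd = 90_000                # everyone scanned
hit_rate = 0.70               # true-positive rate claimed by the Met
false_alert_rate = 1 / 1_000  # assumed false-positive rate

true_matches = round(players * hit_rate)        # about 15
false_matches = round(crowd * false_alert_rate) # 90
total_matches = true_matches + false_matches    # 105

precision = true_matches / total_matches
print(f"{true_matches} of {total_matches} matches correct "
      f"({precision:.0%})")
```

Running it reproduces the figures in the example – 15 correct matches out of 105, a precision of about 14% – which is why a test can “work” and still be wrong most of the time it raises an alert.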
“It is purely magical thinking to suggest facial recognition can solve London’s problem with knife crime,” said Silkie Carlo, director of Big Brother Watch.
“The Commissioner is right that the loudest voices in this debate are the critics, it’s just that she’s not willing to listen to them,” she said.
But Dame Cressida argued privacy concerns were overblown.
“In an age of Twitter and Instagram and Facebook, concern about my image and that of my fellow law-abiding citizens passing through [facial recognition] and not being stored, feels much, much smaller than my and the public’s vital expectation to be kept safe from a knife through the chest,” she said.