Controversial surveillance software designed to help spot suspects in public spaces has received support from the UK home secretary, Sajid Javid.
While civil liberties campaigners have criticised the technology and highlighted its inaccuracies, Javid said it was important that the police made use of the latest tools to help them solve crimes.
It’s a debate also taking place in the US, where Amazon has come under fire from campaigners for selling its Rekognition system to government agencies.
Rekognition is an online tool that works with both video and still images and allows users to match faces to pre-scanned subjects in a database containing up to 20 million people.
Amazon recommends law enforcement agents should only use the facility if there is a 99 per cent or higher confidence rating of a match and says they should be transparent about its usage.
The home secretary’s support could signal encouragement for tech and retail companies developing their own facial recognition technology, ostensibly for consumer purposes, that could also be used by government agencies.
Several forces, including the Met, have been trialling the software at large-scale events such as football matches, festivals and parades, where high-definition cameras are set up to detect faces and compare them with existing police photographs, such as mugshots from previous arrests.
Speaking at the launch of new computer technology to help police fight online child abuse, Javid said forces were right to “be on top of the latest technology”.
“I back the police in looking at technology and trialling it and… different types of facial recognition technology is being trialled especially by the Met at the moment and I think it’s right they look at that,” said Javid, as reported by the BBC.
Javid said longer term use of the cameras would require legislation.
“If they want to take it further it’s also right that they come to government, we look at it carefully and we set out through Parliament how that can work,” he added.
Civil rights campaigners in the UK have criticised the fact there is no regulation governing how police use the software or manage the data they gather.
Privacy rights group Big Brother Watch told the BBC in May that use of facial recognition tech “must be dropped immediately” due to the inaccuracies found with the technology.
Documents from the police, Home Office and university researchers reported by the BBC show that police are aware that ethnicity can have an impact on such systems, but have on several occasions failed to test for this.
“The police’s failure to do basic accuracy testing for race speaks volumes,” said Big Brother Watch director Silkie Carlo, speaking to the BBC.
“Their wilful blindness to the risk of racism, and the risk to Brits’ rights as a whole, reflects the dangerously irresponsible way in which facial recognition has crept on to our streets. The technology has too many problems to justify its use. It must be dropped immediately,” added Carlo.