Met Police to deploy Live Facial Recognition tech across London

The Metropolitan Police has announced that it will soon deploy Live Facial Recognition technology in specific locations across London to tackle serious crimes such as gun and knife crime, child sexual exploitation, and violence targeting the vulnerable.

The force said that Live Facial Recognition cameras will be set up at specific locations across London where it believes there is a high likelihood of locating serious offenders. Each deployment will have a “watch list” made up of images of individuals wanted for serious and violent offences.

The Live Facial Recognition cameras will be clearly signposted and will not be linked to any other imaging system such as CCTV or body-worn video. They will only be used to alert police officers to the presence of specific individuals at the locations where they are deployed.

“This is an important development for the Met and one which is vital in assisting us in bearing down on violence. As a modern police force, I believe that we have a duty to use new technologies to keep people safe in London. Independent research has shown that the public support us in this regard. Prior to deployment, we will be engaging with our partners and communities at a local level,” said Assistant Commissioner Nick Ephgrave.

“We are using a tried-and-tested technology, and have taken a considered and transparent approach in order to arrive at this point. Similar technology is already widely used across the UK, in the private sector. Ours has been trialled by our technology teams for use in an operational policing environment.

“Every day, our police officers are briefed about suspects they should look out for; LFR improves the effectiveness of this tactic. Similarly, if it can help locate missing children or vulnerable adults swiftly, and keep them from harm and exploitation, then we have a duty to deploy the technology to do this.

“Equally I have to be sure that we have the right safeguards and transparency in place to ensure that we protect people’s privacy and human rights. I believe our careful and considered deployment of live facial recognition strikes that balance,” he added.

Met Police’s Live Facial Recognition tech was found to be 81% inaccurate by researchers

The announcement from the Metropolitan Police comes not long after the High Court in Wales ruled in favour of the use of facial recognition technology by police forces. The lawsuit had been filed by a Cardiff resident who claimed that local police violated his privacy rights by scanning his face without consent.

The judges ruled that the deployment of facial recognition technology by police forces was lawful and justified, and that the deployments were limited in scope and time.

In July last year, an independent report accessed by Sky News and The Guardian revealed that facial recognition technology used by the Met Police was 81% inaccurate, meaning that the vast majority of people flagged by the technology turned out to be innocent.

The study was commissioned by Scotland Yard and carried out by academics from the University of Essex after Met Police conducted trials of its Live Facial Recognition tech at various locations such as Whitehall, Leicester Square and Westfield Stratford.

Professor Pete Fussey and Dr Daragh Murray, who carried out the study, noted that not only did the Met Police lack “an explicit legal basis” to carry out LFR trials, but its watch lists were also not updated to explain why certain people were included in them.

“Our report conducted a detailed, academic, legal analysis of the documentation the Met Police used as a basis for the face recognition trials. There are some shortcomings and if [the Met] was taken to court there is a good chance that would be successfully challenged,” Professor Fussey told Sky News.

However, the Met Police defended the trials, stating that there was a legal basis for the use of Live Facial Recognition tech.

“We are extremely disappointed with the negative and unbalanced tone of this report… We have a legal basis for this pilot period and have taken legal advice throughout. We believe the public would absolutely expect us to try innovative methods of crime-fighting in order to make London safer,” said Duncan Ball, the Met’s deputy assistant commissioner.

Copyright Lyonsdown Limited 2021
