Privacy and the police: British activist to appeal ‘sinister’ facial recognition

Facial recognition technology in use in public

Privacy campaigners in Britain said on Wednesday they would appeal a landmark ruling that supported police use of “sinister” facial recognition technology to hunt for suspects.

The High Court in Wales dismissed a legal challenge by Ed Bridges, a resident of Cardiff, who had argued that local police breached his privacy rights by scanning his face without consent. Judges said the case was the first of its kind worldwide.

Civil rights group Liberty, which represented Bridges, said it would appeal the “disappointing” decision.

Bridges said in a statement: “This sinister technology undermines our privacy. I will continue to fight against its unlawful use to ensure our rights are protected and we are free from disproportionate government surveillance”.

From malls to airports, facial recognition is increasingly present worldwide, raising fears over privacy. In August, Britain’s data protection watchdog said it was “deeply concerned” about the technology, after it emerged that museums, shopping centres and other properties were using surveillance cameras equipped with it.

In 2017, South Wales Police became the first British force to adopt the technology. It has deployed cameras at dozens of locations, including football matches and rock concerts, to check passersby against a database of offenders, according to its website. An identified suspect can be stopped on the spot, while the data of those not identified is discarded.

Bridges, a 36-year-old civil rights campaigner, said his face was scanned at an anti-arms protest and, on a second occasion, while he was Christmas shopping. At a hearing in May, his legal team argued that recording people’s faces without consent or grounds for suspicion violated rights to privacy, as well as equality and data protection laws.

However, the judges said the force’s use of the technology was lawful and legally justified, noting that deployments were limited in scope and time.

Chief constable Matt Jukes said in a statement: “I recognise that the use of artificial intelligence and face-matching technologies around the world is of great interest and at times, concern”. He added: “With the benefit of this judgment, we will continue to explore how to ensure the ongoing fairness and transparency of our approach.”

Computers have become skilled at identifying people by matching a scan of their facial features against a photograph. Critics say, however, that the technology remains prone to errors and could lead to excessive surveillance.

Reporting: Umberto Bacchi

Source: Reuters (Thomson Reuters Foundation) London, 04 September 2019 

Copyright Lyonsdown Limited 2021
