Privacy and the police: British activist to appeal ‘sinister’ facial recognition

Facial recognition technology in use in public

Privacy campaigners in Britain said on Wednesday they would appeal a landmark ruling that backed the "sinister" police use of facial recognition technology to hunt for suspects.

The High Court in Wales dismissed a legal challenge by Ed Bridges, a resident of Cardiff, who had argued that local police breached his privacy rights by scanning his face without consent. Judges said the case was the first of its kind worldwide.

Civil rights group Liberty, which represented Bridges, said it would appeal the "disappointing" decision.

Bridges said in a statement: "This sinister technology undermines our privacy. I will continue to fight against its unlawful use to ensure our rights are protected and we are free from disproportionate government surveillance".

From malls to airports, facial recognition is increasingly present worldwide, raising fears over privacy. In August, Britain's data protection watchdog said it was "deeply concerned" about the technology after it emerged that museums, shopping centres and other properties were using surveillance cameras.

In 2017, South Wales Police became the first British force to adopt the technology. It has deployed cameras at dozens of locations, including football matches and rock concerts, to check passersby against a database of offenders, according to its website. An identified suspect can be stopped on the spot, while the data of those who are not identified is discarded.
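The check-and-discard workflow described above can be sketched in a few lines: a face is reduced to a numerical "embedding", compared against a watchlist, and either flagged if it exceeds a similarity threshold or dropped. This is a minimal illustration only; the watchlist entries, the three-dimensional vectors and the threshold value are all hypothetical stand-ins (real systems use neural-network embeddings with hundreds of dimensions and operationally tuned thresholds), and none of this reflects South Wales Police's actual implementation.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two embeddings, ranging from -1 to 1."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical watchlist: name -> face embedding.
WATCHLIST = {
    "offender_A": [0.9, 0.1, 0.3],
    "offender_B": [0.2, 0.8, 0.5],
}

MATCH_THRESHOLD = 0.95  # illustrative; governs the false-match/miss trade-off

def check_passerby(embedding):
    """Return the best watchlist match above the threshold, else None.

    When None is returned the embedding is simply not retained,
    mirroring the discard-on-no-match policy described above.
    """
    best_name, best_score = None, 0.0
    for name, reference in WATCHLIST.items():
        score = cosine_similarity(embedding, reference)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= MATCH_THRESHOLD else None
```

The threshold is the key operational knob: set it too low and innocent passersby are wrongly flagged; set it too high and genuine suspects slip through, which is the error-proneness critics point to.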

Bridges, a 36-year-old civil rights campaigner, said his face was scanned at an anti-arms protest and, on a second occasion, while he was Christmas shopping. At a hearing in May, his legal team argued that recording people's faces without consent or grounds for suspicion violated rights to privacy, as well as equality and data protection laws.

However, the judges said the force's use of the technology was lawful and legally justified, noting that deployments were limited in scope and time.

Chief constable Matt Jukes said in a statement: "I recognise that the use of artificial intelligence and face-matching technologies around the world is of great interest and at times, concern". He added: "With the benefit of this judgment, we will continue to explore how to ensure the ongoing fairness and transparency of our approach."

Computers have become skilled at identifying people by matching a scan of their facial features against a photograph. Despite this, critics say the technology is still prone to errors and could lead to excessive surveillance.

Reporting: Umberto Bacchi

Source: Reuters (Thomson Reuters Foundation) London, 04 September 2019 
