A civil liberties group has launched a legal challenge against the Metropolitan Police over its use of automatic facial recognition (AFR) technology. The software is being deployed as a powerful police tool that tracks and analyses facial characteristics, matching them against a huge database of people.
Tech giants Apple, Google, Amazon and Microsoft are among the companies currently developing such software. Despite its potential, the technology could be exploited by governments seeking to introduce mass surveillance.
The system is currently being piloted in London, Humberside, South Wales and Leicestershire in an effort to keep people safe. However, lawyers for Big Brother Watch insist that the intrusive nature of AFR breaches individuals’ rights under the Human Rights Act.
Its use in public places has been described as “very intrusive” by one watchdog. The BBC also reports that court documents claim the Home Office has failed in its duty to regulate the technology in use.
Objectors have highlighted that the technology is flawed, and its accuracy varies dramatically. Big Brother Watch pointed out that during the Met’s own trials, only two genuine matches were generated out of 104 system alerts, meaning roughly 98% of alerts were false positives.
The police have described the technology as “an extremely valuable tool”. Silkie Carlo, director of the civil liberties group, argued that “when the police use facial recognition surveillance they subject thousands of people in the area to highly sensitive identity checks without consent.”
Last month, Orlando police department dropped its pilot of Amazon’s facial recognition program, Rekognition, amid public outcry. The software has been widely criticised since its launch two years ago, but it could be reinstated in the future.
Although Microsoft has helped advance and innovate facial recognition tools, it is now recommending that governments introduce official regulation. The company’s president, Brad Smith, said that new legislation is imperative considering the software’s “broad societal ramifications and potential for abuse.”
Smith insists that the software “raises issues that go to the heart of fundamental human rights protections like privacy and freedom of expression.” Despite the controversy, the legal battle is likely to help the Met and other forces consolidate a legal framework for AFR.
Dr Suzanne Shale, who chairs the London Policing Ethics Panel, believes that the technology requires regulation. “We have made a series of key recommendations, which we think should be addressed before any further trials are carried out,” she said, adding: “We believe it is important facial recognition technology remains the subject of ethical scrutiny.”