
U.K. Police Need to Put the Brakes on Facial Recognition

Police in the U.K., backed by the government, are testing a facial-recognition system that is 20 percent accurate and treating those who avoid its gaze as potential suspects.

By Adam Smith

This story originally appeared on PCMag



Britain has a close relationship with security cameras. London alone has one of the highest ratios of surveillance cameras per citizen in the developed world. Estimates from 2002 put the number of surveillance cameras in Greater London at more than 500,000; around 110 are used by the City of London Police, according to data obtained through a 2018 Freedom of Information request.

Being recorded apparently is not enough; London's Metropolitan Police Service has been testing the use of facial-recognition cameras, and the effort has the support of Home Secretary Sajid Javid -- who oversees immigration, citizenship, the police force and the security service. "I think it's right they look at that," he said, according to the BBC.

Although the upcoming election will decide the new leader of the Conservative Party, who will also become Prime Minister, it is unlikely that government attitudes toward facial recognition will change. Javid might move to another part of the government, but the civil libertarian side of the Conservative Party has been relatively quiet of late.

The problem is that facial recognition, as it currently stands, is often inaccurate. London police have been using the technology since 2016, but an independent report released last week showed that four out of five people it flagged as possible suspects were actually innocent, a distinct failing of the machine learning behind the system.

Professor Pete Fussey and Dr. Daragh Murray, of the University of Essex, analyzed the accuracy of six of the 10 police trials. Of the 42 matches the system generated, only eight were verified as correct, and in four of the 42 cases the flagged person could not be located in the crowd to confirm the match either way.

Nevertheless, the Metropolitan Police considers the trials a success and was "disappointed with the negative and unbalanced tone of this report," a deputy assistant commissioner told Sky News. The Met measures accuracy by comparing incorrect matches against the total number of faces processed, not just the faces flagged; by that rubric, the error rate was only 0.1 percent.
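The two headline figures describe the same trial; they differ only in the denominator. The researchers divide false alerts by the number of alerts, while the Met divides them by every face the cameras scanned. A minimal sketch of the arithmetic follows, using the report's 42 alerts and eight correct matches; the total of 34,000 faces scanned is a hypothetical figure, chosen only so the Met-style rate comes out near the quoted 0.1 percent, since the article does not report the real total.

```python
# Two ways to report the same trial results (figures from the Essex report,
# except total_faces_scanned, which is a hypothetical illustration).
total_alerts = 42             # faces the system flagged as watchlist matches
correct_alerts = 8            # flagged people who really were on the watchlist
false_alerts = total_alerts - correct_alerts
total_faces_scanned = 34_000  # assumed; not reported in the article

# The researchers' measure: of the people flagged, how many were wrong?
report_error_rate = false_alerts / total_alerts           # ~81%

# The Met's measure: of every face processed, how many drew a bad alert?
met_error_rate = false_alerts / total_faces_scanned       # ~0.1%

print(f"Report-style error rate: {report_error_rate:.0%}")
print(f"Met-style error rate:    {met_error_rate:.2%}")
```

Both rates are faithful to the data; the Met's denominator simply swamps the 34 false alerts with the thousands of passers-by who were never flagged at all.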

That was not the only error, however. The database the police matched against was out of date, so the system flagged people whose cases had already been closed. There is also "significant ambiguity" over the criteria for placing a person on the watchlist, the report states.

The Metropolitan Police informed citizens about the trials by handing out leaflets and tweeting, but the report deems this insufficient. "The information provided transparency regarding the time and location of the [live facial recognition] test deployments yet there was less clarity over the purpose of the deployment, who was likely to be the subject of surveillance, and how additional information could be ascertained," the report says. Moreover, treating those who tried to avoid the cameras "as suspicious ... undermines the premise of informed consent."

The report concludes that it's "highly possible [the trial] would be held unlawful if challenged before the courts." The implicit legal authority "coupled with the absence of publicly available, clear, online guidance is likely inadequate" when compared to human rights law, which requires that interference with individuals' human rights be "in accordance with the law, pursue a legitimate aim, and be 'necessary in a democratic society.'"

Controversy across the pond

The United Kingdom isn't the only country struggling with this problem. In the United States, facial-recognition systems used by the FBI came under fire after the Government Accountability Office found they were inaccurate 14 percent of the time. Moreover, that number does not take into account the accompanying false-positive rate, and so "presents an incomplete view of the system's accuracy"; false positives can adversely affect minorities because of systemic bias.

Microsoft also rejected a request by California law enforcement to use its facial-recognition system in police cars and body cameras, because of concerns that its algorithm was not sophisticated enough. "Anytime they pulled anyone over, they wanted to run a face scan," Microsoft President Brad Smith said. "We said this technology is not your answer."

A number of U.S. cities have rejected the technology outright: San Francisco outlawed facial recognition for government use, and Somerville, Mass., voted unanimously to pass anti-facial-recognition legislation because of its potential to "chill" protected free speech.

And while the United Kingdom's government has proven far more willing to embrace potentially oppressive technology policy, from internet regulation meant to stop malicious content on social media to a near-ban on pornography, its citizens should be wary of letting facial recognition propagate.

Even if its accuracy improves, citizens rarely know when, where, and how facial-recognition software is being used, which makes adequate consent to being constantly recorded and identified all but impossible. There are too many warranted concerns about the data the algorithms are trained on, the spaces being surveilled, and the effect on our civil liberties for people to let facial recognition interfere with their right to a private life.

Adam Smith

Contributing Editor, PCMag UK

Adam Smith is the Contributing Editor for PCMag UK, and has written about technology for a number of publications including What Hi-Fi?, Stuff, WhatCulture, and MacFormat, reviewing smartphones, speakers, projectors, and all manner of weird tech. Always online, occasionally cromulent, you can follow him on Twitter @adamndsmith.
