LONDON — London’s police department said Friday that it would begin using facial recognition to spot criminal suspects with video cameras as they walk the streets, adopting a level of surveillance that is rare outside China.
The decision is a major development in the use of a technology that has set off a worldwide debate about the balance between security and privacy. Police departments contend that the software gives them a way to catch criminals who may otherwise avoid detection. Critics say the technology is an invasion of privacy, has spotty accuracy and is being introduced without adequate public discussion.
Britain has been at the forefront of the debate. In a country with a history of terrorist attacks, police surveillance has traditionally been more accepted than in other Western countries. Closed-circuit television cameras line the streets.
The technology London will deploy goes beyond many of the facial recognition systems used elsewhere, which match a photo against a database to identify a person. The new tools use software that can immediately identify people on a police watch list as soon as they are filmed on a video camera.
The Metropolitan Police said in a statement that the technology would help quickly identify and apprehend suspects and help “tackle serious crime, including serious violence, gun and knife crime, child sexual exploitation and help protect the vulnerable.”
London has faced several terror attacks and seen an increase in crime in recent years. In November, police shot and killed a man wearing a fake bomb on London Bridge after two people were fatally stabbed. The police called the attack a terror incident. In 2017, another stabbing attack left eight dead and dozens wounded. Knife crime in England and Wales rose to a record high in the first nine months of last year, according to government statistics.
“Every day, our police officers are briefed about suspects they should look out for,” Nick Ephgrave, assistant commissioner of the police department, said in the statement. Live facial recognition, he said, “improves the effectiveness of this tactic.”
“As a modern police force, I believe that we have a duty to use new technologies to keep people safe in London,” he added.
Facial recognition, already widespread in China, is gaining traction in Western countries. Many cities and police departments, like the New York City Police Department, use technology comparing photos and other static images against a database of mug shots. An investigation by The New York Times this month found that more than 600 law enforcement agencies are using a facial recognition system by the company Clearview AI.
Use of real-time facial recognition is less common. NEC, a Japanese company that makes biometric and facial recognition services, sold London the technology now being adopted. Other buyers of its real-time facial recognition technology include Surat, a city of about 5 million people in India, and the country of Georgia, according to the company’s website.
The technology is also used every few weeks in Cardiff, the capital of Wales, often at big events like rugby matches or, this past week, a concert by the heavy metal band Slipknot. As of September, police in Wales say, the technology had helped in the arrests of 58 wanted people.
Representatives at NEC did not respond to a request for comment.
According to researchers at Georgetown University, several U.S. cities have piloted live facial recognition systems, often with mixed results. In Orlando, Florida, a pilot program that ended last year tried to match faces passing several cameras against individuals on a watch list. In Detroit, police purchased a face-identification system as part of a crime-prevention program. Last year, The Wall Street Journal reported on the failures of a New York pilot program to spot people as they drove past bridges and tunnels.
Use of facial recognition technology in the United States has generated a backlash. San Francisco, Oakland and Berkeley in California, along with Somerville and Brookline in Massachusetts, have banned its use.
Privacy groups criticized London’s decision and vowed to take legal action to try to stop the deployment of the software.
“This decision represents an enormous expansion of the surveillance state and a serious threat to civil liberties in the U.K.,” said Silkie Carlo, director of Big Brother Watch, a London-based group that has been fighting the use of facial recognition. “This is a breathtaking assault on our rights, and we will challenge it.”
Last year, a British judge ruled that police departments could use the technology without violating privacy or human rights; that decision is under appeal. The government's top privacy regulator has raised concerns about the use of the technology, as has an independent report on a trial by the Metropolitan Police.
Researchers have found problems with many facial recognition systems, including trouble accurately identifying people who are not white men. Civil liberties groups warn that as the technology improves, it will lead to constant surveillance, including an ability to track people as they move and watch whom they are speaking with.
“We can look to how London is using this technology as a sign of things to come,” said Clare Garvie, a researcher at Georgetown’s Center on Privacy and Technology who studies government use of facial recognition.
Britain has tested real-time facial recognition for a few years. In the trials, officers were often stationed in a control center monitoring a real-time feed from nearby cameras. The system sent an alert when it identified a person who matched someone on the watch list. If officers agreed it was a match, they would radio other officers positioned on the street to pick up that person.
Last year, an independent review of a police trial found many problems, including with its accuracy: of 42 identifications made by the system, only eight were correct.
“It was incredibly inaccurate,” said Daragh Murray, a senior lecturer at the University of Essex who conducted the report. “Most times they didn’t actually find the people they were looking for. From just a technological perspective, you have to question the utility.”
Without clear laws about how the technology is used, police departments everywhere have wide latitude to put the camera systems in place, Murray said. Particularly concerning, he said, is the lack of transparency about how police decide when somebody is placed on a watch list.
“Too much leeway is given to the police,” he said. “What is needed is proper safeguards around its use.”
Tony Porter, Britain’s surveillance camera commissioner, has called for a moratorium on the use of live facial recognition systems until a fuller review can be conducted.
The Metropolitan Police said it would be transparent about deploying the technology. Officers will post signs and hand out leaflets when the cameras are in use.
Britain’s Information Commissioner’s Office, the country’s top privacy regulator, said it would monitor how the system is deployed in London. It said the police gave assurances that the department would take steps to reduce privacy and data-protection risks.
“This is an important new technology with potentially significant privacy implications for U.K. citizens,” the privacy regulator said in a statement.