Police body-camera maker Axon Enterprise announced Thursday it will not incorporate facial-recognition technology in its law-enforcement devices, a move that comes as California’s Legislature considers a statewide ban on such use.

In a blog post by CEO Rick Smith announcing its decision, the Scottsdale, Arizona, company, whose body-camera unit is based in Seattle, cited “serious ethical concerns” about the current capabilities of facial-matching software.

Thursday’s announcement by Axon, formerly called Taser International, followed a recommendation by an ethics board on artificial intelligence (AI) and policing technology that the company launched last spring.

In a 42-page report published Thursday, the board concluded that facial-recognition technology is not reliable enough to be used on body-worn cameras, given that it does not identify people equally well across genders, ethnicities and races. Research has shown that facial-recognition technology, a form of artificial intelligence that identifies people’s faces in photos and videos, is less accurate at identifying darker-skinned people than lighter-skinned people.

“We just know that we had an ethical responsibility to do the right thing,” said Mike Wagers, the company’s vice president.


Axon’s announcement comes amid growing scrutiny of facial-recognition technology. In May, San Francisco became the first city to ban the use of facial-recognition technology by police and other municipal agencies, followed by Somerville, Massachusetts, on Thursday evening. Berkeley and Oakland, in California, are considering similar moratoriums. The California Legislature is currently considering a bill that would ban the pairing of facial and other biometric surveillance with body-worn cameras.


“Policing works better when informed by thoughtful research,” Jim Bueermann, an Axon ethics board member and former police chief, said in a statement. “The research is telling us that face recognition isn’t in a stage of readiness to be used effectively on body cameras. Until face recognition can accurately help law enforcement officers identify individuals, the board agrees that it should be kept off body cameras.”

The ACLU considers the announcement by Axon, the biggest U.S. supplier of body cameras to law enforcement, a step in the right direction. It called on other tech companies such as Microsoft and Amazon to follow suit.

“Body cameras should be for police accountability, not surveillance of communities,” Matt Cagle, technology and civil-liberties attorney at the ACLU of Northern California, said in a press release. “Face surveillance technology is ripe for discrimination and abuse, and fundamentally incompatible with body cameras — regardless of its accuracy.”

Over the past year, major tech companies such as Microsoft, Amazon and Google have publicly acknowledged the concerns about facial recognition and urged the federal government to impose regulations on its use.


Amazon has said it plans to continue selling its facial-recognition software Rekognition to governments. “If our government doesn’t have access to all of the most modern, sophisticated technology that the private sector has, we’re in trouble,” Amazon Web Services CEO Andy Jassy told an interviewer at Recode’s Code 2019 conference this month.

In a recent ACLU report about video surveillance, Jay Stanley, senior policy analyst with the ACLU Speech, Privacy, and Technology Project, predicted privacy concerns around police body cameras will be exacerbated if AI could allow the machines to flag suspicious behavior or assess the danger of a person. These capabilities “will represent a significant shift in the technology from a police accountability to a community surveillance tool,” wrote Stanley.

Axon didn’t dismiss the possibility of using facial recognition in the future. Wagers said the company will revisit the issue if the technology improves.

This story has been updated to reflect news about Somerville’s ban.