Amazon recently announced a one-year pause in police use of its artificial-intelligence software for recognizing faces. The next day, Microsoft said it doesn’t currently sell its similar product to U.S. police departments and won’t do so until the federal government passes a law regulating its use. Both steps came amid widespread protests against police brutality and misconduct targeting Black people — a population on whom facial-recognition software performs poorly, raising concerns that the technology is another vector for discrimination against people of color by law enforcement.

Other companies have gone even further. Earlier this month, International Business Machines said it would no longer sell general-purpose facial-recognition and analysis software. Google Cloud Platform stopped selling facial recognition as an off-the-shelf service, saying in late 2018 that the company wanted more time to work through “important technology and policy questions.”

While Microsoft and Amazon received praise for their steps last week, the complicated nature of technology contracts, and the many different levels and types of law enforcement agencies — foreign and domestic, at the local, state and national levels — mean many questions remain unanswered about what the temporary bans cover and how they will be implemented. Both companies have declined to answer these questions.

How are the companies defining ‘police’?

It’s not clear whether the bans cover only local police departments or also extend to federal law enforcement agencies. On June 17, the American Civil Liberties Union released emails showing that Microsoft tried to sell facial-recognition software to the federal Drug Enforcement Administration in 2017 and 2018. Amazon sought to sell its tools to U.S. Immigration and Customs Enforcement in 2018. Would the companies still attempt such sales in 2020? On June 18, at an event hosted by Politico, Microsoft President Brad Smith said the company doesn’t currently have any federal law enforcement customers. He said Microsoft wouldn’t sell to federal law enforcement in any “scenario that either leads to bias against people of color or women” and “won’t allow our technology to be used in any manner that puts people’s fundamental rights at risk.” But he stopped short of ruling out sales of the technology to federal agencies that take part in many different kinds of policing, such as Immigration and Customs Enforcement, Customs and Border Protection and others.

What about overseas police departments and international police or investigative organizations?

In announcing Microsoft’s move, Smith specified that it applies to U.S. police. Amazon’s two-paragraph blog post announcing its moratorium, meanwhile, mentioned only the U.S. Congress. The companies have declined to say whether they currently sell, or will sell, to organizations outside the U.S. That means they may still be targeting a large group of international customers with software they have found too problematic to sell domestically.

What’s the extent of the companies’ previous sales of the technology to police departments? 

Smith said Microsoft doesn’t currently have U.S. police customers, but the company hasn’t released any information about past buyers. In its customer references, Amazon has touted some of its police clients, including the Washington County Sheriff’s Office in Oregon. It won’t say what happens to those customers during the one-year pause or whether it will automatically cut off access to its face-scanning product, called Rekognition. A Washington County spokesman told The Seattle Times that the office will stop using Rekognition during the moratorium, but it’s not clear what happens to other customers, or how many there are — Amazon Web Services CEO Andy Jassy has said he doesn’t know how many police departments use Rekognition. Without a thorough shut-off of existing customers, use of the software may persist.

If the companies do business with municipal governments, can they truly know if a local police department has access to the software?

In some cities, contracting for police technology is handled by city offices or municipal chief information officers. In those cases, how would the companies know the software is intended for police use rather than some other local purpose they permit, such as mapping and scanning local streets or city-owned parking lots? A city could license the product from Microsoft or Amazon and then let it be used by police or for law enforcement tasks, where racial disparities in the software could be particularly harmful.

What about technology products other than facial-recognition software that can enable police surveillance?

Ring, the Amazon-owned doorbell-camera maker, runs a program that lets police departments and other law enforcement agencies — some 1,300 and counting — request footage from users. Ring didn’t comment on whether the moratorium on police use of Rekognition would affect those partnerships. Meanwhile, some privacy advocates say the pledges from Google and IBM not to sell general-purpose facial-recognition software leave the door open for tech-savvy users to assemble such a system from other products those companies still offer. Google and IBM didn’t immediately respond to requests for comment.