When the Washington County Sheriff’s Office in Oregon started using Amazon’s facial-recognition software in 2016, deputies welcomed a new tool to quickly identify suspects and solve cases.
The American Civil Liberties Union (ACLU) saw something different: a troubling extension of the government’s ability to keep an eye on its citizens.
The watchdog group on Tuesday sent Amazon Chief Executive Jeff Bezos a letter calling on his company to stop selling facial-recognition software to government entities like the Portland-area county, which has used the cloud-computing service to compare public images of people suspected of crimes to its database of criminal suspects.
“People should be free to walk down the street without being watched by the government,” the letter says. “Facial recognition in American communities threatens this freedom.”
Amazon says its terms of service require customers to comply with the law.
The letter was signed by representatives of several regional chapters of the ACLU as well as civil-liberties groups, including the internet-freedom watchdog Electronic Frontier Foundation and Human Rights Watch. Several signers, including Seattle-based OneAmerica and the Washington chapter of the Council on American-Islamic Relations, advocate for immigrants, refugees or racial groups that have historically been the victims of biased policing.
The letter is the latest flashpoint in an ongoing debate about the risk that increasingly powerful technologies making their way into law enforcement could trample civil liberties or otherwise invite abuse. Cities and civil liberties groups have clashed over surveillance tools like automatic vehicle license plate readers and software that turns cellular tower pings into location trackers, a debate that, in Seattle, led the city to scrap plans for surveillance drones and to dismantle a wireless mesh network of cameras and signal trackers.
Meanwhile, advances in artificial intelligence and cloud computing have yielded commercially available algorithms capable of quickly combing public and private databases to build a profile of people or, in the case of Amazon’s tool, scanning photos or video to find similarities between faces.
Amazon introduced its facial-recognition software, called Rekognition, in 2016. It is one of dozens of tools built by the Amazon Web Services cloud-computing unit and designed to let customers harness the power of complex algorithms they don’t have the wherewithal to build themselves.
Chris Adzima, a technologist with the Washington County Sheriff’s Office, said in a presentation at Amazon’s re:Invent trade show last year that it took him about a month to set up a program that used Rekognition to compare new images against the roughly 300,000 mugshots taken by the office since 2001.
In one case, Adzima used the system to compare surveillance footage of a shoplifter to the database, identifying the suspect by name and, after confirming the match against his public Facebook profile, arresting him.
In another instance, a citizen approached an officer with a tip about a man who had a warrant out for his arrest, but the only identifying information the tipster had was a hastily captured smartphone image of the suspect. Rekognition was able to cut through the clutter of the imperfect, glare-filled photo and identify the person.
“We don’t have a lot of resources, so we wouldn’t be able to [hire] a data scientist, but opening this stuff up to us allows us to build these things … based on machine learning and ultimately keep our citizens safe,” Adzima said. He added that he hoped the county would be able to integrate its system with neighboring jurisdictions.
Deputy Jeff Talbot, a Washington County Sheriff’s Office spokesman, said the technology is being used only for images acquired during criminal investigations, and that it isn’t being deployed in combination with surveillance cameras or real-time video.
He said the office’s policy on use of Rekognition software sought to balance civil liberties with the mandate to investigate crimes. “Concerns and questions about the use of such technology is completely valid, we’re sensitive to that,” he added.
Shankar Narayan, director of the technology and liberty project at the ACLU’s Washington chapter, says the technology invites abuse. “There is a long history and track record suggesting that, if you build a system and an infrastructure like this, it is prone to abuse and also prone to mission creep,” he said.
The city of Orlando, Florida, he said, was using Rekognition to compare footage from its network of cameras on city streets to faces that police officers were tracking. Given the computing power at cities’ disposal, Narayan said it wasn’t far-fetched to envision a citywide surveillance network, or near-real-time identification of, for example, protesters from footage captured by police body cameras.
“This is not a neutral technology,” Narayan said. “There is really no place for a tool of this kind because of its invasive nature.”
Amazon isn’t the only company to market facial-recognition software. A comparable tool built by Microsoft was released about six months after Amazon unveiled Rekognition.
Google sells a similar product, though its marketing materials indicate its functionality stops short of identifying individual faces. Last week, Gizmodo reported that about a dozen Google employees had resigned in protest over the search giant’s decision to sell artificial intelligence capabilities to the Pentagon for a program that aimed to speed analysis of footage shot by drones with automatic identification of people and objects.
Video: A look at how Amazon Rekognition works and how it could be used. (Amazon Web Services)
Narayan said the ACLU has concerns about other companies’ facial-recognition offerings but made the decision to target Amazon in its appeal because of the company’s vast lead in the market for cloud-computing services, as well as its publicly identified government customers.
He said the ACLU had previously shared its concerns with the company.
After that outreach, he said, Amazon removed a recommendation from its website that law enforcement agencies use Rekognition on body camera footage, but “we have not really gotten a substantive response” to broader concerns, Narayan said. “We have gotten to the point where we believe that more attention needs to be paid in the public sphere.”