Amazon has joined the ranks of other technology companies, including Microsoft and Google, in acknowledging the risks of facial-recognition software and calling on the federal government to impose national regulations on the technology.
Amazon Web Services CEO Andy Jassy said Monday at Recode’s Code 2019 conference in Arizona that he would welcome federal legislation limiting the misuse of cloud-based facial-recognition software such as Amazon Rekognition.
“Whether it’s private-sector companies or our police forces, you have to be accountable for your actions and you have to be held responsible if you misuse it,” Jassy said.
“I think the issue around facial-recognition technology is a real one,” he said.
Amazon Rekognition uses artificial intelligence to identify people’s faces in photos and videos. A coalition of 85 civil-liberty groups has criticized the company for selling the facial-surveillance technology to governments as a “new power to target and single out immigrants, religious minorities, and people of color in our communities,” as members wrote in a January letter to Amazon. Studies have also shown that Rekognition is more likely to misidentify images of darker-skinned women than lighter-skinned men.
Jassy’s announcement Monday reflected a February blog post by Amazon Web Services that acknowledged the potential for misuse of the technology.
He reiterated the company’s stance that law-enforcement customers should use the results of facial-recognition searches in their decision-making only when the software reports a confidence score of at least 99%. Amazon offers free training to law-enforcement customers, Jassy said Monday, adding that customers who misuse the technology will be barred from the platform.
In the more than two years the cloud service has been available, there have been no reports of misuse by law enforcement, Jassy said. “I strongly believe that just because technology could be misused, doesn’t mean that we should ban it and condemn it.”
Google has taken a tougher stance on safeguarding the public from potential misuse of its facial-recognition technology. In a December blog post, the search giant vowed not to sell its facial-recognition products until it develops policies to avoid harmful outcomes.
Jassy said Amazon plans to continue selling its technology to governments. “If our government doesn’t have access to all of the most modern, sophisticated technology that the private sector has, we’re in trouble,” Jassy said. During Amazon’s annual shareholder meeting last month, only 1.7% of shareholders supported a proposal urging the company to stop selling its facial-recognition technology to governments.
The Seattle Police Department (SPD) stopped using its facial-recognition software about a year ago, said SPD spokesman Sergeant Sean Whitcomb. SPD had received national praise for facial-recognition practices that followed a strict policy created in 2014 with input from the American Civil Liberties Union of Washington. The policy stipulated that SPD could use facial-recognition software only to compare jail mug-shot images against the photo of someone reasonably suspected of criminal activity. Whitcomb was unsure why SPD had stopped using the software.
Shankar Narayan, ACLU of Washington’s technology and liberty project director, called Amazon’s announcement a “welcome step,” but he expressed concern over the type of regulation the company will propose.
“Face surveillance does not impact communities equally — a growing range of evidence shows it is likely to disproportionately impact vulnerable communities such as people of color, religious minorities and many others. Those very communities should have an opportunity, based on their historic experience, to say no to use of face surveillance by government entities, before we even get to the question of how the technology should be implemented,” said Narayan. “It’s our hope that tech companies such as Amazon and Microsoft, consistent with their public statements of concern around face surveillance, will support that happening.”