A report from the Center on Privacy & Technology applauds Seattle while chastising many other law-enforcement agencies for poor facial-recognition database controls that have put half of U.S. adults in a “massive virtual line-up.”
Seattle police employ some of the best safeguards and practices in their use of facial-recognition technology, according to a new report that cites a lack of oversight and controls in other law-enforcement agencies nationwide.
The 150-page report, released Tuesday by the Center on Privacy & Technology at Georgetown Law in Washington, D.C., found that half of American adults — more than 117 million people — are in law-enforcement facial-recognition databases used by police and the FBI, largely through driver’s license and identification photos drawn from at least 26 states.
“Innocent people don’t belong in criminal databases,” Alvaro Bedoya, the center’s executive director and co-author of the report, said in a statement decrying the practice as a “massive virtual line-up.”
“This has never been done for fingerprints or DNA. It’s uncharted and frankly dangerous territory,” Bedoya said of the findings in the report, “The Perpetual Line-Up: Unregulated Police Face Recognition in America.”
Seattle police began using facial-recognition software in 2014 under tight controls: the department runs searches only against jail mug shots, and only when there is reasonable suspicion that the person in the image has committed a crime.
At the time it was launched, the system allowed police to quickly scan some 350,000 jail mug shots to determine whether any matched a suspect’s photo. The software measures distances between individual points on the face — the eyes, the ears, the nose and the chin — and uses an algorithm to compare those measurements between images.
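The landmark-distance idea described above can be illustrated with a minimal sketch. This is not the vendor’s actual algorithm — the landmark coordinates, the choice of pairwise distances, and the matching threshold are all illustrative assumptions:

```python
import math

# Hypothetical 2-D landmark coordinates (eyes, ears, nose, chin) for two
# face images. A real system would extract these with a face detector.
FACE_A = {"left_eye": (30, 40), "right_eye": (70, 40), "left_ear": (10, 50),
          "right_ear": (90, 50), "nose": (50, 60), "chin": (50, 95)}
FACE_B = {"left_eye": (31, 41), "right_eye": (69, 40), "left_ear": (11, 49),
          "right_ear": (89, 51), "nose": (50, 61), "chin": (51, 94)}

def signature(landmarks):
    """All pairwise distances between landmarks, normalized by the
    eye-to-eye spacing so the signature is insensitive to image scale."""
    names = sorted(landmarks)
    eye_dist = math.dist(landmarks["left_eye"], landmarks["right_eye"])
    return [math.dist(landmarks[a], landmarks[b]) / eye_dist
            for i, a in enumerate(names) for b in names[i + 1:]]

def similarity(face1, face2):
    """Mean absolute difference between two signatures; lower is closer."""
    s1, s2 = signature(face1), signature(face2)
    return sum(abs(x - y) for x, y in zip(s1, s2)) / len(s1)

# Flag a candidate match when the signatures differ by less than an
# (assumed) threshold; a real system ranks candidates, not yes/no answers.
score = similarity(FACE_A, FACE_B)
print(score, score < 0.15)
```

Scanning a mug-shot database then amounts to computing this score for each stored face and returning the closest candidates for human review.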
The Seattle City Council approved the department’s use of the software under a policy created with input from the American Civil Liberties Union (ACLU) of Washington — a safeguard singled out in the Georgetown report as apparently unique nationwide.
Seattle requires auditing of the program and doesn’t allow real-time, live camera use of facial recognition.
It also publishes information on the web about its program and requires a 96 percent accuracy rate of its vendor, said Clare Garvie, another author of the report, in a telephone news conference Tuesday.
In a statement, Garvie, a center associate who led its records requests to more than 100 law-enforcement agencies, cited a general lack of controls nationwide.
“With only a few exceptions, there are no laws governing police use of the technology, no standards ensuring its accuracy, and no systems checking for bias. It’s a wild west,” she said.
Seattle came under some scrutiny in a section of the report that says facial recognition disproportionately affects African Americans.
“Many police departments do not realize that,” the report said. “In a Frequently Asked Questions document, the Seattle Police Department says that its face recognition system ‘does not see race.’ Yet an FBI co-authored study (in 2012) suggests that face recognition may be less accurate on black people. Also, due to disproportionately high arrest rates, systems that rely on mug shot databases likely include a disproportionate number of African Americans.”
In a letter sent Tuesday, the American Civil Liberties Union, The Leadership Conference on Civil and Human Rights and more than 40 other civil-rights and civil-liberties organizations called on the U.S. Justice Department’s civil-rights division to investigate potential racial bias in police use of the technology.
Shankar Narayan, technology and liberty project director for the ACLU of Washington, also cited concerns Tuesday about disproportionality and higher error rates involving people of color.
He also said that while Seattle has a “relatively good policy” governing the current software program, it places no restrictions on possible future uses, such as when police body cameras are deployed next year.
He said vigilance is needed under Seattle’s surveillance ordinance.
The FBI said facial-recognition technology is a valuable crime-fighting tool, according to a Washington Post story on the report. The technology provides leads that investigators can pursue but does not by itself single out people for arrest, the newspaper reported.
“Facial recognition algorithms are developed in the computer vision field, based solely on pattern matching techniques,” the FBI said in a statement. “‘Facial recognition’ algorithms do not actually compare ‘faces’ and they do not consider skin color, sex, age, or any other biographic.”