The Metropolitan King County Council voted Tuesday to ban the use of facial recognition technology by the county Sheriff’s Office and all other county departments, citing the technology’s threat to privacy and history of bias.
The new measure prohibits county departments from acquiring facial recognition technology or using facial recognition information. Departments, including the Sheriff’s Office, can still use facial recognition evidence as long as they don’t produce or request it themselves. They will also be permitted to continue using the technology in service of the federal program that searches for missing children.
The law would allow people to sue if facial recognition technology is used in violation of the measure. It would not ban the use of such technology by any other government within King County, or by private groups or individuals.
The County Council passed the legislation unanimously Tuesday. King County Executive Dow Constantine supports the measure and will sign it into law, his office said.
King County joins a growing number of jurisdictions across the country that have moved to ban the technology, after a series of studies showed it is frequently biased against people of color.
“I think that the technology raises huge concerns, primarily on the inaccuracy of the technology, demographic biases and encroachment on civil liberties and privacy for everybody,” Councilmember Jeanne Kohl-Welles, the legislation’s lead sponsor, said. “I think it’s not appropriate for us to use.”
Sgt. Tim Meyer, a spokesperson for the King County Sheriff’s Office, said the department currently does not use any technology that would be affected by the ban and doesn’t expect any changes from the new law.
“The Sheriff’s Office operations will not be hindered by the proposed legislation,” Meyer said in an email. “This legislation reflects the values of the communities we serve.”
At least 13 cities across the country have some form of ban on using facial recognition technology, including San Francisco; Boston; Portland, Oregon; Portland, Maine; and Jackson, Mississippi. Vermont has also banned the technology. King County would be the first county to implement a ban, Kohl-Welles said.
In Seattle, a 2018 ordinance requires city agencies, including the Police Department, to get City Council approval before acquiring or using any surveillance technologies. The Seattle Police Department, late last year, said it had “no relationship or intent of relationship” with any facial recognition company, after a detective was found to have signed up for facial recognition software, potentially in violation of the ordinance.
Customs and Border Protection officers use facial recognition technology at Seattle-Tacoma International Airport, although they say they retain the photographs of U.S. citizens for no more than 12 hours.
Facial recognition technology analyzes characteristics of a person’s picture, such as the distance between their eyes and the distance from forehead to chin, and then matches those characteristics against a model or database.
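The matching step described above can be sketched in simplified form: reduce each face to a handful of numeric measurements, then find the closest entry in a database. The measurements, names and threshold below are hypothetical placeholders; real systems use learned embeddings with hundreds of dimensions rather than a few hand-picked distances.

```python
import math

# Hypothetical feature vectors: each face reduced to a few normalized
# geometric measurements (e.g., eye-to-eye distance, forehead-to-chin
# distance). The names and values are illustrative only.
DATABASE = {
    "person_a": [0.42, 0.81, 0.37],
    "person_b": [0.58, 0.66, 0.49],
}

def euclidean(u, v):
    """Straight-line distance between two feature vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def best_match(probe, database, threshold=0.15):
    """Return the closest database entry, or None if nothing is close enough.

    The threshold is an assumed tuning parameter: lowering it trades missed
    matches for fewer false identifications, and vice versa.
    """
    name, dist = min(
        ((n, euclidean(probe, feats)) for n, feats in database.items()),
        key=lambda pair: pair[1],
    )
    return name if dist <= threshold else None
```

In this toy setup, a probe close to a stored vector matches (`best_match([0.43, 0.80, 0.38], DATABASE)` returns `"person_a"`), while a dissimilar probe returns `None`. The threshold choice is where the accuracy and bias problems described below become concrete: an error in the measurements or a poorly chosen cutoff turns into a misidentification.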
Your smartphone can use the technology to make sure it’s you unlocking your phone. Social networks use the technology to tag people in photographs. But its use in law enforcement has proved far thornier.
The technology has been much more effective when a straight-on, standard photo is compared to a standardized database. Identifying faces pulled from a crowd shot, or from a person in motion, is much dicier.
A 2019 National Institute of Standards and Technology study on facial recognition algorithms, including from Microsoft and Intel, showed that systems are up to 100 times more likely to misidentify Black and Asian people than white people. Previous research revealed that Amazon’s facial analysis system misclassified the gender of darker-skinned females 31% of the time. Other studies have shown the technology has much higher error rates for transgender and nonbinary people.
“Facial recognition technology is a powerful, privacy-invasive and racially biased technology that gives the government unprecedented power to automatically identify, locate and track people based on images of their faces,” said Jennifer Lee, technology and liberty manager at the ACLU of Washington. “Use of these systems is not worth the risk.”