Microsoft recommended that tech companies be required to publish documents that explain their technology’s capabilities and limitations and that people be told when facial recognition systems are being used in a public place.
Microsoft President Brad Smith paints an Orwellian picture of the future in his latest call for government regulation of facial-recognition technology.
Smart camera systems could follow us anywhere, tracking our whereabouts and activities for companies and governments to scrutinize.
“It could follow anyone anywhere, or for that matter, everyone everywhere,” Smith wrote in a blog post Thursday.
Smith also pointed out the benefits of facial-recognition technology, which has received praise for helping police find missing children and identify criminals.
But without regulation, he added, “this use of facial-recognition technology could unleash mass surveillance on an unprecedented scale.”
It’s not too late to put safeguards on the technology before that happens in the U.S., he argues. “We must ensure that the year 2024 doesn’t look like a page from the novel ‘1984,’” he wrote, referring to George Orwell’s dystopian novel.
Smith outlined the company’s recommendations for government regulation and tech-company policies, which Microsoft has been developing since announcing this summer that it would support regulation of facial technology.
The proposals include a law that would inform consumers when facial-recognition technology is being used in a public place. The technology, which uses cameras and advanced machine-learning systems to analyze and identify faces, is becoming increasingly common in the U.S. as it grows more accurate. It is being used as a security measure in schools and at retail stores to observe consumers' shopping patterns.
Microsoft also recommended laws that require people to review results from the artificial intelligence systems before they’re automatically used to make decisions about people’s actions, especially where there could be legal or other important consequences. This could help cut down on instances of bias and discrimination, Smith wrote, an issue that developers of facial-recognition technology have struggled with and come under fire for, especially when related to use of the technology by law enforcement.
Studies have found that several facial-recognition systems make more errors when identifying women and people of color than when identifying white men. Microsoft and others have vowed to work on the problem, and Microsoft notes that its own Face API system has become more accurate at identifying people.
In his blog post Thursday, Smith also recommended tech companies be required to publish documents that clearly explain their technology's capabilities and limitations, allow third-party groups to independently test the systems, and require governments to obtain court orders in many cases before persistently monitoring people with facial-recognition technology.
Smith also outlined measures Microsoft would implement internally at the beginning of next year, including barring customers from using the technology to illegally discriminate, and pledging not to allow its technology to be used in law-enforcement situations that encroach on people's democratic freedoms.
Two civil-liberties organizations acknowledged that Microsoft has done better than other companies in addressing the need for regulations, but said the recommendations announced Thursday did not go far enough.
“Microsoft gets some things right, but unfortunately the protections they’re suggesting are not sufficient,” said Shankar Narayan, who directs the technology and liberty project at the American Civil Liberties Union of Washington. “Their actions won’t prevent ‘1984,’ they will accelerate it,” he said.
Narayan called for a moratorium on tech companies selling facial-recognition technology to government and law-enforcement agencies. Even operating perfectly, he said, the technology can be used to racially profile and discriminate against groups of people.
The Electronic Frontier Foundation echoed the call to prevent companies from selling to law enforcement. Adam Schwartz, a lawyer with the digital-privacy organization, also said consumers and shoppers should have to actively opt in to the technology, not simply be informed they're being watched when they enter public places, such as a grocery store.
“You shouldn’t have to choose between buying a bag of apples for your kids and giving up your privacy,” he said.
Smith on Thursday acknowledged that Microsoft’s regulation recommendations are a first step on a complicated issue.
“This is a first word, not a final word,” he said.
Microsoft has strived to set itself apart from many fellow tech companies this year by positioning itself as a champion of privacy rights.
The Redmond company has gone much further than its counterparts in trying to protect privacy as facial-recognition technology is developed, said Michael Posner, director of the Center for Business and Human Rights at New York University. Some of Microsoft’s ideas will be hard to implement, he said, but “they are generally in the right direction.”
Smith’s recommendations came out the same day that he addressed the Brookings Institution in Washington, D.C., on the same topic and that other tech executives — including Microsoft CEO Satya Nadella — met at the White House in a session to field ideas for securing U.S. dominance in fields such as artificial intelligence, quantum computing and 5G wireless technology.
Earlier Thursday, a group run by tech employees called AI Now also released a report warning of the dangers and calling for audits of government use of artificial intelligence technologies.