LONDON (AP) — Scientists and tech experts — including professor Stephen Hawking and Apple co-founder Steve Wozniak — warned Tuesday of a global arms race with weapons using artificial intelligence.
In an open letter with hundreds of signatories, the experts argued that if any major military power pushes ahead with development of autonomous weapons, “a global arms race is virtually inevitable, and the endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow.”
Some have argued in favor of robots on the battlefield, saying their use could save lives, though such weapons are still years away.
But the scientists warned that, unlike nuclear weapons, autonomous weapons will require no costly or hard-to-obtain raw materials once developed — making it possible to mass-produce them.
“It will only be a matter of time until they appear on the black market and in the hands of terrorists, dictators wishing to better control their populace, warlords wishing to perpetrate ethnic cleansing, etc.,” the letter said.
The signatories included leading figures globally in academia and business studying artificial intelligence — the idea that computer systems could replicate tasks normally requiring human intelligence, such as language translation or visual perception. They were joined by philosophers, historians, sociologists and geneticists.
Those signing the letter included Elon Musk, CEO of Tesla Motors; Demis Hassabis, founder of Google DeepMind; and Noam Chomsky, an emeritus professor at MIT.
Sean O hEigeartaigh, executive director of Cambridge University’s Center for the Study of Existential Risk, said he hopes for a discussion on whether autonomous weapons should fall into the same category as chemical weapons and blinding lasers — namely, that they be shunned.
“It’s imperative to hear the voices of the scientists,” he said of the many who have devoted their lives to having such systems benefit humanity.