Each big step of progress in computing — from mainframe to personal computer to internet to smartphone — has opened opportunities for more people to invent on the digital frontier.
But there is growing concern that trend is being reversed at tech’s new leading edge, artificial intelligence.
Computer scientists say AI research is becoming increasingly expensive, requiring complex calculations done by giant data centers. That leaves fewer people with easy access to the computing firepower necessary to develop the technology behind futuristic products like self-driving cars or digital assistants that can see, talk and reason.
The danger, they say, is that pioneering artificial intelligence research will be a field of haves and have-nots. And the haves will be mainly a few big tech companies like Google, Microsoft, Amazon and Facebook, which each spend billions a year building out their data centers.
In the have-not camp, they warn, will be university labs, which have traditionally been a wellspring of innovations that eventually power new products and services.
“The huge computing resources these companies have pose a threat — the universities cannot compete,” said Craig Knoblock, executive director of the Information Sciences Institute, a research lab at the University of Southern California.
The research scientists’ warnings come amid rising concern about the power of the big tech companies. Most of the focus has been on the current generation of technology: search, online advertising, social media and e-commerce. But the scientists are worried about a barrier to exploring the technological future when that requires staggering amounts of computing.
The modern data centers of the big tech companies are sprawling and secretive. The buildings are the size of football fields, or larger, housing rack upon rack with hundreds of thousands of computers. The doors are bulletproof. The walls are fireproof. Outsiders are rarely allowed in.
These are the engine rooms of cloud computing. They help deliver a cornucopia of entertainment and information to smartphones and laptops, and they enable millions of developers to write cloud-based software applications.
But artificial intelligence researchers, outside the big tech companies, see a worrying trend in their field. A recent report from the Allen Institute for Artificial Intelligence observed that the volume of calculations needed to be a leader in AI tasks like language understanding, game playing and common-sense reasoning has soared an estimated 300,000-fold in the past six years.
All that computing fuel is needed to turbocharge so-called deep-learning software models, whose performance improves with more calculations and more data. Deep learning has been the primary driver of AI breakthroughs in recent years.
“When it’s successful, there is a huge benefit,” said Oren Etzioni, chief executive of the Allen Institute, founded in 2014 by Paul Allen, the billionaire co-founder of Microsoft. “But the cost of doing research is getting exponentially higher. As a society and an economy, we suffer if there are only a handful of places where you can be on the cutting edge.”
The evolution of one artificial intelligence lab, OpenAI, shows the changing economics, as well as the promise of deep-learning AI technology.
Founded in 2015, with backing from Elon Musk, OpenAI began as a nonprofit research lab. Its ambition was to develop technology at the frontier of artificial intelligence and share the benefits with the wider world. It was a vision that suggested the computing tradition of an inspired programmer, working alone on a laptop, coming up with a big idea.
This spring, OpenAI used its technology to defeat the world champion team of human players at a complex video game called Dota 2. Its software learned the game by constant trial and error over months, the equivalent of more than 45,000 years of game play.
The OpenAI scientists have realized they are engaged in an endeavor more like particle physics or weather simulation, fields demanding huge computing resources. Winning at Dota 2, for example, required spending millions of dollars renting access to tens of thousands of computer chips inside the cloud computing data centers run by companies like Google and Microsoft.
This year, OpenAI morphed into a for-profit company to attract financing and, in July, announced that Microsoft was making a $1 billion investment. Most of the money, OpenAI said, would be spent on the computing power it needed to pursue its goals, which still include widely sharing the benefits of AI after paying off investors.
As part of OpenAI’s agreement with Microsoft, the software giant will eventually become the lab’s sole source of computing.
“If you don’t have enough compute, you can’t make a breakthrough,” said Ilya Sutskever, chief scientist of OpenAI.
Academics are also raising concerns about the power consumed by advanced AI software. Training a single large deep-learning model can generate the same carbon footprint as five American cars over their lifetimes, gas included, three computer scientists at the University of Massachusetts, Amherst, estimated in a recent research paper. (The big tech companies say they buy as much renewable energy as they can, reducing the environmental impact of their data centers.)
Etzioni and his co-authors at the Allen Institute say that perhaps both concerns — about power use and the cost of computing — could be at least partially addressed by changing how success in AI technology is measured.
The field’s single-minded focus on accuracy, they say, skews research along too narrow a path. Efficiency should be considered as well; they suggest that researchers also report the “computational price tag” of achieving a result in a project.
Since their “Green AI” paper was published in July, their message has resonated with many in the research community.
Henry Kautz, a professor of computer science at the University of Rochester, noted that accuracy is “really only one dimension we care about in theory and in practice.” Others, he said, include how much energy is used, how much data is required and how much skilled human effort is needed for AI technology to work.
A more multidimensional view, Kautz added, could help level the playing field between academic researchers and computer scientists at the big tech companies, if research projects relied less on raw computing firepower.
Big tech companies are pursuing greater efficiency in their data centers and their artificial intelligence software, which they say will make computing power more available to outside developers and academics.
John Platt, a distinguished scientist in Google’s artificial intelligence division, points to its recent development of deep-learning models, EfficientNets, which are much smaller and faster than conventional ones. “That democratizes use,” he said. “We want these models to be trainable and accessible by as many people as possible.”
The big tech companies have given universities many millions over the years in grants and donations, but some computer scientists say they should do more to close the gap between the AI research haves and have-nots. Today, they say, the relationship that tech giants have with universities is largely that of a buyer, hiring away professors, graduate students and even undergraduates.
The companies would be wise to also provide substantial support for academic research, including much greater access to their wealth of computing, so the competition for ideas and breakthroughs extends beyond corporate walls, said Ed Lazowska, a professor at the University of Washington.
A more supportive relationship, Lazowska argues, would be in their corporate self-interest. Otherwise, he said, “We’ll see a significant dilution of the ability of the academic community to produce the next generation of computer scientists who will power these companies.”
At the Allen Institute in Seattle, Etzioni said, the team will pursue techniques to improve the efficiency of artificial intelligence technology. “This is a big push for us,” he said.
But Etzioni emphasized that what he was calling green AI should be seen as “an opportunity for additional ingenuity, not a restraint” — or a replacement for deep learning, which relies on vast computing power, and which he calls red AI.
Indeed, the Allen Institute has just reached an AI milestone by correctly answering more than 90% of the questions on a standard eighth-grade science test. That feat was achieved with the red AI tools of deep learning.