Steve Lohr, "At Tech's Leading Edge, Worry About a Concentration of Power," The New York Times, Sept. 26, 2019:
Computer scientists say A.I. research is becoming increasingly expensive, requiring complex calculations done by giant data centers, leaving fewer people with easy access to the computing firepower necessary to develop the technology behind futuristic products like self-driving cars or digital assistants that can see, talk and reason.
The danger, they say, is that pioneering artificial intelligence research will be a field of haves and have-nots. And the haves will be mainly a few big tech companies like Google, Microsoft, Amazon and Facebook, which each spend billions a year building out their data centers.
In the have-not camp, they warn, will be university labs, which have traditionally been a wellspring of innovations that eventually power new products and services.
“The huge computing resources these companies have pose a threat — the universities cannot compete,” said Craig Knoblock, executive director of the Information Sciences Institute, a research lab at the University of Southern California. [...] the scientists are worried about a barrier to exploring the technological future, when that requires staggering amounts of computing. [...]
A recent report from the Allen Institute for Artificial Intelligence observed that the volume of calculations needed to be a leader in A.I. tasks like language understanding, game playing and common-sense reasoning has soared an estimated 300,000 times in the last six years.
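A quick back-of-the-envelope check on that figure (a rough sketch; the 300,000-fold and six-year numbers are the article's, the doubling-time arithmetic is mine):

```python
import math

# Rough sketch: what doubling period is implied by the article's figures?
# Assumption: a 300,000x increase in compute spread over six years (72 months).
growth_factor = 300_000
months = 6 * 12

doublings = math.log2(growth_factor)   # ~18.2 doublings
doubling_period = months / doublings   # ~4 months per doubling

print(f"{doublings:.1f} doublings -> one doubling roughly every {doubling_period:.1f} months")
```

For comparison, Moore's Law-style hardware improvement doubles roughly every two years, so growth at this pace has come mostly from throwing more hardware and money at the problem, not from better chips alone.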
All that computing fuel is needed to turbocharge so-called deep-learning software models, whose performance improves with more calculations and more data. Deep learning has been the primary driver of A.I. breakthroughs in recent years.
Power hogs:
Academics are also raising concerns about the power consumed by advanced A.I. software. Training a large, deep-learning model can generate the same carbon footprint as the lifetime of five American cars, including gas, three computer scientists at the University of Massachusetts, Amherst, estimated in a recent research paper. (The big tech companies say they buy as much renewable energy as they can, reducing the environmental impact of their data centers.)
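The general shape of such an estimate is simple energy accounting (a minimal sketch of the usual method, not the Amherst paper's exact inputs; every number below is an illustrative assumption):

```python
# Sketch of the standard energy-to-carbon accounting behind such estimates.
# All figures are illustrative assumptions, not the Amherst paper's numbers.
gpu_count = 8              # assumed number of accelerators
gpu_power_kw = 0.3         # assumed average draw per accelerator, in kW
training_hours = 24 * 30   # assumed month-long training run
pue = 1.6                  # assumed data-center power usage effectiveness
co2_lb_per_kwh = 0.95      # assumed grid average, pounds of CO2 per kWh

energy_kwh = gpu_count * gpu_power_kw * training_hours * pue
co2_lb = energy_kwh * co2_lb_per_kwh
print(f"~{energy_kwh:,.0f} kWh -> ~{co2_lb:,.0f} lb CO2")
```

The headline five-car figure comes from a much larger experiment in that paper (an architecture search involving many trained models); the point of the arithmetic is that the footprint scales directly with hardware, hours, and the local power grid.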
What are the appropriate metrics?
The field’s single-minded focus on accuracy, the researchers say, skews research along too narrow a path.
Efficiency should be weighed as well; they suggest that researchers report the “computational price tag” of achieving a result alongside the result itself.
Henry Kautz, a professor of computer science at the University of Rochester, noted that accuracy is “really only one dimension we care about in theory and in practice.” Others, he said, include how much energy is used, how much data is required and how much skilled human effort is needed for A.I. technology to work.
A more multidimensional view, Mr. Kautz added, could help level the playing field between academic researchers and computer scientists at the big tech companies, if research projects relied less on raw computing firepower.
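One way to make that concrete (a minimal sketch of what a "computational price tag" report might look like; the field names and figures are illustrative, not any standard):

```python
from dataclasses import dataclass

@dataclass
class ResultReport:
    """A result reported together with its computational price tag."""
    accuracy: float          # the usual headline number
    gpu_hours: float         # raw compute spent
    energy_kwh: float        # estimated energy consumed
    training_examples: int   # how much data was required
    person_weeks: float      # skilled human effort (tuning, labeling, etc.)

# Illustrative comparison: a slightly less accurate model that is far cheaper to build.
big = ResultReport(accuracy=0.93, gpu_hours=12_000, energy_kwh=4_000,
                   training_examples=50_000_000, person_weeks=10)
small = ResultReport(accuracy=0.91, gpu_hours=150, energy_kwh=60,
                     training_examples=1_000_000, person_weeks=2)

for name, r in [("big", big), ("small", small)]:
    print(f"{name}: acc={r.accuracy:.2f}, {r.gpu_hours:,.0f} GPU-hours, "
          f"{r.energy_kwh:,.0f} kWh, {r.training_examples:,} examples")
```

Scoring on several axes at once, rather than accuracy alone, is what would let a lab without a giant data center compete on the dimensions it can actually move.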
Keep in mind what the human brain accomplishes on roughly 20 watts, nothing remotely approaching those power requirements.