Monday, February 26, 2024

The energy costs of generative AI are currently large enough to impact the environment

David Berreby, The Growing Environmental Footprint of Generative AI, Undark, February 20, 2024.

One of the things that David Hays impressed on me when I studied with him back in the 1970s is that computing, real computing, is a physical process. It requires physical resources and extends out in time. The emergence of generative AI makes that abundantly clear. From the article:

AI can run on many devices — the simple AI that autocorrects text messages will run on a smartphone. But the kind of AI people most want to use is too big for most personal devices, Dodge said. “The models that are able to write a poem for you, or draft an email, those are very large,” he said. “Size is vital for them to have those capabilities.”
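
To make "too big for most personal devices" concrete, here's a quick back-of-the-envelope sketch. The parameter counts and the two-bytes-per-parameter (fp16) figure are my illustrative assumptions, not numbers from the article:

```python
# Back-of-the-envelope check on why large models don't fit on phones.
# Parameter counts and phone RAM below are illustrative assumptions,
# not figures from the article.

def model_memory_gb(params_billion: float, bytes_per_param: int = 2) -> float:
    """Memory needed just to hold the weights (fp16 = 2 bytes/param)."""
    return params_billion * 1e9 * bytes_per_param / 1e9

for name, size_b in [("autocorrect-scale model", 0.05),
                     ("7B chat model", 7),
                     ("70B chat model", 70)]:
    print(f"{name}: ~{model_memory_gb(size_b):.1f} GB of weights")

# A typical phone has ~8 GB of RAM shared with everything else, so
# only the smallest model above could plausibly run on-device.
```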

Big AIs need to run immense numbers of calculations very quickly, usually on specialized Graphics Processing Units — processors originally designed for intense computation to render graphics on computer screens. Compared to other chips, GPUs are more energy-efficient for AI, and they’re most efficient when they’re run in large “cloud data centers” — specialized buildings full of computers equipped with those chips. The larger the data center, the more energy efficient it can be. Improvements in AI’s energy efficiency in recent years are partly due to the construction of more “hyperscale data centers,” which contain many more computers and can quickly scale up. Where a typical cloud data center occupies about 100,000 square feet, a hyperscale center can be 1 or even 2 million square feet.
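
The article doesn't explain the mechanism behind "larger is more efficient," but one standard way to think about it is Power Usage Effectiveness (PUE): total facility energy divided by the energy actually delivered to the computing equipment. A minimal sketch, using assumed PUE values in the rough range reported for typical versus hyperscale facilities:

```python
# Illustrative sketch of why larger data centers can be more energy
# efficient, via Power Usage Effectiveness (PUE). The PUE values are
# assumed, rough industry-style figures, not numbers from the article.

def facility_energy_mwh(it_load_mwh: float, pue: float) -> float:
    """Total energy the facility draws to deliver a given IT load."""
    return it_load_mwh * pue

it_load = 10_000  # MWh of actual compute work, identical in both cases

typical = facility_energy_mwh(it_load, pue=1.5)     # smaller cloud center
hyperscale = facility_energy_mwh(it_load, pue=1.1)  # hyperscale center

print(f"typical:    {typical:,.0f} MWh")
print(f"hyperscale: {hyperscale:,.0f} MWh")
print(f"overhead saved: {typical - hyperscale:,.0f} MWh")
```

Same computing done, but the hyperscale facility wastes far less on cooling and other overhead.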

Estimates of the number of cloud data centers worldwide range from around 9,000 to nearly 11,000. More are under construction. The International Energy Agency, or IEA, projects that data centers’ electricity consumption in 2026 will be double that of 2022 — 1,000 terawatt-hours, roughly equivalent to Japan’s current total consumption.
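
It's worth noting what that doubling implies as an annual rate. A one-line calculation, taking the 2022 baseline of roughly 500 TWh implied by the quote:

```python
# Quick arithmetic on the IEA projection: doubling between 2022 and
# 2026 implies roughly 19% compound annual growth. The 2022 baseline
# is inferred from "double that of 2022 -- 1,000 terawatt-hours".

consumption_2022_twh = 500
consumption_2026_twh = 1_000
years = 2026 - 2022

cagr = (consumption_2026_twh / consumption_2022_twh) ** (1 / years) - 1
print(f"implied annual growth: {cagr:.1%}")  # ~18.9% per year
```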

However, as an illustration of one problem with the way AI impacts are measured, that IEA estimate includes all data center activity, which extends beyond AI to many aspects of modern life. Running Amazon’s store interface, serving up Apple TV’s videos, storing millions of people’s emails on Gmail, and “mining” Bitcoin are also performed by data centers. (Other IEA reports exclude crypto operations, but still lump all other data-center activity together.)

Yet:

Another complication is the fact that AI, unlike Bitcoin mining or online shopping, can be used to reduce humanity’s impacts. AI can improve climate models, find more efficient ways to make digital tech, reduce waste in transport, and otherwise cut carbon and water use. One estimate, for example, found that AI-run smart homes could reduce households’ CO2 emissions by up to 40 percent. And a recent Google project found that an AI rapidly crunching atmospheric data can guide airline pilots to flight paths that will leave the fewest contrails.

Because contrails create more than a third of commercial aviation’s contribution to global warming, “if the whole aviation industry took advantage of this single A.I. breakthrough,” says Dave Patterson, a computer-science professor emeritus at UC Berkeley and a Google researcher, “this single discovery would save more CO₂e (CO₂ and other greenhouse gases) than the CO₂e from all A.I. in 2020.”

Patterson’s analysis predicts that AI’s carbon footprint will soon plateau and then begin to shrink, thanks to improvements in the efficiency with which AI software and hardware use energy. One reflection of that efficiency improvement: as AI usage has increased since 2019, its percentage of Google data-center energy use has held at less than 15 percent. And while global internet traffic has increased more than twentyfold since 2010, the share of the world’s electricity used by data centers and networks increased far less, according to the IEA.
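
A rough sanity check on that last claim: if traffic grew more than twentyfold while electricity use grew far less, then energy per unit of traffic must have fallen steeply. The electricity growth factor below is my assumption for illustration, not a figure from the article:

```python
# Rough check on the efficiency claim: traffic up >20x since 2010
# while data-center/network electricity grew far less implies a steep
# drop in energy per unit of traffic. The electricity growth factor
# is an assumed illustration, not a figure from the article.

traffic_growth = 20        # "more than twentyfold since 2010"
electricity_growth = 1.5   # assumed modest growth, for illustration

energy_per_traffic = electricity_growth / traffic_growth
print(f"energy per unit of traffic fell to ~{energy_per_traffic:.0%} "
      f"of its 2010 level under these assumptions")
```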

However, data about improving efficiency doesn’t convince some skeptics, who cite a social phenomenon called “Jevons paradox”: Making a resource less costly sometimes increases its consumption in the long run.
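
The Jevons worry is easy to state as a toy model: an efficiency gain lowers the effective cost per AI query, demand responds to cost with some elasticity, and if demand is elastic enough, total energy use rises even as each query gets cheaper. A minimal sketch, with assumed elasticities rather than empirical estimates:

```python
# Minimal toy model of the Jevons-paradox concern. Efficiency lowers
# cost per query; demand scales as cost ** (-elasticity); total energy
# is demand times energy per query. Elasticity values are assumed
# illustrations, not empirical estimates.

def total_energy(efficiency_gain: float, demand_elasticity: float,
                 baseline_energy: float = 100.0) -> float:
    """Energy use after per-query efficiency improves by `efficiency_gain`x."""
    cost_factor = 1 / efficiency_gain            # cost per query falls
    demand_factor = cost_factor ** (-demand_elasticity)
    return baseline_energy * demand_factor / efficiency_gain

# Efficiency doubles. With inelastic demand, energy falls:
print(total_energy(2.0, demand_elasticity=0.5))  # ~70.7: savings hold
# With elastic demand, energy rises despite the efficiency gain:
print(total_energy(2.0, demand_elasticity=1.5))  # ~141.4: Jevons rebound
```

Whether the savings hold thus depends entirely on how demand responds, which is exactly the skeptics' point.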

There's more at the link.
