Guest Blogs

As Use of A.I. Soars, So Does the Energy and Water It Requires

Generative artificial intelligence uses massive amounts of energy for computation and data storage—accountability around AI's environmental impact is coming

Inside the Guian Data Center of China Unicom, which uses artificial intelligence in its operations. Tao Liang / Xinhua via Getty Images

Two months after its release in November 2022, OpenAI’s ChatGPT had 100 million active users, and suddenly tech corporations were racing to offer the public more “generative A.I.” Pundits compared the new technology’s impact to the Internet, or electrification, or the Industrial Revolution—or the discovery of fire.

Time will sort hype from reality, but one consequence of the explosion of artificial intelligence is clear: this technology’s environmental footprint is large and growing.

A.I. use is directly responsible for carbon emissions from non-renewable electricity and for the consumption of millions of gallons of fresh water, and it indirectly boosts impacts from building and maintaining the power-hungry equipment on which A.I. runs. As tech companies seek to embed high-intensity A.I. into everything from resume-writing to kidney transplant medicine and from choosing dog food to climate modeling, they cite many ways A.I. could help reduce humanity’s environmental footprint. But legislators, regulators, activists, and international organizations now want to make sure the benefits aren’t outweighed by A.I.’s mounting hazards.

“The development of the next generation of A.I. tools cannot come at the expense of the health of our planet,” Massachusetts Senator Edward Markey (D) said last week in Washington, after he and other senators and representatives introduced a bill that would require the federal government to assess A.I.’s current environmental footprint and develop a standardized system for reporting future impacts. Similarly, the European Union’s “A.I. Act,” approved by member states last week, will require “high-risk A.I. systems” (which include the powerful “foundation models” that power ChatGPT and similar A.I.s) to report their energy consumption, resource use, and other impacts throughout their systems’ lifecycle. The EU law takes effect next year.

Meanwhile, the International Organization for Standardization, a global network that develops standards for manufacturers, regulators, and others, says it will issue criteria for “sustainable A.I.” later this year. Those will include standards for measuring energy efficiency, raw material use, transportation, and water consumption, as well as practices for reducing A.I. impacts throughout its life cycle, from the process of mining materials and making computer components to the electricity consumed by its calculations. The ISO wants to enable A.I. users to make informed decisions about their A.I. consumption.

An Amazon data center in a Northern Virginia suburb. Jahi Chikwendiu / The Washington Post via Getty Images

Right now, it’s not possible to tell how your A.I. request for homework help or a picture of an astronaut riding a horse will affect carbon emissions or freshwater stocks. This is why 2024’s crop of “sustainable A.I.” proposals describe ways to get more information about A.I. impacts.

In the absence of standards and regulations, tech companies have been reporting whatever they choose, however they choose, about their A.I. impact, says Shaolei Ren, an associate professor of electrical and computer engineering at UC Riverside, who has been studying the water costs of computation for the past decade. Working from Microsoft's reported annual water use for cooling, Ren estimates that a person who engages in a session of questions and answers with GPT-3 (roughly 10 to 50 responses) drives the consumption of a half-liter of fresh water. "It will vary by region, and with a bigger A.I., it could be more." But a great deal remains unrevealed about the millions of gallons of water used to cool computers running A.I., he says.
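
Ren's figure can be turned into a rough per-response number. A minimal sketch, using only the half-liter-per-session and 10-to-50-responses figures cited above (not Ren's actual methodology):

```python
# Back-of-envelope sketch: scale the reported ~0.5 L of cooling water
# per GPT-3 question-and-answer session down to a per-response figure,
# using the article's range of 10-50 responses per session.
LITERS_PER_SESSION = 0.5          # figure cited in the article
RESPONSES_PER_SESSION = (10, 50)  # range cited in the article

def water_per_response_ml(liters_per_session, responses_per_session):
    """Fresh water consumed per individual response, in milliliters."""
    return liters_per_session / responses_per_session * 1000

low = water_per_response_ml(LITERS_PER_SESSION, RESPONSES_PER_SESSION[1])
high = water_per_response_ml(LITERS_PER_SESSION, RESPONSES_PER_SESSION[0])
print(f"Roughly {low:.0f}-{high:.0f} mL of water per response")  # 10-50 mL
```

As Ren notes, the real number varies by region and model size; this only shows the order of magnitude implied by the cited estimate.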

The same is true of carbon.

“Data scientists today do not have easy or reliable access to measurements of [greenhouse gas impacts from A.I.], which precludes development of actionable tactics,” a group of 10 prominent researchers on A.I. impacts wrote in a 2022 conference paper. Since they presented their article, A.I. applications and users have proliferated, but the public is still in the dark about those data, says Jesse Dodge, a research scientist at the Allen Institute for Artificial Intelligence in Seattle, who was one of the paper’s coauthors.

A.I. can run on many devices—the simple A.I. that autocorrects text messages will run on a smartphone. But the kind of A.I. people most want to use is too big for most personal devices, Dodge says. “The models that are able to write a poem for you, or draft an email, those are very large,” he says. “Size is vital for them to have those capabilities.”

Big A.I.s need to run immense numbers of calculations very quickly, usually on specialized Graphical Processing Units—processors originally designed for intense computation to render graphics on computer screens. Compared to other chips, GPUs are more energy-efficient for A.I., and they’re most efficient when they’re run in large “cloud data centers”—specialized buildings full of computers equipped with those chips. The larger the data center, the more energy efficient it can be. Improvements in A.I.’s energy efficiency in recent years are partly due to the construction of more “hyperscale data centers,” which contain many more computers and can quickly scale up. Where a typical cloud data center occupies about 100,000 square feet, a hyperscale center can be 1 or even 2 million square feet.

Estimates of the number of cloud data centers worldwide range from around 9,000 to nearly 11,000, with more under construction. The International Energy Agency (IEA) projects that data centers' electricity consumption in 2026 will be double that of 2022: 1,000 terawatt-hours, roughly equivalent to Japan's current annual consumption.
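
The pace implied by that projection is worth spelling out. A small sketch (the 2022 baseline of ~500 TWh is simply inferred from "double"; only the 1,000 TWh figure for 2026 comes from the article):

```python
# Rough arithmetic on the IEA projection cited above: consumption
# doubling between 2022 and 2026 implies a compound annual growth
# rate of roughly 19 percent.
twh_2026 = 1000.0
twh_2022 = twh_2026 / 2   # inferred baseline, since 2026 is "double" 2022
years = 2026 - 2022

cagr = (twh_2026 / twh_2022) ** (1 / years) - 1
print(f"Implied growth: {cagr:.1%} per year")
```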

A QTS data center under construction in Litchfield Park, Arizona last month. Ash Ponders / Bloomberg via Getty Images

However, as an illustration of one problem with the way A.I. impacts are measured, that IEA estimate includes all data center activity, which extends beyond A.I. to many aspects of modern life. Running Amazon’s store interface, serving up Apple TV’s videos, storing millions of people’s emails on Gmail, and “mining” Bitcoin are also performed by data centers. (Other IEA reports exclude crypto operations, but still lump all other data-center activity together.)

Most tech firms that run data centers don’t reveal what percentage of their energy use processes A.I. The exception is Google, which says “machine learning”—the basis for humanlike A.I.—accounts for somewhat less than 15 percent of its data centers’ energy use.

Another complication is the fact that A.I., unlike Bitcoin mining or online shopping, can be used to reduce humanity’s impacts. A.I. can improve climate models, find more efficient ways to make digital tech, reduce waste in transport, and otherwise cut carbon and water use. One estimate, for example, found that A.I.-run smart homes could reduce households’ CO₂ emissions by up to 40 percent. And a recent Google project found that an A.I. rapidly crunching atmospheric data can guide airline pilots to flight paths that will leave the fewest contrails.

Because contrails create more than a third of global aviation’s contribution to warming, “if the whole aviation industry took advantage of this single A.I. breakthrough,” says Dave Patterson, a computer-science professor emeritus at UC Berkeley and a Google researcher, “this single discovery would save more CO₂ than the CO₂ from all A.I. in 2020.”

Patterson’s analysis predicts that A.I.’s carbon footprint will soon plateau and then begin to shrink, thanks to improvements in the efficiency with which A.I. software and hardware use energy. One reflection of that efficiency improvement: as A.I. usage has increased since 2019, its percentage of Google data-center energy use has held at less than 15 percent. And while global internet traffic has increased more than twentyfold since 2010, the share of the world’s electricity used by data centers and networks increased far less, according to the IEA.

However, data about improving efficiency doesn’t convince some skeptics, who cite a social phenomenon called “Jevons paradox”: Making a resource less costly sometimes increases its consumption in the long run. “It’s a rebound effect,” Ren says. “You make the freeway wider, people use less fuel because traffic moves faster, but then you get more cars coming in. You get more fuel consumption than before.” If home heating is 40 percent more efficient due to A.I., one critic recently wrote, people could end up keeping their homes warmer for more hours of the day.
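
The rebound effect the skeptics describe is easy to see numerically. An illustrative sketch (the numbers are hypothetical, not from any study; only the 40 percent efficiency figure echoes the article):

```python
# Jevons-paradox illustration: an efficiency gain lowers the cost per
# unit of use, but if usage grows enough in response, total consumption
# can end up higher than before.
def net_consumption(baseline, efficiency_gain, usage_growth):
    """Total consumption after an efficiency gain and a demand rebound."""
    return baseline * (1 - efficiency_gain) * (1 + usage_growth)

baseline = 100.0  # arbitrary units of energy

# 40% more efficient, usage up 50%: consumption falls below baseline.
print(net_consumption(baseline, 0.40, 0.50))
# 40% more efficient, usage up 80%: consumption exceeds the baseline.
print(net_consumption(baseline, 0.40, 0.80))
```

Whether efficiency gains or the rebound dominates is an empirical question, which is exactly why the skeptics are unconvinced by efficiency data alone.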

Ai-Da Robot, an AI-powered robot artist, addressing the British House of Lords, October 11, 2022. Rob Pinney / Getty Images

“A.I. is an accelerant for everything,” Dodge says. “It makes whatever you’re developing go faster.” At the Allen Institute, A.I. has helped develop better programs to model the climate, track endangered species, and curb overfishing, he says. But globally A.I. could also support “a lot of applications that could accelerate climate change. This is where you get into ethical questions about what kind of A.I. you want.”

If global electricity use can feel a bit abstract, data centers’ water use is a more local and tangible issue—particularly in drought-afflicted areas. To cool delicate electronics in the clean interiors of the data centers, water has to be free of bacteria and impurities that could gunk up the works. In other words, data centers often compete “for the same water people drink, cook, and wash with,” says Ren.

In 2022, Ren says, Google’s data centers consumed about 5 billion gallons (nearly 20 billion liters) of fresh water for cooling. (“Consumptive use” does not include water that’s run through a building and then returned to its source.) According to a recent study by Ren, Google’s data centers used 20 percent more water in 2022 than they did in 2021, and Microsoft’s water use rose by 34 percent in the same period. (Google data centers host its Bard chatbot and other generative A.I.s; Microsoft servers host ChatGPT as well as its bigger siblings GPT-3 and GPT-4. All three are produced by OpenAI, in which Microsoft is a large investor.)

As more data centers are built or expanded, their neighbors have been troubled to find out how much water they take. For example, in The Dalles, Oregon, where Google runs three data centers and plans two more, the city government filed a lawsuit in 2022 to keep Google’s water use a secret from farmers, environmentalists, and Native American tribes who were concerned about its effects on agriculture and on the region’s animals and plants. The city withdrew its suit early last year. The records it then made public showed that Google’s three extant data centers use more than a quarter of the city’s water supply. And in Chile and Uruguay, protests have erupted over planned Google data centers that would tap into the same reservoirs that supply drinking water.

Most of all, researchers say, what’s needed is a change of culture within the rarefied world of A.I. development. Generative A.I.’s creators need to focus beyond the technical leaps and bounds of their newest creations and be less guarded about the details of the data, software, and hardware they use to create it.

Some day in the future, Dodge says, an A.I. might be able—or be legally obligated—to inform a user about the water and carbon impact of each distinct request she makes. “That would be a fantastic tool that would help the environment,” he says. For now, though, individual users don’t have much information or power to know their A.I. footprint, much less make decisions about it.

“There’s not much individuals can do, unfortunately,” Ren says. Right now, you can “try to use the service judiciously,” he says.

David Berreby writes the Robots for the Rest of Us newsletter. His work about AI and robotics has appeared in The New York Times, National Geographic, Slate, and other publications. This article originally appeared on Yale Environment 360 on February 6, 2024. It is republished here with their permission.

4 Comments

  1. quicksilvervt | | #1

    How is this relevant to building? I'm paying for building advice.

    1. Deleted | | #2

      “[Deleted]”


  3. Hammel_Shaver | | #4

    "Another complication is the fact that A.I., unlike Bitcoin mining or online shopping, can be used to reduce humanity’s impacts. "

    It's understandable that many people have only heard of Bitcoin and that it "uses a lot of energy". However, how, where, and when it uses energy makes a significant difference as to its environmental impact, and indeed its usefulness to environmentally friendly energy production.

    This study from Cornell found that Bitcoin miners partnering with pre-grid connected renewable energy infrastructure can increase the profitability of renewable energy build out.
    https://pubs.acs.org/doi/10.1021/acssuschemeng.3c05445

    https://www.barrons.com/news/renewables-overproduction-turns-electricity-prices-negative-2f133a45
    As the article describes, many grids with a larger-than-average share of wind and solar generation often have periods when a significant percentage of their electricity has negative prices! Energy grids are a balancing act: they must be able to meet the few days of peak demand each year while not overproducing the rest of the year. Wind and solar are tricky, intermittent producers and as such will often overproduce when there is not enough demand to consume that energy. Having a contract with an energy buyer that responds to the production of energy would be very useful!

    Bitcoin miners are a unique power consumer. Because something like 80 percent of the operating expense of Bitcoin mining is electricity, miners seek the cheapest possible electricity. They are also unique in that they are interruptible, portable, and scalable. This means miners can turn off and on at a moment's notice and locate themselves near energy grids that often have negatively priced energy, and those grids happen to be powered by renewable sources.

    There are many more nuances to Bitcoin miners and their relationships with grids and energy production, as well as a byproduct of their energy consumption: heat! https://www.msn.com/en-us/money/markets/marathon-to-heat-finnish-community-using-bitcoin-mining-waste-heat/ar-BB1oHJzF

    There is also a hydro-powered Bitcoin mine in the Congo that uses its waste heat to dry cocoa and uses the income from mining Bitcoin to financially support the national park it is located in!
    https://www.technologyreview.com/2023/01/13/1066820/cryptocurrency-bitcoin-mining-congo-virunga-national-park/
