Generative AI tools and LLMs such as ChatGPT have exploded onto the tech scene. Here's a look at what that costs the environment and how to decrease the negative impact.
Generative AI is among the top disruptive technologies, and its potential to transform work has driven widespread adoption.
GenAI has significant value to businesses. It has the potential to grow in value each year and even raise global gross domestic product. However, GenAI also carries risks and downsides, including its environmental impact: it demands significant computing resources, carries a hefty carbon footprint and consumes large amounts of water. GenAI also drives the construction of more computing facilities and increases the amount of e-waste.
"There is a large cost to training these large language models, and these computers don't run forever, so there's the environmental cost to that e-waste, as well," said Kevin Walsh, director in the U.S. Government Accountability Office (GAO) IT and cybersecurity team.
GenAI's high energy requirements and carbon footprint
To date, concerns about GenAI's environmental impact have been secondary to businesses' desire to mature the technology and put it to use. However, researchers said more organizations are paying attention to the environmental toll the tech takes and developing ways to minimize it.
GenAI could also help itself here. Many industries have begun to use it and other AI technologies to solve intractable problems, such as human-induced climate change, pollution and society's strain on the planet.
Organizations also face continuing demands to monitor their carbon emissions, report on environmental, social and governance (ESG) issues and improve their sustainability practices. Beyond the growing financial costs of the resources needed to support GenAI, organizations may need to address the technology's resource-intensive nature and find ways to make it more sustainable.
However, exact figures on GenAI's true energy consumption are elusive for various reasons, Walsh said. To start, GenAI developers don't disclose information needed for accurate calculations.
The GAO's report, "Generative AI's Environmental and Human Effects," co-authored by Walsh, states that most commercial GenAI developers don't disclose the emissions from training their models. The report also notes that data centers' energy needs vary based on factors such as location and local weather, which complicates calculating GenAI's specific energy needs. Most data centers also process GenAI alongside other types of computation, so GenAI-specific consumption can't be isolated.
GenAI inference and training
Researchers have also voiced concerns about the environmental costs of AI in general. However, they've specifically called out GenAI because it has more significant energy requirements than other types of intelligent technologies, said Vivek Mishra, senior member of the IEEE.
To understand why GenAI is particularly compute-intensive, it's important to understand how it works.
Training comes first. Here, developers give the model massive datasets so it can learn patterns and relationships -- a process that teaches it to perform specific tasks and produce content when asked.
Inference comes next. Inference happens when trained AI systems go to work and take what they've learned to do the following:
Analyze new data.
Generate content.
Predict outcomes in response to a user's query or prompt for information.
From an environmental standpoint, both training and inference require significant computational power, and that computation consumes large amounts of electricity. As models, training datasets and parameter counts grow, so do the compute and electricity required.
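To make the distinction concrete, the toy sketch below shows both phases in miniature: a training loop that repeatedly runs forward and backward passes over batches of data, and an inference call that runs a single forward pass per query. The tiny model, random data and choice of PyTorch are illustrative assumptions, not any vendor's actual code; real LLMs repeat the training loop over billions of tokens across thousands of GPUs, which is where most of the energy bill comes from.

```python
# Minimal sketch (illustrative only) contrasting training and inference.
import torch
import torch.nn as nn

# A toy stand-in for a language model; real LLMs have billions of parameters.
model = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 4))
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Training: repeated forward AND backward passes over large datasets.
for _ in range(100):                      # stand-in for many epochs and batches
    inputs = torch.randn(32, 16)          # stand-in for a batch of training data
    targets = torch.randint(0, 4, (32,))
    loss = loss_fn(model(inputs), targets)
    optimizer.zero_grad()
    loss.backward()                       # gradient computation roughly doubles the work
    optimizer.step()

# Inference: a single forward pass per query, with no gradients.
# Cheaper per call, but it runs every time a user submits a prompt.
with torch.no_grad():
    prediction = model(torch.randn(1, 16)).argmax(dim=-1)
```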
CO2 footprint of GenAI data centers
The computational work to run GenAI models, such as ChatGPT, Bard and Claude, occurs in large cloud computing data centers. Yet, even before GenAI, data centers left a mark on the environment. The International Energy Agency estimated that data centers accounted for 1% of energy-related global greenhouse gas (GHG) emissions, or approximately 300 million metric tons of carbon dioxide (CO2) equivalent, in 2020.
Since then, data centers have grown significantly in number and size to meet ever-increasing computing demands. AI has contributed to this buildout, and because AI workloads require more energy, they generate more GHG emissions. Data center power demand will likely continue to grow as AI evolves, increasing CO2 emissions over time.

Data center power consumption will account for almost half of the growth in U.S. electricity demand by 2030 because of AI use, according to the International Energy Agency's "Energy and AI" report. The same report projects that, by 2030, the U.S. will use more electricity to process data than to manufacture all energy-intensive goods -- including aluminum, steel, cement and chemicals -- combined.
GenAI also requires more computation and memory than other types of computing, such as a regular internet search, according to Michel Beaudouin-Lafon, chair of the Association for Computing Machinery Europe Technology Policy Council and a computer science professor at Université Paris-Saclay.
"Running those GenAI models efficiently requires GPUs, rather than CPUs, and GPUs are more power hungry. And people don't tend to do one query with GenAI; they continue to do it again and again to refine [the results]," Beaudouin-Lafon said.
The 2024 report "Global Data Centers: Sizing & Solving for CO2" from Morgan Stanley estimated that data centers will generate around 2.5 billion tons of GHG emissions worldwide by 2030. This is three times higher than it would have been without GenAI.
However, GenAI's specific effect on data centers' CO2 footprint is unknown. According to the GAO report, carbon emissions differ greatly based on geography and the type of energy used, so GenAI's carbon emissions can't be accurately estimated without that specific information. Further, a data center's operational emissions are only part of its carbon footprint: data centers require many components whose manufacturing, transportation and construction also produce GHG emissions.
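As a rough illustration of why those details matter, the sketch below applies a common back-of-envelope approach: multiply estimated energy use by the carbon intensity of the local grid. Every figure in it (GPU count, power draw, runtime, power usage effectiveness and grid intensities) is a hypothetical assumption, not a reported value; the point is that the same workload yields very different emissions estimates depending on where it runs.

```python
# Back-of-envelope sketch (illustrative assumptions only) showing how an
# emissions estimate depends on location-specific grid data.

def training_emissions_kg(gpu_count: int, gpu_power_kw: float, hours: float,
                          pue: float, grid_kg_co2_per_kwh: float) -> float:
    """Estimate CO2-equivalent emissions (kg) for one training run.

    energy (kWh)   = GPUs x power per GPU (kW) x hours x PUE
    emissions (kg) = energy x grid carbon intensity (kg CO2e per kWh)
    """
    energy_kwh = gpu_count * gpu_power_kw * hours * pue
    return energy_kwh * grid_kg_co2_per_kwh

# The same hypothetical month-long run on 1,000 GPUs, on two different grids:
run = dict(gpu_count=1_000, gpu_power_kw=0.7, hours=24 * 30, pue=1.2)
print(training_emissions_kg(**run, grid_kg_co2_per_kwh=0.05))  # low-carbon grid
print(training_emissions_kg(**run, grid_kg_co2_per_kwh=0.70))  # fossil-heavy grid
```

Under these assumptions, the identical workload produces roughly 14 times the emissions on the fossil-heavy grid, which is why estimates are unreliable without location and energy-mix data.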
GenAI data centers' water needs
Data centers require significant amounts of water to cool their server racks. Globally, data centers consume an estimated 4.3 billion cubic meters (4.3 trillion liters) of water annually, though the amount per organization varies with size. GenAI's specific effect on water consumption is currently unknown.
Still, GenAI is a thirsty technology. According to the GAO report, training a single generative AI model could directly evaporate 700,000 liters of fresh water for cooling in a state-of-the-art data center, roughly the amount needed to fill a quarter of an Olympic-sized swimming pool.
Data centers use up a significant amount of water annually. GenAI will only increase the amount going forward.
GenAI's hardware requirements and e-waste
Although most environmental concerns about data centers focus on energy use, water consumption and emissions, hardware manufacturing and disposal have also become part of the sustainable AI conversation.
The environmental cost of producing data centers' physical components goes beyond the GHG emissions from manufacturing. The process of obtaining the necessary materials, including rare earth minerals, also takes a toll, Mishra said.
Moreover, the growing demand for computing power and innovative hardware incentivizes hyperscalers and other data center operators to replace existing equipment with newer hardware that can handle the increasing volume of work and AI processing.
"As better chips, better architectural designs come on the market, data centers turn over equipment faster and then that ends up as e-waste," Beaudouin-Lafon said. E-waste from GenAI remains underexplored and will likely continue to grow over time.
Do companies still overlook generative AI environmental concerns?
Despite GenAI's effect on the physical world, that impact hasn't been top of mind for many -- if not most -- executives. Rather, most enterprise leaders have focused on how to use AI for gain and how to address AI's anticipated social impacts, such as fears of job loss, data privacy concerns, biases and unintended consequences, said Rick Pastore, research principal at SustainableIT.org.
However, the C-suite and other leaders have paid more attention to sustainability issues in the past year. They know that the development, deployment and use of AI tools will affect a company's net-zero or emission-reduction targets because the technology is such a drain on power, Pastore said, although many still fail to measure the environmental impact of their GenAI usage.
Still, companies have begun to realize that hyperscalers like AWS, Google and Microsoft Azure, which own a significant portion of the cloud computing infrastructure and data centers where AI computing happens, are not solely responsible for addressing sustainability issues, Pastore said.
"The hyperscalers do have the responsibility to address the environmental impacts … But while IT doesn't have much of a lever to pull, IT does have some accountability and influence on this issue," Pastore said.
Looking forward: What's the future of generative AI's environmental impact?
Numerous organizations seek to address the environmental effects of digital technologies and improve their sustainability practices.
How can we reduce GenAI's environmental harm?
Hyperscalers and GenAI providers have been called on to be more transparent and to report more fully on GHG emissions and GenAI's other environmental effects, Walsh said.
Organizations should more aggressively scrutinize whether GenAI is necessary for a task or whether another, less impactful computer program would suffice. "We have to figure out where and when to use GenAI and where the tradeoffs are," Walsh said.
Other experts advocate for the use of small language models and responsible AI frameworks to help minimize AI's effect on the environment. Hyperscalers are also taking steps, such as seeking out more renewable energy sources, Pastore said, if for no other reason than to lower their energy costs.
Using GenAI to support sustainability efforts
AI can also help itself become more sustainable. People have started to apply AI to solve environmental challenges, including designing smarter energy grids or creating reforestation guidance.
Moreover, AI already helps people work more efficiently, which can reduce the time required to complete tasks and the energy and resources required to do so, Pastore said.
"AI will be a tool for environmental improvement, too, and it will be used more and more for that once the mechanics of it become more energy efficient. It could even help solve problems that it didn't cause," Pastore said.
Editor's note: This article was originally published in 2023. It was updated to reflect changes in GenAI usage and sustainability best practices.
Mary K. Pratt is an award-winning freelance journalist with a focus on covering enterprise IT and cybersecurity management.