Today's organizations are eager to realize the benefits of AI, which increasingly include reducing the company's carbon footprint. But make no mistake: AI has its own carbon footprint, which varies depending on the type of AI and the techniques used to train it.
For example, large-scale natural language processing models -- specifically transformer models -- have a very large carbon footprint, said Kjell Carlsson, principal analyst at Forrester.
Transformers are a type of neural network architecture, like recurrent neural networks (RNNs) and convolutional neural networks (CNNs). Unlike RNNs, which process a sequence one element at a time, transformers use an attention mechanism to weigh relationships across an entire sequence at once. This is particularly useful in language models, as transformers can capture word context better than earlier architectures.
"AI's footprint is coming up in conversations, but not usually with customers or end users. It's journalists and other analysts," Carlsson said. "To the degree we minimize the carbon footprint, that's valuable."
Still, if a company wants to understand its total amount of carbon dioxide emissions, it must understand AI is a contributing factor.
AI's carbon footprint
A University of Massachusetts Amherst research paper estimated that training a transformer model, such as BERT or GPT-2, with neural architecture search emits roughly as much carbon as five cars do over their average lifetimes, fuel included.
Google and OpenAI published a paper that compares the performance and power requirements of the transformer and evolved transformer architectures. The latter is faster and requires less processing power, as explained below. The paper's authors encourage data scientists to consider four factors when calculating AI's footprint:
- The algorithm. An evolved transformer uses 1.6x fewer floating-point operations (FLOPs) and requires 1.1x -- 1.3x less training time than a transformer. An evolved transformer is also slightly more accurate.
- The processor. Google's custom Tensor Processing Unit (TPU v2) -- an application-specific chip, not a GPU -- runs the transformer and evolved transformer 4.3x and 5.2x faster than Nvidia Tesla P100 GPUs, respectively. The TPU also uses about 1.2x less power in each case.
- The data center. Cloud data centers are about twice as energy-efficient as a typical enterprise data center.
- The energy mix. Greater use of clean energy reduces carbon emissions.
The paper focused on AI training rather than inference, even though training accounts for only about 10% of a machine learning model's power consumption, with inference accounting for the remaining 90%. The power cost of training, however, is easier to calculate.
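The way these four factors compound can be sketched as a back-of-the-envelope calculation. The numbers below -- run time, power draw, PUE (data-center overhead) and grid carbon intensity -- are invented for illustration and are not drawn from the paper:

```python
def training_co2_kg(hours, avg_power_kw, pue, grid_kg_per_kwh):
    """Estimate training emissions: energy drawn by the hardware,
    scaled by data-center overhead (PUE) and the grid's carbon
    intensity (kg CO2 per kWh)."""
    return hours * avg_power_kw * pue * grid_kg_per_kwh

# Baseline: 100 hours on a 1.5 kW accelerator in an enterprise
# data center (PUE 1.6) on a mixed grid (0.4 kg CO2/kWh).
baseline = training_co2_kg(100, 1.5, 1.6, 0.4)

# Improved: a faster algorithm (1.2x less training time), a cloud
# data center (PUE 1.1) and a cleaner energy mix (0.1 kg CO2/kWh).
improved = training_co2_kg(100 / 1.2, 1.5, 1.1, 0.1)

print(f"baseline: {baseline:.1f} kg, improved: {improved:.1f} kg")
```

Because the factors multiply, modest gains at each layer -- algorithm, processor, data center, energy mix -- compound into a much smaller total footprint.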
Measuring and reducing AI's carbon footprint
"There is a real need to think about how you're building these systems. Are you training a needlessly complex algorithm? How frequently are you retraining?" said Steven Mills, managing director and partner and chief AI ethics officer at Boston Consulting Group GAMMA.
"There's also the AI supply chain, procuring algorithms, procuring computational hardware and thinking about the carbon footprint. For example, I can pull from AWS regions that rely on more sustainable energy sources, which will inherently reduce my carbon footprint," he said.
But how can a data scientist measure AI's footprint?
BCG GAMMA and others announced CodeCarbon, an open source project that estimates the carbon footprint of computing -- specifically, the power used by privately hosted data centers and by cloud providers' underlying infrastructure. The project is intended to help data scientists make greener decisions about procuring compute. It also helps them optimize their code.
"If you're going to run AI, you're going to need 'the machines,' so you're going to leave a carbon footprint. As you get more data and the models [become] more complicated, the more energy you're going to consume to get the AI models you're looking for," said Dan Simion, vice president and North America artificial intelligence, data science and analytics practice lead at global professional services company Capgemini.
AI's green impact
There are many use cases in which organizations combine AI with IoT or industrial IoT to reduce their carbon footprints. According to a recent BCG study, companies can use AI to monitor their emissions, predict their future emissions and, armed with that knowledge, make adjustments to reduce emissions.
AI can also optimize logistics, reduce the materials required to build things and otherwise cut carbon emissions. For example, BCG's research estimates that by 2030, AI could reduce global greenhouse gas emissions by 5% to 10%, a reduction of 2.6 to 5.3 gigatons of CO2 equivalent. Further, AI applied to corporate sustainability could generate $1 trillion to $3 trillion in value.
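The monitor-predict-adjust loop the BCG study describes can be illustrated with something as simple as a least-squares trend fit over monthly emissions readings. The monthly figures below are synthetic, and a production system would use a far richer model than a straight line:

```python
def fit_trend(ys):
    """Ordinary least-squares slope and intercept for evenly
    spaced readings (x = 0, 1, 2, ...)."""
    n = len(ys)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

# Twelve months of monitored emissions in tonnes CO2e (synthetic).
monthly = [410, 415, 421, 418, 425, 430, 428, 436, 440, 438, 445, 451]
slope, intercept = fit_trend(monthly)

# Predict emissions six months past the data (month 18 is x = 17).
# If the projection overshoots a target, that is the signal to
# adjust operations now rather than later.
month_18 = slope * 17 + intercept
print(f"trend: {slope:+.2f} t/month, projected month 18: {month_18:.0f} t")
```

The point is the loop, not the model: monitored data feeds a forecast, and the forecast triggers the adjustment before the emissions are actually incurred.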
AI for farming
One of BCG's winemaker clients plans to operate perpetually, but the company doesn't know what land it will own in the future. Land ownership is important to winemakers since individual fields produce greater or lesser yields over time based on several factors, including soil composition, rainfall, flooding and movements of surface waters.
Using that type of data, BCG built a crop yield model that understands how crop yields change over time, BCG managing director and partner Mike Lyons said.
Understanding the future state of crop yields informs real estate investments and divestments.
"The ultimate outcome was the client could buy, sell and protect land in a very forward-looking and strategic manner so they can continue to be a perpetual winemaker," Lyons said.
Capgemini's Simion said his organization uses AI to reduce the carbon footprints associated with supply chains, manufacturing processes and building machine learning models. The company also partnered with universities to understand how the migration patterns of whales are shifting because of climate change. In addition, Capgemini uses computer vision to identify diseased trees in forests that are difficult to reach on foot.
Other AI companies do similar work. DataRobot, for example, partnered with Entel Ocean, the digital unit of Chilean telecommunications company Entel, to automatically identify forest fires in Chile. Entel Ocean installed IoT devices on trees to collect surrounding environmental data. The company used DataRobot's machine learning and predictive models to process its collected data and predict forest fires in the area.
Looking at the clock
AI has both a negative and positive effect on the environment, and it's important to measure both. Many organizations feel the pressure of emissions targets such as the EU's commitment under the Paris Agreement to cut emissions by at least 55% by 2030, compared with 1990 levels.
Tools and frameworks are emerging to help understand AI's impact. Meanwhile, technology continues to improve at every layer of the stack, from algorithms to data centers and beyond, which should also help reduce carbon emissions.