Training an advanced AI model takes time, money and high-quality data. It also takes energy -- a lot of it.
Between storing data in large-scale data centers and using that data to train machine learning or deep learning models, AI's energy consumption is high. While an AI system may pay off monetarily, it poses an environmental problem.
AI energy consumption during training
Take one of the most popular language models, for example: Nvidia's MegatronLM, which was trained on 512 V100 GPUs over nine days.
A single V100 GPU can consume between 250 and 300 watts. Assuming 250 watts per GPU, 512 V100 GPUs consume 128,000 watts, or 128 kilowatts (kW). Running for nine days, MegatronLM's training consumed 27,648 kilowatt-hours (kWh).
The average U.S. household uses 10,649 kWh of electricity annually, according to the U.S. Energy Information Administration. Training the final version of MegatronLM therefore used nearly as much energy as three homes use in a year.
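The back-of-envelope math above can be reproduced in a few lines; the wattage, GPU count, training duration, and household figure are the ones cited in this article:

```python
# Rough check of the MegatronLM training-energy figures cited above.
GPU_WATTS = 250                  # low end of a V100's 250-300 W range
NUM_GPUS = 512
TRAINING_DAYS = 9
HOUSEHOLD_KWH_PER_YEAR = 10_649  # U.S. EIA average annual household usage

total_kw = GPU_WATTS * NUM_GPUS / 1_000          # 128 kW of draw
total_kwh = total_kw * 24 * TRAINING_DAYS        # 27,648 kWh over nine days
household_years = total_kwh / HOUSEHOLD_KWH_PER_YEAR

print(total_kw, total_kwh, round(household_years, 1))  # 128.0 27648.0 2.6
```

Note that 250 watts is the optimistic end of the range; at 300 watts per GPU, the same run would consume about 33,000 kWh.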
New training techniques reduce the amount of data needed to train machine learning and deep learning models, but many models still need a huge amount of data to complete an initial training phase, and additional data to keep up to date.
Data center energy usage
As AI becomes more complex, expect some models to use even more data. That's a problem, because data centers use an incredible amount of energy.
"Data centers are going to be one of the most impactful things on the environment," said Alan Pelz-Sharpe, founder of analyst firm Deep Analysis.
IBM's The Weather Company processes around 400 terabytes of data per day to enable its models to predict the weather days in advance around the globe. Facebook generates about 4 petabytes (4,000 terabytes) of data per day.
People generated 64.2 zettabytes of data in 2020. That's about 58,389,559,853 terabytes, market research company IDC estimated.
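IDC's terabyte equivalent appears to mix decimal zettabytes (10^21 bytes) with binary terabytes (2^40 bytes); under that assumption, a quick conversion reproduces the figure:

```python
# Convert IDC's 64.2 zettabytes (decimal) to binary terabytes.
# Assumption: 1 ZB = 10**21 bytes, 1 TB = 2**40 bytes.
ZB_2020 = 64.2
bytes_total = ZB_2020 * 10**21
terabytes = bytes_total / 2**40

print(f"{terabytes:,.0f}")  # roughly 58.4 billion terabytes
```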
Data centers store that data around the world.
Meanwhile, the largest data centers require more than 100 megawatts of power capacity, which is enough to power some 80,000 U.S. households, according to energy and climate think tank Energy Innovation.
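The household comparison checks out against the EIA figure cited earlier: averaged over a year, 10,649 kWh works out to roughly 1.2 kW of continuous demand per home, so 100 megawatts covers on the order of 80,000 homes. A rough sketch:

```python
# Sanity check: how many average U.S. households does 100 MW cover?
# Assumes the EIA figure of 10,649 kWh per household per year cited earlier.
HOUSEHOLD_KWH_PER_YEAR = 10_649
HOURS_PER_YEAR = 24 * 365

avg_household_kw = HOUSEHOLD_KWH_PER_YEAR / HOURS_PER_YEAR  # ~1.22 kW continuous
data_center_kw = 100_000                                    # 100 MW
households = data_center_kw / avg_household_kw

print(round(households))  # on the order of 80,000 homes
```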
With about 600 hyperscale data centers -- data centers that exceed 5,000 servers and 10,000 square feet -- in the world, it's unclear how much energy is required to store all of our data, but the number is likely staggering.
From an environmental standpoint, data center and AI energy consumption is also a nightmare.
AI, data, and the environment
Generating that energy, largely by burning fossil fuels, emits CO2, the primary greenhouse gas produced by human activity. In the atmosphere, greenhouse gases such as CO2 trap heat near the Earth's surface, raising global temperatures and throwing delicate ecosystems off balance.
"We have an energy consumption crisis," said Gerry McGovern, author of the book World Wide Waste.
AI is energy-intensive, and the higher the demand for AI, the more power we use, he said.
"It's not simply the electrical energy to train an AI," he said. "It's building the supercomputers. It's collecting and storing the data."
McGovern pointed to estimates that by 2035, humans will have produced more than 2,000 zettabytes of data.
"The storage energy alone for this will be astronomical," he said.
Right now, data's biggest users aren't doing much about the carbon footprint or AI energy consumption problem.
"I'm aware of some recognition [of AI's carbon footprint problem] but not a lot of action," McGovern said. "Data centers, which are the 'food source' for AI, have focused on electrical efficiency and have definitely made major improvements over the last 10 years."
While data centers have become more electrically efficient over the past decade, experts believe that electricity only accounts for around 10% of a data center's CO2 emissions, McGovern said. A data center's infrastructure, including the building and cooling systems, also produces a lot of CO2.
On top of that, data centers also use a lot of water as a form of evaporative cooling. This cooling method cuts down on electricity use but can use millions of gallons of water per day per hyperscale data center. In addition, the water used can get polluted in the process, McGovern noted.
"There is still this broad assumption that digital is inherently green, and that is far from the case," he said.
Businesses' environmental impact
While the average business can't change how major companies store their data, businesses concerned about their environmental footprint can focus on creating high-quality, rather than high-quantity, data. They can delete data they no longer use, for example; businesses tend not to touch 90% of their data again 90 days after it is stored, according to McGovern.
Businesses can also adjust how they're using AI or the type of AI they use.
Organizations can identify the specific use case they want to address and pick an AI or automation technology dedicated to that use case. Different types of AI carry different energy costs, however.
Companies can get swept up in the idea that they need an advanced deep learning system that can do it all, Pelz-Sharpe said. However, if they want to tackle a focused use case, such as automating a billing process, they don't need an advanced system. These systems are expensive and use a lot of data, meaning they have a high carbon footprint.
A dedicated system is trained on a much smaller amount of data, yet will likely handle a specific use case just as well as a more general system.
"Because it's highly specialized, that AI has been trained on the most accurate possible data" while maintaining a small data set, Pelz-Sharpe said. A deep learning model, meanwhile, must churn through massive amounts of data to achieve anything.
"In all our decisions, we must factor in the earth experience," McGovern said.