https://www.techtarget.com/searchdatacenter/tip/How-to-manage-data-center-water-usage-sustainably
According to the United Nations, water and climate change are strongly linked. Climate change causes higher water temperatures that worsen water pollution and quality, while melting glaciers impact irrigation and water supplies.
Energy production is highly reliant on water, which is why the spotlight is on data centers and their environmental impact. Data centers consume vast amounts of electricity and water, both directly and indirectly, and generate significant heat.
The need for more data centers and rapid advancements in AI workloads are increasing water consumption throughout the industry. The accuracy of reported water consumption figures is a cause for concern due to a lack of transparency across the industry, but some organizations are finally opening up.
Data centers consume the most water in three significant areas: cooling, humidification and electricity generation.
Data centers have highly sensitive, power-intensive IT equipment, much of which is constantly up and running. If equipment gets too hot, it can overheat, malfunction and break down. To maintain uptime and avoid equipment failure, many data centers use water-based cooling to help equipment remain at safe temperatures.
According to an article in npj Clean Water, there are several mechanisms for data center cooling. These mechanisms include chillers that reduce air temperature by cooling water and use it as a heat transfer mechanism; cooling towers that reroute hot air across a wet media to evaporate the water; and adiabatic economizers that spray water into the airflow or onto a heat exchange surface to cool air that enters the data center. While some of these systems recirculate water, water loss is inevitable due to evaporation and water quality maintenance.
Drier air creates static electricity buildup that can lead to electrostatic discharge, which is dangerous for the sensitive equipment in a data center. To reduce this buildup, many data centers use humidification systems to maintain a target humidity level. These systems vaporize water and add it to the air, or draw moisture out of the air when humidity is too high.
One of the challenges of water management in humidification systems is scale buildup. As water circulates through the system and evaporates, dissolved minerals are left behind. If recirculated water becomes oversaturated with these minerals, humidification efficiency worsens, so the water must be regularly replaced. Managing water quality and efficiency can help reduce water loss in this process, but the loss can only be minimized, not eliminated.
Data centers use water indirectly through electricity. Power must come from somewhere, whether by burning coal to heat water and push steam through turbines or through hydropower plants that use the movement of water to power turbines. Water loss is particularly concerning for fossil fuel-based power plants and hydropower systems, though solar and wind energy consume water during manufacturing and construction.
According to the U.S. Energy Information Administration, the water intensity of electricity generation varies by technology. In 2021, for example, coal-powered plants had an average water withdrawal intensity of 19,185 gallons per megawatt-hour (MWh), while natural gas averaged 2,803 gallons per MWh.
Data center admins and owners must understand how water consumption differs among various electricity generation methods, while also considering the data center's location. By exploring various energy sources and their corresponding water usage rates, admins and owners can identify the most sustainable and efficient options. This knowledge will enable them to better track their actual water usage and gain insights into strategies for reducing their indirect water consumption, ultimately minimizing their environmental impact.
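As a rough sketch of how this tracking might work, the snippet below estimates a facility's indirect water withdrawal from its annual energy use and generation mix. The coal and natural gas intensities are the 2021 EIA figures cited above; the wind and solar values and the example facility are illustrative assumptions, not measured data.

```python
# Estimate indirect (electricity-related) water withdrawal from an energy mix.
# Coal and natural gas intensities are 2021 EIA averages; wind and solar are
# assumed to have negligible operational withdrawal (manufacturing excluded).
WATER_INTENSITY_GAL_PER_MWH = {
    "coal": 19_185,        # EIA, 2021 average withdrawal intensity
    "natural_gas": 2_803,  # EIA, 2021 average withdrawal intensity
    "wind": 0,             # assumption: negligible operational withdrawal
    "solar": 0,            # assumption: negligible operational withdrawal
}

def indirect_water_gal(annual_mwh: float, mix: dict[str, float]) -> float:
    """Weighted water withdrawal, in gallons, for a given energy use and mix.

    mix maps generation source -> fraction of total energy (must sum to 1).
    """
    if abs(sum(mix.values()) - 1.0) > 1e-9:
        raise ValueError("energy mix fractions must sum to 1")
    return sum(
        annual_mwh * share * WATER_INTENSITY_GAL_PER_MWH[source]
        for source, share in mix.items()
    )

# Hypothetical 50,000 MWh/year facility drawing 60% gas, 30% wind, 10% coal:
usage = indirect_water_gal(50_000, {"natural_gas": 0.6, "wind": 0.3, "coal": 0.1})
# roughly 180 million gallons of annual indirect withdrawal
```

Shifting the hypothetical coal share into wind in this model immediately shows why siting and power purchasing decisions dominate a facility's indirect water footprint.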
According to The World Counts, an open source community-driven project that aggregates consumption data from organizations worldwide, more than 4.3 trillion cubic meters, approximately 1.1 quadrillion gallons, of water are consumed globally every year. As of July 2025, more than 2.3 trillion tons of freshwater had been used that year.
Tech giants like Google, Amazon and Microsoft release their water consumption figures to the public; however, there's some debate about the accuracy of these statistics.
In 2022, Google released its 2021 water consumption statistics to demonstrate its commitment to climate-conscious data center cooling. In 2021, Google's data centers consumed approximately 4.3 billion gallons of water, an average of about 450,000 gallons per day per data center.
Google has since updated these figures in its "Environmental Report 2025," which claims that the organization's data centers consumed about 8.1 billion gallons of water in 2024. In just three years, Google's water consumption has nearly doubled. Rapid AI advancement, especially Google's AI Overview feature, has also occurred during the last three years. AI integration and evolution lead to more energy use, which leads to more water consumption.
Google also claimed that its water stewardship projects replenished about 4.5 billion gallons of water, which is about 64% of Google's freshwater consumption in 2024. However, some skepticism remains around these figures.
For example, an investigative report from Kairos Fellowship suggests this metric is misleading because Google is using its annual water savings from water stewardship projects to offset its actual water use. These projects are not completed at the source where water is being extracted and consumed, meaning Google is not repairing the harm that's already been done. This makes it difficult to adequately measure the company's actual water consumption.
While Amazon has not disclosed its total water consumption, the company claims that its AWS data centers use 0.15 liters (L) of water per kilowatt-hour (kWh), against an industry average of 1.8 L per kWh. This metric is known as water usage effectiveness (WUE), a measure developed by The Green Grid that relates a data center's water usage to its energy consumption. WUE is the ratio of water consumed, in liters, to IT energy consumed, in kilowatt-hours; the lower the ratio, the more water-efficient the facility.
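The WUE ratio described above is simple enough to sketch directly. The facility figures in the example below are illustrative assumptions, not real measurements.

```python
# Minimal sketch of the water usage effectiveness (WUE) metric:
# site water consumed (liters) divided by IT energy consumed (kWh).
def wue(water_liters: float, it_energy_kwh: float) -> float:
    """Water usage effectiveness: lower values mean better water efficiency."""
    if it_energy_kwh <= 0:
        raise ValueError("IT energy consumption must be positive")
    return water_liters / it_energy_kwh

# A hypothetical facility using 9 million liters of water against
# 50 million kWh of IT energy in a year:
ratio = wue(9_000_000, 50_000_000)  # 0.18 L/kWh
```

A result of 0.18 L/kWh would sit between AWS's reported 0.15 and the 1.8 industry average cited above, illustrating how the metric lets facilities of very different sizes be compared on one scale.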
Amazon announced a commitment to be water-positive by 2030. The company interprets this as returning more water to communities and the environment than is used in its data center operations. In 2024, Amazon claims AWS was 53% of the way toward its goal of being water-positive.
An investigation by SourceMaterial and The Guardian found that Amazon's three proposed data centers in Spain are licensed to use about 755,720 cubic meters of water a year. That's the equivalent of enough water to irrigate more than 200 hectares, or about 500 acres, of corn, one of the region's main crops. This provides insight into how much water the company's data centers may consume, and how it could eventually impact food production.
Microsoft currently operates more than 300 data centers worldwide. According to company estimates, these data centers operate with an average WUE of 0.30 L/kWh. This is a 39% improvement over 2021 estimates, when the global average was 0.49 L/kWh.
Some facilities are significantly more efficient than others. For example, Microsoft's data centers in Arizona, a hot, arid region, have a WUE of 1.52 L/kWh, while its data centers in Singapore operate at 0.02 L/kWh.
Microsoft has also committed to being water-positive by 2030. To achieve that goal, Microsoft will begin deploying closed-loop cooling systems that won't lose water to evaporation. One facility that will use this technology is in Mt. Pleasant, Wisconsin. The system is expected to have a peak daily water use of about 350,000 gallons.
According to Wisconsin Public Radio, the local community is skeptical. A local water and agriculture program director questions whether the systems will operate as intended.
The tech giants are not the only companies contributing to excessive water use. Most data centers consume water on a massive scale, and many companies do not publish or collect the requisite data to provide an accurate picture of how much water their data centers consume.
Data center admins must recognize the importance of transparency and share their water consumption figures. This will help create a baseline for the industry and contribute to new sustainability goals, ensuring the industry is doing its part to reduce water consumption, limit climate change and protect water availability.
Here are a few ways data centers can limit water consumption:
- Deploy closed-loop or other low-evaporation cooling systems to minimize water lost to evaporation.
- Manage water quality in cooling and humidification systems to reduce how often recirculated water must be replaced.
- Source electricity from generation methods with low water intensity, such as wind and solar.
- Track WUE and publish water consumption figures to help establish an industry baseline.
The biggest obstacle in curtailing data center water consumption in the years ahead is the rapid proliferation of AI. The water footprint of AI is already growing astronomically. For example, according to "Making AI Less 'Thirsty': Uncovering and Addressing the Secret Water Footprint of AI Models," training an LLM in Microsoft's data centers can directly evaporate 700,000 liters of water. Furthermore, by 2027, global AI demand is projected to account for 4.2 billion to 6.6 billion cubic meters of water withdrawal, which is more than half the total annual water withdrawal of the United Kingdom.
If left unchecked, data center water consumption requirements will pose significant problems for companies, stakeholders and government entities worldwide. Water shortages, scarcity and stress will worsen, and the impacts of climate change, such as water depletion and biodiversity loss, will be further exacerbated. Mitigating these environmental impacts by managing water usage, investing in more efficient water technologies and adopting more sustainable water practices, such as circular water management, is necessary.
Jacob Roundy is a freelance writer and editor with more than a decade of experience specializing in a variety of technology topics, such as data centers, business intelligence, AI/ML, climate change and sustainability. His writing focuses on demystifying tech, tracking trends in the industry, and providing practical guidance to IT leaders and administrators.
24 Jul 2025