Data center cooling systems and technologies and how they work
Extreme heat and cold can keep equipment from operating at peak efficiency. Explore cost-effective cooling technologies and smart options for your facility.
One of the most vital tasks for any data center is environmental monitoring and management. High temperatures and humidity levels can damage IT equipment, leading to failures. Such conditions can also create discomfort for personnel working inside the data center.
Fortunately, many systems and technologies can help monitor and manage data center cooling to maintain optimal temperature and humidity levels.
What is data center cooling?
Data centers consume a lot of power, which generates heat. The more equipment in a facility, the more heat it generates. Data center cooling involves the tools, systems, techniques and processes used to maintain ideal temperatures and humidity levels inside a data center.
Proper data center cooling ensures the entire facility has sufficient ventilation, humidity control and cooling to keep all equipment within the desired temperature ranges.
Why is data center cooling important?
High temperatures and humidity levels are undesirable conditions for IT and electrical equipment. Most IT devices and equipment generate heat and must dissipate it quickly to avoid performance degradation.
Facilities and equipment setups should be designed to minimize excess heat and humidity because these conditions can damage devices and equipment, causing them to malfunction or stop working. Worse, damaged equipment increases the facility's fire risk and other safety issues for on-site staff. These risks raise operational costs, as equipment must be repaired or replaced more often.
As most data centers run ASHRAE Class A1 and A2 equipment, facility managers must ensure their cooling systems are up to the task. The need to buy additional or up-to-date equipment to meet cooling requirements explains why the global data center cooling market is projected to grow by nearly 14% annually through 2033, according to Astute Analytica.
The U.S. cooling market alone is expected to reach $8.24 billion in spending by 2029, according to Research and Markets.
How does data center cooling work?
Data center cooling transfers heat away from equipment and the surrounding air, replacing hot air with cooler air. This is typically done in one of several ways:
- Airflow strategies that maximize the removal of hot air and the circulation of colder air, such as hot and cold aisle design, raised-floor cool air delivery, adiabatic cooling that uses air pressure differentials to regulate temperatures, and free cooling that uses outside air.
- Equipment cooling options that deliver cooling directly to hot components, such as direct-to-chip liquid cooling, immersion cooling, rear-door heat exchangers and microchannel heat exchangers.
- Running the facility at the highest recommended temperature and replacing equipment once it fails. This method can be cheaper when the energy savings outweigh the cost of more frequent equipment replacement.
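To put rough numbers on the heat-removal task, nearly all the electricity IT equipment draws ends up as heat, so cooling load can be estimated from power draw. A minimal Python sketch; the 1.2x overhead factor for non-IT loads is an illustrative assumption, not a standard:

```python
def cooling_load(it_watts: float, overhead_factor: float = 1.2) -> dict:
    """Estimate cooling requirements from IT power draw.

    overhead_factor is an assumed allowance for lighting, people and
    power-distribution losses; real values vary widely by facility.
    """
    total_watts = it_watts * overhead_factor
    btu_per_hr = total_watts * 3.412   # 1 W of heat = 3.412 BTU/hr
    tons = btu_per_hr / 12_000         # 1 ton of cooling = 12,000 BTU/hr
    return {"watts": total_watts, "btu_per_hr": btu_per_hr, "tons": tons}
```

For example, a 10 kW row with the assumed overhead works out to roughly 3.4 tons of cooling capacity.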
Current data center cooling systems and technologies
Air and liquid cooling are two of the most popular data center cooling methods, each with several approaches.
Air cooling
Air cooling has been the standard for data centers since nearly the beginning. It is a well-understood technology and strategy, and when combined with other options such as raised floors and hot- and cold-aisle designs, it can be adequate for smaller facilities or those handling typical workloads.
In a raised-floor setup, when the computer room AC (CRAC) unit or computer room air handler (CRAH) sends cold air, the pressure below the raised floor increases, forcing the cold air into the equipment inlets. The cold air displaces the hot air, which is then returned to the CRAC or CRAH, where it's cooled and recirculated.
In-row cooling units offer a more focused approach: placed closer to the heat sources, they improve cooling efficiency and shorten response times to alerts or monitoring system changes.
A CRAH is generally more efficient than a CRAC because it cools air with chilled-water coils instead of refrigerant. A CRAC functions like a residential AC unit, using refrigerant to cool the air. CRAC units are better suited to small data center closets because they can't keep up with the demands of enterprise-level data centers.
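The airflow a CRAC or CRAH must push through the raised floor is driven by the heat load and the supply/return temperature difference. A hedged sketch using the common rule of thumb CFM = BTU/hr / (1.08 x delta-T in Fahrenheit); the 1.08 constant assumes sea-level air density, and the 20-degree default delta is illustrative:

```python
def required_airflow_cfm(heat_watts: float, delta_t_f: float = 20.0) -> float:
    """Approximate airflow (in CFM) a cooling unit must deliver to remove
    heat_watts of heat at a given supply/return temperature difference.

    Rule of thumb: CFM = BTU/hr / (1.08 * delta_T_F), valid near sea level.
    """
    btu_per_hr = heat_watts * 3.412
    return btu_per_hr / (1.08 * delta_t_f)
```

A 5 kW rack at a 20-degree delta needs on the order of 790 CFM, and halving the delta doubles the required airflow, which is why keeping hot and cold air from mixing matters so much.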
Hot and cold aisle layouts
With this air-based cooling strategy, server cabinets and racks are arranged in rows, with each row facing the opposite direction from the one in front. The hot and cold air aisles increase the efficiency of the cooling systems by enabling more targeted placement of intake and exhaust vents. Hot air is vented from the hot aisle, and cool air is pumped through the cold aisle. This prevents hot and cold air from mixing, allowing the cooling system to work more efficiently.
Add doors, walls or partitions to the layout to further direct airflow for hot and cold aisles. Cabinets should be as full as possible to avoid the empty spaces, gaps and cable openings that can leak hot or cold air into the opposite aisle, causing the cooling system to work overtime.
Liquid cooling
Liquid cooling options are evolving as server workloads and density increase, especially with AI workloads. Liquid cooling is more efficient than air cooling because it transfers heat more effectively from the hottest components in the equipment. It is also cost-effective because it can be installed directly on the devices that need it the most, and it supports greater equipment densities and higher-than-average heat loads, making it well suited to high-density and edge computing data centers.
There are two main types of liquid cooling:
- Direct-to-chip liquid cooling. This method uses flexible tubes to deliver nonflammable dielectric fluid directly to the components that generate the most heat, such as the CPU or GPU. The fluid absorbs the heat, turning to vapor, and carries it out of the equipment through a return tube.
- Liquid immersion cooling. This method submerges the entire electrical device in dielectric fluid in a closed system. The fluid absorbs the heat emitted by the device and turns to vapor; the vapor is then condensed and returned, cooling the device.
An additional method is the rear-door heat exchanger (RDHx), typically combined with liquid cooling. A specialized door at the rear of the server rack chills the hot air expelled by the servers, while coolant transports the absorbed heat to a secondary cooling system. RDHx can be passive, where airflow is driven by the servers' internal fans, or active, where fans added to the racks help pull exhaust air through the heat exchanger.
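For single-phase liquid loops (no vaporization), the required coolant flow follows from the heat balance Q = flow x specific heat x temperature rise. A sketch with water-like defaults; dielectric coolants have different specific heat and density, so these constants are assumptions:

```python
def coolant_flow_lpm(heat_watts: float, delta_t_c: float = 10.0,
                     cp_j_per_kg_k: float = 4186.0,
                     density_kg_per_l: float = 1.0) -> float:
    """Coolant flow in liters/minute needed to absorb heat_watts of heat
    with a delta_t_c temperature rise across the cold plate.

    Defaults approximate water; real coolants and loop designs differ.
    """
    kg_per_s = heat_watts / (cp_j_per_kg_k * delta_t_c)  # Q = m * cp * dT
    liters_per_s = kg_per_s / density_kg_per_l
    return liters_per_s * 60.0
```

A 1 kW chip at a 10-degree Celsius rise needs only about 1.4 L/min of water, which illustrates how much more effectively liquid carries heat than air.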
Secondary cooling system equipment
Beyond the main cooling systems and options, there are other systems and equipment needed to ensure a reliable and efficient cooling system, including:
- Sensors. Temperature, humidity and airflow.
- Monitoring applications. Alerting software or modules that provide real-time feedback to data center operators before issues happen.
- Ducting systems and other physical equipment. Properly maintained ducting, heat exhaust/ingestion vents, hoses, raised floors and server racks are all necessary to preserve the cooling system's integrity, efficiency and uptime.
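At its core, a monitoring module compares sensor readings against configured thresholds and raises alerts. A minimal sketch; the ranges below are illustrative assumptions loosely based on ASHRAE's recommended inlet envelope (roughly 18 to 27 degrees Celsius) and should be tuned to your equipment class:

```python
# Illustrative thresholds; tune to your facility and equipment class.
TEMP_RANGE_C = (18.0, 27.0)        # loosely based on ASHRAE's recommended envelope
HUMIDITY_RANGE_PCT = (20.0, 80.0)  # assumed relative-humidity band

def check_reading(sensor_id: str, temp_c: float, rh_pct: float) -> list:
    """Return a list of alert messages for out-of-range readings."""
    alerts = []
    lo, hi = TEMP_RANGE_C
    if not lo <= temp_c <= hi:
        alerts.append(f"{sensor_id}: temperature {temp_c} C outside {lo}-{hi} C")
    lo, hi = HUMIDITY_RANGE_PCT
    if not lo <= rh_pct <= hi:
        alerts.append(f"{sensor_id}: humidity {rh_pct}% outside {lo}-{hi}%")
    return alerts
```

A real system would feed these alerts into paging or a DCIM dashboard rather than returning strings, but the threshold comparison is the same.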
Importance of energy efficiency in data center cooling
Cooling systems should be part of a data center's overall energy-efficiency strategy. As hyperscale and AI-driven workloads increase, facilities face ever-increasing energy bills, and data centers could account for nearly 21% of global energy demand by 2030, according to the MIT Sloan School of Management.
Ensuring the facility's infrastructure, such as HVAC and power systems, is in good repair is a good first step. Next, operators can review the IT hardware they use to ensure it remains optimally functioning. Replacement and sunsetting processes can help by introducing more modern, efficient technologies as needed.
Exploring new cooling technologies is another way to manage energy efficiency. New and evolving technologies, such as free cooling and liquid cooling systems, can greatly reduce cooling loads and improve energy efficiency across the facility.
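A standard yardstick for this effort is power usage effectiveness (PUE): total facility energy divided by the energy delivered to IT equipment, with 1.0 as the theoretical ideal and cooling typically the largest contributor above it. A one-function sketch:

```python
def pue(total_facility_kwh: float, it_kwh: float) -> float:
    """Power usage effectiveness over some measurement period.

    A PUE of 1.5 means that for every kWh of IT load, the facility
    spends another 0.5 kWh on cooling, power distribution and overhead.
    """
    if it_kwh <= 0:
        raise ValueError("IT energy must be positive")
    return total_facility_kwh / it_kwh
```

Tracking PUE over time makes the payoff of a cooling upgrade directly measurable.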
Future data center cooling systems and technologies
Although liquid cooling is still relatively new, other data center cooling technologies are on the horizon, such as geothermal cooling methods, smart technologies that use AI and machine learning to better monitor and manage cooling, and evaporative cooling.
Striving for carbon-neutral data center cooling
Here are some ways data centers can use nature to cool their facilities:
- Geothermal cooling uses the near-constant temperature of the Earth's crust to provide cooling. It's a centuries-old idea, once used to keep food cold, adapted to the modern era. In data centers, geothermal cooling circulates water or another heat-transfer fluid through a closed-loop piping system that runs through underground vertical wells. Iron Mountain's western Pennsylvania data center, Verne Global in Iceland and Green Mountain in Norway use geothermal cooling.
- Evaporative cooling, or swamp cooling, takes advantage of the drop in temperature that occurs when water is exposed to moving air and begins to vaporize and change to a gas. A fan draws warm data center air through a water- or coolant-moistened pad, and as the liquid evaporates, the air is chilled and returned to the data center. It can cost a fraction of an air-cooled HVAC system and works best in low-humidity climates.
- Solar cooling converts heat from the sun into cooling that can be used in data center air cooling systems. The system collects solar power and uses a thermally driven cooling process to lower the building's air temperature. This is useful in areas with a lot of sunlight or for data centers looking to supplement their current cooling with a more environmentally friendly method.
- Kyoto Cooling is an enhancement of the free-cooling method that uses a thermal wheel to control airflow between hot and cold zones in the data center. Internal hot air is vented to the outside as the wheel rotates. The outside air then cools the wheel and the air that is drawn back into the facility. It uses between 75% and 92% less power than conventional CRAH systems, reduces carbon dioxide emissions and eliminates the need for water in the cooling system. The technology is used by United Airlines' data center outside Chicago and by HP's data center outside Toronto.
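Of the approaches above, direct evaporative cooling is simple enough to estimate with the standard relation T_out = T_db - e * (T_db - T_wb), where T_db and T_wb are the dry- and wet-bulb temperatures and e is the pad effectiveness. A sketch; the 0.85 default effectiveness is an assumption, as real pads vary:

```python
def evap_outlet_temp_c(dry_bulb_c: float, wet_bulb_c: float,
                       effectiveness: float = 0.85) -> float:
    """Estimate supply-air temperature from a direct evaporative cooler.

    The effectiveness (0-1) depends on the pad; 0.85 is an assumed
    typical value. The wet-bulb temperature is the theoretical floor,
    which is why this method works best in low-humidity climates.
    """
    return dry_bulb_c - effectiveness * (dry_bulb_c - wet_bulb_c)
```

On a 35 C day with a 20 C wet bulb, the cooler delivers roughly 22 C air; in humid climates the wet bulb rises toward the dry bulb and the benefit shrinks.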
Making data center cooling smarter
Because many newer data center cooling technologies require significant investment from facility owners, smart technology has become popular. Data center smart assistants, AI and machine learning technologies can monitor facilities more efficiently and make real-time adjustments to ensure optimal temperatures and humidity levels. Google, for example, uses smart temperature controls to reduce heat output and cooling usage.
Data center cooling robots can move within the facility, monitoring temperatures and humidity levels in specific server cabinets. One challenge with manually monitoring cabinet temperatures is that conditions change as soon as the cabinet is opened. Companies such as OneNeck IT Solutions have developed a robot sensor probe that fits into standard cabinets. The robot moves up and down a belt-driven rail inside the cabinet to collect temperature data for each rack. It then transmits the data using Bluetooth to connected devices so data center pros can create a full heat map of the cabinet.
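The probe data such a robot collects boils down to (cabinet, rack position, temperature) readings. A hypothetical sketch of aggregating those readings into a per-cabinet heat map; the data shape and names are illustrative assumptions, not OneNeck's actual format:

```python
from collections import defaultdict

def build_heat_map(readings):
    """Aggregate (cabinet_id, rack_unit, temp_c) tuples into a nested
    dict of average temperature per rack unit within each cabinet."""
    samples = defaultdict(lambda: defaultdict(list))
    for cabinet, unit, temp in readings:
        samples[cabinet][unit].append(temp)
    return {cabinet: {unit: sum(temps) / len(temps)
                      for unit, temps in units.items()}
            for cabinet, units in samples.items()}
```

Averaging repeated passes smooths out transient spikes; plotting the result per rack unit yields the kind of full-cabinet heat map described above.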
Improving heat exchange technology
Most cooling technology relies on some form of heat exchange, and as data centers handle increasing compute workloads, this technology is improving too. Server microchannel heat exchangers are evolving to use larger channels and different fluids, enabling more efficient heat transfer. They also use less refrigerant than traditional exchanger options, increasing their overall performance.
Data center demand will only increase, so facility owners and their customers must look to more efficient, cost-effective cooling solutions -- whether that's less environmentally harmful options, such as geothermal and free cooling, or investing in and combining newer technologies, such as liquid immersion cooling for high-powered servers.
Editor's note: This article was updated in March 2026 to reflect new statistics and data center cooling practices and to enhance the reader's experience.
Julia Borgini is a freelance technical copywriter, content marketer, content strategist and geek. She writes about B2B tech, SaaS, DevOps, the cloud and other tech topics.