Explore liquid cooling for data centers

Discover the different classifications of liquid cooling -- such as direct-to-chip, liquid immersion or rear-door heat exchangers -- before adopting it in your data center.

As the performance levels and rack power densities of modern computing equipment climb, more companies are transitioning from air to liquid cooling because liquid transfers heat far more efficiently. And despite lingering unease about mixing liquid and electronics, liquid cooling technology has matured to the point where such concerns are largely obsolete.

Water at standard conditions carries far more heat per unit volume than air, which means liquid cooling increases both cooling effectiveness and energy efficiency for the data centers that employ it. Plus, liquid is easier to manage than high volumes of moving air. ASHRAE Technical Committee 9.9 has even added another liquid cooling classification to standardize the breadth of liquid cooling applications.
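
To put that volumetric advantage into rough numbers, here is a back-of-the-envelope Python sketch comparing how much heat a cubic meter of each fluid carries per degree of temperature rise. The property values are approximate room-temperature figures, used only for illustration.

    # Compare volumetric heat capacity: the heat a cubic meter of each
    # fluid carries per degree of temperature rise. Property values are
    # approximate, at roughly 25 degrees C and sea-level pressure.
    WATER_DENSITY = 998.0   # kg/m^3
    WATER_CP = 4186.0       # J/(kg*K)
    AIR_DENSITY = 1.18      # kg/m^3
    AIR_CP = 1005.0         # J/(kg*K)

    water_vol_cp = WATER_DENSITY * WATER_CP   # ~4.18 MJ/(m^3*K)
    air_vol_cp = AIR_DENSITY * AIR_CP         # ~1.19 kJ/(m^3*K)
    print(f"Water carries about {water_vol_cp / air_vol_cp:,.0f}x more "
          f"heat per unit volume than air")   # prints about 3,522x

That roughly 3,500-to-1 ratio is why a modest water loop can replace very large volumes of moving air.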

Categories of liquid cooling

Liquid cooling refers to any practice where liquid enters a cabinet to carry away heat, even when combined with air movement. There are two fundamental classifications of liquid cooling: direct, where liquid actually cools the servers and their components, and close-coupled, where air cools the servers but the heat is rejected to a liquid stream.

These two classifications can be broken down further into several variations:

  • Direct liquid cooling uses a conductive interface and is often referred to as direct-to-chip cooling. The coolant doesn't actually touch the computing components; rather, it circulates through a cold plate to which the processor chips are mounted. This is a highly effective form of cooling, but it requires a plumbing system within the cabinets, in addition to the power and network cabling systems, so management can be challenging. High-performance research processors and supercomputers use direct-to-chip cooling most often.
  • Liquid immersion usually means servers are submerged in a tub of nonconductive fluid that carries away heat, but it also includes server-level immersion that seals each server in its own container of liquid. The original cooling medium was mineral oil, although data centers can also use other fluids, such as 3M Novec. Businesses can immerse standard servers with relatively minor modifications, such as removing fans and sealing spinning disk drives. Solid-state drives require no modification at all. Full immersion also provides significant thermal mass: the liquid continues to absorb heat for several minutes after a power failure without the need for backup pumps. Tanks equivalent to 42U rack capacity can cool up to 100 kilowatts (kW) in most climates with only an outdoor heat exchanger or condenser water. The minimal mechanical refrigeration requirements make liquid immersion a great choice for free cooling.
  • Rear door heat exchanger (RDHx) units are close-coupled indirect systems. They circulate liquid through coils embedded in cabinet doors to remove server heat before it exhausts into the room. Use them for total cooling applications or to bring high cabinet exhaust temperatures within the cooling limits of air conditioners. For total cooling, an RDHx unit keeps the entire room at the IT equipment inlet air temperature. This makes hot and cold aisle cabinet configurations and air containment designs unnecessary, since exhaust air cools to inlet temperature and can recirculate back to the servers. The most efficient RDHx units are passive, meaning that server fans move air through them; they are generally limited to between 20 kW and 32 kW of heat removal. RDHx doors that incorporate supplemental fans can cool higher heat loads, up to 60 kW (the sketch after this list translates these heat loads into approximate water flow requirements).
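
The kilowatt figures above map to coolant flow requirements through the basic heat balance Q = m-dot x c_p x delta-T. The Python sketch below estimates the water flow needed for the heat loads mentioned in this list; the 10-degree Celsius coolant temperature rise is an assumed design value for illustration, not a vendor specification.

    # Estimate the water flow needed to carry a given heat load, using
    # the heat balance Q = m_dot * c_p * delta_T. The 10 C temperature
    # rise across the load is an assumed design value, not a spec.
    WATER_DENSITY = 998.0   # kg/m^3
    WATER_CP = 4186.0       # J/(kg*K)
    DELTA_T = 10.0          # K, assumed coolant temperature rise

    def flow_liters_per_min(load_kw: float) -> float:
        """Water flow (L/min) needed to absorb load_kw at DELTA_T rise."""
        mass_flow = load_kw * 1000.0 / (WATER_CP * DELTA_T)   # kg/s
        return mass_flow / WATER_DENSITY * 1000.0 * 60.0      # L/min

    for load in (20, 32, 60, 100):   # kW figures cited in the list
        print(f"{load:>4} kW -> about {flow_liters_per_min(load):.0f} L/min")

Under these assumptions, even a 100 kW tank needs only about 144 L/min of water, a trivial flow compared with the roughly 18,000 cubic feet per minute of air it would take to move the same heat at the same temperature rise.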

Other considerations for liquid cooling

High-density computing hardware that uses direct liquid cooling requires constant fluid circulation, even through a power failure. High-performance devices enter self-protective thermal shutdown within seconds if cooling is interrupted. Such devices often require chilled water storage -- which can sometimes come from the residual volume in large header pipes -- and ancillary pumps on uninterruptible power supply (UPS) backup.
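
To get a feel for how much storage that implies, the Python sketch below converts a chilled water reserve and its usable temperature range into minutes of ride-through at a given load. All of the inputs are illustrative assumptions, not design values.

    # Rough ride-through estimate: how long a chilled water reserve can
    # absorb a heat load before it warms past its usable range.
    # Every input here is an illustrative assumption.
    WATER_DENSITY = 998.0   # kg/m^3
    WATER_CP = 4186.0       # J/(kg*K)

    def ride_through_minutes(storage_liters: float,
                             usable_dt_k: float,
                             load_kw: float) -> float:
        """Minutes of cooling from stored chilled water at the load."""
        mass = storage_liters / 1000.0 * WATER_DENSITY   # kg
        energy_j = mass * WATER_CP * usable_dt_k         # usable joules
        return energy_j / (load_kw * 1000.0) / 60.0

    # Example: 2,000 L of storage, 8 K of usable rise, 150 kW of load
    print(f"{ride_through_minutes(2000, 8.0, 150):.1f} minutes")   # ~7.4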

RDHx units might not always require backup pumps, since the entire room remains at server inlet temperature. Depending on the size of the room, the air temperature can remain within ASHRAE's allowable limits until generators start. Use rapid restart chillers to resume cooling quickly after power is restored.
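
A quick estimate of how fast room air warms without cooling helps judge whether a given space stays within those limits. The Python sketch below counts only the air's thermal mass and ignores walls and equipment, so it errs on the conservative side; the room volume, heat load and generator start time are assumed values.

    # Estimate room air temperature rise when cooling stops, counting
    # only the air's thermal mass (walls and equipment are ignored, so
    # the result is conservative). All inputs are assumed values.
    AIR_DENSITY = 1.18   # kg/m^3
    AIR_CP = 1005.0      # J/(kg*K)

    def temp_rise_k(room_m3: float, load_kw: float, seconds: float) -> float:
        """Air temperature rise (K) after `seconds` with no heat rejection."""
        heat_j = load_kw * 1000.0 * seconds
        air_capacity = room_m3 * AIR_DENSITY * AIR_CP   # J/K
        return heat_j / air_capacity

    # Example: 500 m^3 room, 100 kW load, 15-second generator start
    print(f"{temp_rise_k(500, 100, 15):.1f} K rise")   # ~2.5 K

Under those assumptions, the room climbs only a couple of degrees before the generators pick up the load.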

Most liquid-cooled devices are both high performance and mission-critical, so ensure power and cooling are redundant. Design double-ended piping with isolation valves so that individual segments can be shut off for maintenance without losing liquid flow through the rest of the cooling loop.

Electrical design is equally important. Many liquid-cooled devices shut down within seconds if cooling fails, so keep backup pumps and door fans on UPS power. Keep any mechanical loads on a separate, redundant UPS circuit.
