Data center liquid cooling is emerging as a possible alternative to air cooling, but the available options are expensive and largely unproven.
For decades, air cooling has been the cheapest, most effective method to cool IT equipment. But the emergence of CPU-intensive applications and small form-factor servers -- which are becoming more popular in modern data centers -- is creating problems for air cooling systems.
Traditional air-based cooling is not efficient for modern data centers. Air has mediocre thermal conductivity and heat capacity, so organizations must increase the size and strength of their fans and cooling systems to maintain temperatures. In some cases, these setups simply cannot cool a data center infrastructure.
Air systems run into problems when electricity consumption reaches 16 to 20 kilowatts per rack, according to Henrique Cecci, a research director at Gartner. Currently, only the world's largest data centers face these cooling challenges. Most companies use less than 10 kilowatts of power per rack. But the march to greater densities is ongoing and, eventually, many organizations will need better alternatives.
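A back-of-envelope calculation shows why air hits a wall around these densities. Using the standard heat-transport relation Q = ṁ · cp · ΔT with textbook room-temperature fluid properties (the 16 kW figure is from the article; the 10 K temperature rise is an illustrative assumption, not a quoted spec), the volume of air that must move through a rack dwarfs the trickle of water that would carry the same heat:

```python
# Illustrative sketch: coolant flow needed to remove 16 kW of rack heat,
# assuming a 10 K coolant temperature rise. Based on Q = m_dot * cp * dT.

RACK_HEAT_W = 16_000   # low end of the threshold cited in the article
DELTA_T_K = 10         # assumed coolant temperature rise (illustrative)

# Approximate fluid properties near room temperature
AIR_CP = 1005          # J/(kg*K)
AIR_DENSITY = 1.2      # kg/m^3
WATER_CP = 4186        # J/(kg*K)
WATER_DENSITY = 1000   # kg/m^3

def volume_flow(heat_w, cp, density, delta_t):
    """Volumetric flow in m^3/s needed to carry away heat_w watts."""
    mass_flow = heat_w / (cp * delta_t)   # kg/s, from Q = m_dot * cp * dT
    return mass_flow / density

air = volume_flow(RACK_HEAT_W, AIR_CP, AIR_DENSITY, DELTA_T_K)
water = volume_flow(RACK_HEAT_W, WATER_CP, WATER_DENSITY, DELTA_T_K)

print(f"Air:   {air:.2f} m^3/s")            # roughly 1.3 m^3/s per rack
print(f"Water: {water * 1000:.2f} L/s")     # under half a liter per second
print(f"Air needs ~{air / water:.0f}x the volume flow of water")
```

Under these assumptions, a single 16 kW rack needs on the order of 1.3 cubic meters of air per second, versus less than half a liter of water per second, which is the physical reason fan-based systems run out of headroom as densities climb.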
Admins try something new
Because traditional air cooling methods are no longer meeting the needs of hyperscale data centers, some companies are turning to novel techniques. Many companies have raised the floors in their data centers a foot or two to provide more space to disperse heat. Other organizations have deployed hot/cold aisle designs or have reorganized their data center infrastructure.
"Some financial and technology companies are leaving the entire floor below their data centers vacant and using it as cooling conduits," said Ryan Orr, senior consultant at the Uptime Institute. These designs are costly, but the energy savings and the high price of alternatives make them viable.
Liquids are between 50 and 1,000 times better at conducting heat than air. Water has been a data center liquid cooling option for decades, but IT administrators often balked at deploying it; a leak could fry the servers.
However, higher server densities have now led some large businesses to take that risk. In some cases, companies trust that the liquid cooling piping technology is advanced enough to contain leaks. Other companies use water in a limited fashion.
Companies rely on water to cool the rear doors on their racks, which are often the hottest spots in the rack. Vendors such as IBM and Schneider Electric are building self-contained rear-door water cooling units where leaks won't ruin the servers.
"We are seeing use of liquid cooling grow at a high rate -- 25% -- but it is still a small niche compared to the use of air overall," said Gartner's Cecci.
New challenges emerge with data center liquid cooling
Some vendors are trying to create new coolants via nanotechnology. These emerging nanofluids have a greater heat capacity than conventional cooling fluids and do not conduct electricity.
"Right now, the nanotechnology work is largely experimental, and more research needs to be done to determine its potential impact," Gartner's Cecci said.
These new coolants are in the early developmental stages, so it is not clear how they will perform in production. Currently, there's no clear alternative to water, and data center experts are worried about the environmental effect of some new coolants.
The new setups and coolants are expensive; in some cases, they cost 10 to 100 times more than traditional cooling systems, and there isn't much research available on their use with commercial products.
"No standards exist for the new cooling equipment," Cecci said. Consequently, companies can't mix and match components from different suppliers.
Staffing problems arise
Aside from concerns about standards, admins are unfamiliar with the new technology.
"Training programs for new cooling techniques will need to be developed," the Uptime Institute's Orr said. Even after training, staff resistance can remain.
"For the last 30 or so years, IT has been told that liquids should not be used in the data center," Orr said. "Changing that mindset will be difficult."
These new systems also require a high level of cooperation between IT admins, who oversee the computing infrastructure, and operations technicians, who manage the building. In the past, the two groups have worked at arm's length, but expertise from both departments is crucial for data center cooling uptime and efficiency.
Because a lot is at stake when testing data center liquid cooling setups, IT also needs to get support from top management.
Overall, the data center industry is realizing liquid cooling and nanofluids might prove to be more effective for keeping data centers cool. Which coolants and technologies will emerge as the best options remains an open question.