Enthusiasts have been using extreme cooling solutions for years, including water cooling loops, full oil submersion, and Peltier coolers. These setups have allowed DIY builders to run heavily overclocked systems without cooking their hardware, but they have generally been too esoteric to appeal to the enterprise world.
Demand to pack more hardware into the same amount of data center space has driven the electrical and monetary cost of air cooling to prohibitive levels. This situation has encouraged some companies to explore alternatives, such as full-submersion oil cooling and water cooling.
By fully submerging the hardware, oil can more effectively transfer heat away from the components and out of the facility. Much like the home-built “aquarium PCs,” oil-cooled servers also have a pump to circulate the oil and a radiator to cool it down before it’s returned to the system. In that respect it is similar to water cooling, just without the water blocks. Like water, oil has a far higher volumetric heat capacity than air, meaning it can absorb more heat for a given volume of coolant.
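The volume argument can be made concrete with a back-of-the-envelope calculation. The sketch below compares the volumetric heat capacity (density times specific heat) of air, a generic mineral oil, and water; the property values are rough textbook figures for illustration, not vendor specifications for any particular coolant.

```python
# Back-of-the-envelope comparison of volumetric heat capacity
# (rho * c_p): how much heat 1 m^3 of coolant can absorb per kelvin.
# Property values are approximate textbook numbers, not vendor specs.

coolants = {
    # name: (density in kg/m^3, specific heat in J/(kg*K))
    "air":         (1.2,  1005),
    "mineral oil": (850,  1900),
    "water":       (1000, 4186),
}

def volumetric_heat_capacity(rho, cp):
    """Heat absorbed per cubic metre per kelvin, in J/(m^3*K)."""
    return rho * cp

for name, (rho, cp) in coolants.items():
    q = volumetric_heat_capacity(rho, cp)
    print(f"{name:12s} {q:>12,.0f} J/(m^3*K)")

air = volumetric_heat_capacity(*coolants["air"])
oil = volumetric_heat_capacity(*coolants["mineral oil"])
print(f"oil absorbs roughly {oil / air:,.0f}x more heat than air per unit volume")
```

With these assumed values the oil-to-air ratio lands in the low thousands, which is at least consistent in magnitude with vendor claims like the “1,350 times” figure quoted below, though an engineered coolant's exact properties will differ.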
Where oil-cooled servers differ from home setups is in the scale, as data centers require massive radiators and heat-exchange cooling stages. These systems use specially engineered dielectric oils that do not conduct electricity; Hardcore Computer claims its Core Coolant offers “1,350 times greater heat rejection capacity by volume than air.” 3M’s Fluorinert is another popular choice for liquid cooling, as it is many times more efficient than air.
Some companies, such as Green Revolution Cooling, let data centers overhaul their existing blade servers and then submerge them vertically in a large container of oil. The oil is then circulated by a pump through a radiator located outside the building. Boston, another oil cooling company, offers pre-configured, hot-swappable micro blade servers running two Xeon 5600 CPUs that slide into server racks. A series of pipes on the rear of the rack pumps oil through each blade, cooling the internal hardware. In either case, oil cooling enables the data centers to pack more hardware into the same amount of space.
Traditionally, data centers rely on massive AC systems and spaced-out server racks on raised floors. As demand for more hardware increases and server density goes up, however, air cooling becomes less efficient, forcing data centers to pour money into complex and expensive air cooling. Oil cooling can cool closely packed hardware far more efficiently and at a lower cost. For example, Green Revolution Cooling claims up to a 95% reduction in cooling costs and a 10% to 20% reduction in power usage with its fully submerged oil cooling solutions. With oil cooling, data centers do not have to rely on the massive AC systems that can account for close to half of a data center’s cooling budget. IDC has also found that for every $1 spent on hardware, it costs $0.50 to keep the servers powered on and cooled, and at the 2011 Green Data Conference it was reported that cooling systems can consume as much as 40% of a data center’s total power.
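To see how those headline figures interact, the sketch below applies a 95% cut to a cooling load that makes up 40% of total facility power. The 1 MW facility size is a made-up illustrative number, and this is best-case arithmetic; real savings depend on the facility's actual cooling share and how much of it a retrofit touches.

```python
# Rough arithmetic with the article's headline figures: if cooling
# draws up to 40% of total facility power, what does a 95% cut in
# cooling energy do to the overall draw? Illustrative only.

def facility_power_after_retrofit(total_kw, cooling_fraction, cooling_reduction):
    """Total facility power after cutting the cooling share by a fraction."""
    cooling_kw = total_kw * cooling_fraction
    other_kw = total_kw - cooling_kw          # IT load and everything else
    new_cooling_kw = cooling_kw * (1 - cooling_reduction)
    return other_kw + new_cooling_kw

before = 1000.0  # hypothetical 1 MW facility
after = facility_power_after_retrofit(before,
                                      cooling_fraction=0.40,
                                      cooling_reduction=0.95)
print(f"before: {before:.0f} kW, after: {after:.0f} kW "
      f"({1 - after / before:.0%} total reduction)")
```

Under these best-case assumptions the total draw falls by 38%; a more typical cooling share or a partial retrofit would land closer to the 10% to 20% power reduction quoted above.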
Intel recently performed a study [PDF] which determined that liquid cooling works well for keeping racks of high-powered compute servers cool, but that it becomes more costly than air cooling for racks of low-powered hardware (e.g., storage servers). Rather than choosing between cooling entirely with air or entirely with liquid, Intel recommends a heterogeneous approach that gives data centers a best-of-both-worlds solution that is as cost- and energy-efficient as possible.
While oil cooling offers greater efficiency and allows servers to run quieter and be more densely packed, it does have some barriers to adoption. Liquid-cooled server racks are much heavier, and data centers would need to ensure that their floors could sustain the weight (especially if they currently use a raised floor). The initial overhaul of the data center, installing a pump, radiator, and all the necessary connecting pipes, would also be expensive. Further, fully submerged hardware is less accessible, making upgrades and fault repairs more difficult and requiring additional training for technicians.
In the end, oil cooling servers can increase server density and reduce cooling costs and electrical usage. New installations that need as much compute power as possible in a given amount of space, such as supercomputers, would benefit the most from liquid cooling. Especially in parts of the world where the ambient air is hotter and more humid to start with, using oil instead of forced air conditioning is more efficient and can save on electrical and cooling costs. As oil cooling technology improves, and more companies develop solutions around it, oil cooling has the potential to become a viable alternative for future data center servers.