The idea of immersing servers in oil to keep them cool isn’t entirely new — passionate gamers have been housing their systems in vegetable oil for years. But it’s time to take notice of this trend when Intel starts singing its praises as a potentially revolutionary method for slashing the price of running a data center.
The microprocessor giant just finished a yearlong test of Green Revolution Cooling’s mineral-oil server-immersion technology and is very happy with the results. According to Mike Patterson, senior power and thermal architect at Intel, not only does the technology appear perfectly safe for server components, but it might also become the norm for anyone needing maximum computer power or building out data center capacity.
Oil immersion, you say?
For the record, Green Revolution Cooling has been in the spotlight for a few years now and has reportedly attracted some big users — GigaOM’s own Katie Fehrenbacher actually profiled it and its technology in 2010 — but Intel is the biggest company to publicly come out as one of those customers. In essence, the company’s product, called CarnotJet, houses servers in a specialized coolant oil that absorbs the heat they give off; the oil is then pumped to a radiator, cooled and recycled back into the server housing.
And the technology is incredibly effective. Patterson said that whereas traditional air-cooled server racks often operate at a Power Usage Effectiveness rating of about 1.6 (meaning cooling tacks on a 60 percent increase over the power needed to run the servers’ computing workloads), Intel’s oil-immersed servers were operating at a PUE between 1.02 and 1.03. It’s possible to achieve similarly low PUE ratings with traditional air- and liquid-cooling methods, Patterson said, but getting there can require some serious engineering effort and cost a lot of money.
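The PUE arithmetic above is simple enough to sketch. Here is a minimal illustration (the figures and function names are mine, chosen to mirror the article’s numbers, not anything from Intel’s test):

```python
# PUE (Power Usage Effectiveness) = total facility power / IT equipment power.
# A PUE of 1.6 means cooling and other infrastructure add 60% on top of the
# power the servers themselves consume.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Ratio of total data center power draw to power reaching the IT gear."""
    return total_facility_kw / it_equipment_kw

def overhead_pct(pue_value: float) -> float:
    """Power spent on cooling/infrastructure, as a percent of the IT load."""
    return (pue_value - 1.0) * 100

it_load_kw = 100.0  # hypothetical IT load for illustration

air_cooled = pue(160.0, it_load_kw)   # typical air-cooled rack: PUE 1.6
oil_cooled = pue(103.0, it_load_kw)   # oil-immersed figure cited: PUE 1.03

print(f"Air-cooled:   PUE {air_cooled:.2f}, {overhead_pct(air_cooled):.0f}% overhead")
print(f"Oil-immersed: PUE {oil_cooled:.2f}, {overhead_pct(oil_cooled):.0f}% overhead")
```

At the same 100 kW IT load, the air-cooled facility burns 60 kW on overhead while the oil-immersed one burns 3 kW — which is the gap Patterson says is otherwise expensive to close with conventional cooling.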
As for concerns over the effect of all that oil on the servers’ processors, hard drives and other components, Patterson says companies probably shouldn’t sweat it. When its test period ended, Intel sent its servers to its failure-analysis lab, which, he said, “came back with a thumbs up that a year in the oil bath had no ill effects on anything they can see.”
Designing for the future
To be clear, though, Intel isn’t about to replace all the air-cooled servers in its data centers with oil-cooled ones. Rather, Patterson said, it’s just now in the evaluation phase. “We’re doing our math to understand if we developed an oil optimized platform, what that would mean [for performance, efficiency, etc.].”
The results might mean Intel adopts the technology for production, or might mean Intel can help its server-manufacturer partners drive oil immersion into the mainstream. The first serious users of the technology — even within Intel — will probably be high-performance computing departments that want to put as much power as possible toward computing and as little as possible toward cooling. “I think once it has proven itself in the HPC arena,” Patterson said, “further adoption will be the next step.”
He’s right about the heat-related pain points in HPC. I recently spoke with Greg Rusu of Peer1 Hosting, who explained to me some of the economic challenges of its on-demand HPC cloud service called Zunicore. Unlike the commodity scale-out servers that populate many cloud data centers, Peer1’s HPC servers are chock full of powerful processors and GPUs and can cost tens of thousands of dollars per machine.
And then there’s the power bill of those high-performance servers: “These servers easily consume anywhere between 15 to 20 times the power of a high-end Xeon server,” Rusu said. “… They’re not just a little bit hotter, they’re a lot hotter.” While immersing Peer1’s HPC servers in oil might not cut down on their sticker prices, it could significantly reduce the cost of operating them. A handful of Green Revolution’s early customers are research centers running HPC systems.
Taking oil immersion mainstream
But not everyone is willing to retrofit air-cooled servers for a life in oil, so taking oil immersion mainstream might require letting businesses buy off-the-shelf servers designed for oil immersion. Patterson said Intel’s research into how to build oil-optimized servers could result in a reference architecture around which server manufacturers could begin building such systems.
Most servers today follow design principles for optimal airflow, but “we could throw some of those rules out,” he said, and maybe build a better server. The obvious steps are eliminating anything to do with fans, sealing hard drives (or going to solid-state drives) and replacing any organic materials that might leach into the oil. A redesign of the heat sink probably would be in order, as would a rethinking of where things sit on the motherboard.
Although it’s a new idea to most companies, Patterson thinks the cost savings associated with oil immersion might make it more palatable than you’d think. “I think the C-level [acceptance] may actually happen sooner,” he said. “… The bigger hurdle might be the data center operations folks themselves who don’t pay the energy bill.”
This is especially true as companies start building out data center space and are looking to save on construction costs as well as energy bills. Oil immersion means there’s no need for chillers, raised floors or other costly measures typically required for air cooling. And, Patterson added, it’s possible the energy stored in the hot oil could be reused more easily than the warm air servers return today, thus making a data center even more efficient.
However, as with anything new, even penny-pinching CFOs will have to come to terms with an entirely new way of operating their servers. “The first time you hear about it, you think, ‘Oh, come on, that’s a crazy idea’ … ” Patterson said. “You just have to get past the initial reaction. I think it’s an emotional response more than anything.”