Summary:

A handful of Intel servers just emerged from a yearlong bath in an oil-based coolant, and the results were remarkable. The servers ran at a PUE just above 1.0, and showed no ill effects from the oil. Is oil immersion coming to a rack near you?

An Intel motherboard in oil (photo: Intel)

The idea of immersing servers in oil to keep them cool isn’t entirely new — passionate gamers have been housing their systems in vegetable oil for years. But it’s time to take notice of this trend when Intel starts singing its praises as a potentially revolutionary method for slashing the price of running a data center.

The microprocessor giant just finished a yearlong test of Green Revolution Cooling’s mineral-oil server-immersion technology and is very happy with the results. According to Mike Patterson, senior power and thermal architect at Intel, not only does the technology appear perfectly safe for server components, but it might also become the norm for anyone needing maximum computing power or building out data center capacity.

Oil immersion, you say?

For the record, Green Revolution Cooling has been in the spotlight for a few years now and has reportedly attracted some big users — GigaOM’s own Katie Fehrenbacher actually profiled it and its technology in 2010 — but Intel is the biggest company to publicly come out as one of those customers. In essence, Green Revolution’s product, called CarnotJet, houses servers in a specialized coolant oil that absorbs the heat they give off; the oil is then sent to a radiator, where it’s cooled before being recycled back into the server housing.

And the technology is incredibly effective. Patterson said that whereas traditional air-cooled server racks often operate at a Power Usage Effectiveness rating of about 1.6 (meaning cooling tacks on a 60 percent increase over the power needed to run the servers’ computing workloads), Intel’s oil-immersed servers were operating at a PUE between 1.02 and 1.03. It’s possible to achieve similarly low PUE ratings with traditional air- and liquid-cooling methods, Patterson said, but getting there can require some serious engineering effort and cost a lot of money.
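
To put those numbers in perspective, PUE is simply total facility power divided by the power delivered to the IT gear, so the overhead it implies is easy to work out. Here is a rough, purely illustrative sketch in Python (the 100 kW IT load is an assumed figure, not anything Intel reported):

```python
# Back-of-the-envelope look at what a PUE figure implies for overhead energy.
# The 100 kW IT load below is an illustrative assumption, not an Intel number.

def overhead_kwh(it_load_kw: float, pue: float, hours: float) -> float:
    """Energy spent on cooling and other overhead: total power minus IT power."""
    total_kw = it_load_kw * pue
    return (total_kw - it_load_kw) * hours

IT_LOAD_KW = 100.0
HOURS_PER_YEAR = 24 * 365

for pue in (1.6, 1.03):
    extra = overhead_kwh(IT_LOAD_KW, pue, HOURS_PER_YEAR)
    print(f"PUE {pue}: ~{extra:,.0f} kWh of overhead per year on a {IT_LOAD_KW:.0f} kW IT load")
```

On that assumed load, a PUE of 1.6 translates into roughly 525,000 kWh of overhead a year, while 1.03 works out to about 26,000 kWh.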

As for concerns over the effect of all that oil on the servers’ processors, hard drives and other components, Patterson says companies probably shouldn’t sweat it. When its test period ended, Intel sent its servers to its failure-analysis lab, which, he said, “came back with a thumbs up that a year in the oil bath had no ill effects on anything they can see.”

Designing for the future

To be clear, though, Intel isn’t about to replace all the air-cooled servers in its data centers with oil-cooled ones. Rather, Patterson said, it’s just now in the evaluation phase. “We’re doing our math to understand if we developed an oil optimized platform, what that would mean [for performance, efficiency, etc.].”

Intel’s server racks in their oil homes.

The results might mean Intel adopts the technology for production, or might mean Intel can help its server-manufacturer partners drive oil immersion into the mainstream. The first serious users of the technology — even within Intel — will probably be high-performance computing departments that want to put as much power as possible toward computing and as little as possible toward cooling. “I think once it has proven itself in the HPC arena,” Patterson said, “further adoption will be the next step.”

He’s right about the heat-related pain points in HPC. I recently spoke with Greg Rusu of Peer1 Hosting, who explained some of the economic challenges of its on-demand HPC cloud service, Zunicore. Unlike the commodity scale-out servers that populate many cloud data centers, Peer1’s HPC servers are chock full of powerful processors and GPUs and can cost tens of thousands of dollars per machine.

And then there’s the power bill for those high-performance servers: “These servers easily consume anywhere between 15 to 20 times the power of a high-end Xeon server,” Rusu said. “… They’re not just a little bit hotter, they’re a lot hotter.” While immersing Peer1’s HPC servers in oil might not cut down on their sticker prices, it could significantly reduce the cost of operating them. A handful of Green Revolution’s early customers are research centers running HPC systems.
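
For a rough sense of what that means for the power bill, here is a hypothetical calculation (the 5 kW server draw and $0.10/kWh electricity price are assumptions for illustration, not Peer1 figures) comparing the yearly cost of running one such machine under air cooling at a PUE of 1.6 versus oil immersion at 1.03:

```python
# Hypothetical yearly electricity cost for one GPU-dense HPC server.
# The power draw and electricity price are illustrative assumptions.

SERVER_KW = 5.0        # assumed draw for a GPU-heavy HPC server
PRICE_PER_KWH = 0.10   # assumed electricity price in USD
HOURS_PER_YEAR = 24 * 365

def annual_cost(server_kw: float, pue: float) -> float:
    """Yearly cost of powering the server plus the cooling overhead its PUE implies."""
    return server_kw * pue * HOURS_PER_YEAR * PRICE_PER_KWH

air_cooled = annual_cost(SERVER_KW, 1.6)    # typical air-cooled facility
oil_cooled = annual_cost(SERVER_KW, 1.03)   # the PUE Intel reported for oil immersion
print(f"Air-cooled:   ${air_cooled:,.0f} per year")
print(f"Oil-immersed: ${oil_cooled:,.0f} per year (difference: ${air_cooled - oil_cooled:,.0f})")
```

Under those assumptions, oil immersion trims roughly $2,500 a year from a single server’s bill, which adds up quickly across a data center full of them.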

Taking oil immersion mainstream

But not everyone is willing to retrofit air-cooled servers for a life in oil, so taking oil immersion mainstream might require letting businesses buy off-the-shelf servers designed for oil immersion. Patterson said Intel’s research into how to build oil-optimized servers could result in a reference architecture around which server manufacturers could begin building such systems.

Most servers today follow design principles for optimal airflow, but “we could throw some of those rules out,” he said, and maybe build a better server. The obvious steps are eliminating anything to do with fans, sealing hard drives (or going to solid-state drives) and replacing any organic materials that might leach into the oil. A redesign of the heat sink probably would be in order, as would a rethinking of where things sit on the motherboard.

A motherboard sitting in oil.

Although it’s a new idea to most companies, Patterson thinks the cost savings associated with oil immersion might make it more palatable than you’d think. “I think the C-level [acceptance] may actually happen sooner,” he said. “… The bigger hurdle might be the data center operations folks themselves who don’t pay the energy bill.”

This is especially true as companies start building out data center space and are looking to save on construction costs as well as energy bills. Oil immersion means there’s no need for chillers, raised floors or other costly measures typically required for air cooling. And, Patterson added, it’s possible the energy stored in the hot oil could be reused more easily than the warm air servers return today, thus making a data center even more efficient.

However, as with anything new, even penny-pinching CFOs will have to come to terms with an entirely new way of operating their servers. “The first time you hear about it, you think, ‘Oh, come on, that’s a crazy idea’ … ” Patterson said. “You just have to get past the initial reaction. I think it’s an emotional response more than anything.”

Comments

  1. How do you maintain this stuff? Sticky fingers and oil dripping all over the floor every time you need to swap a blade out?

    1. Derrick Harris Friday, August 31, 2012

      That’s actually why he said operations guys might have the hardest time adjusting. Gotta let oil drip out after you pull something.

      1. If you built a drip tray that vibrated, it wouldn’t be that big of a deal.

  2. Great concept, but why do you need to immerse the whole server into the pond? Maybe this requires a rethink in board layout design.

    1. Michael V. Pelletier Saturday, October 27, 2012

      Good point – given today’s interconnect technologies, maybe they could connect the front-side-bus of the CPU via a cable, and turn the CPUs of the systems into hot-pluggable modules like glorified flash drives – just keep the CPUs and graphics cards in oil, and everything else stays dry.

  3. So what’s new? Dielectric oils have been used to cool and suppress sparks in transformers and high-voltage circuit breakers for ages. Although the current generation of oils is free of polychlorinated biphenyls (PCBs), to call them environmentally friendly would be an oxymoron.

  4. Electric transformers have been oil-cooled for years, maybe since the 1930s.
    In my experience with these, the oil is not sticky-gooey like cooking oil. And the units are sealed, so the oil stays clean for years.
    Changing a blade, though, might take some ‘out of the box’ thinking and engineering.

  5. Very cool stuff. We were testing oil submersion for a while but never took it all the way.

    We’re still using water cooling for high overclocks, but think it would be REALLY cool to have a fully submerged system (although then maintenance would be tricky… I mean, oily!)

  6. This isn’t revolutionary – it’s more like recovery of a lost art. Supercomputers and some high-end mainframes had liquid cooling more than 35 years ago, since the Cray-2 or thereabouts. Of course each machine was an individual that needed a battalion of techs to keep it functioning, rather than running reliably in quantities of tens of thousands.

  7. Data centers will look like machine shops, with grimy oily floors. I’m not sure how this will enhance the prestige of the profession either…

    1. Michael V. Pelletier Saturday, October 27, 2012

      CityBadger: our prestige doesn’t come from the actions we take in maintaining and repairing the systems, or the way in which we get our hands dirty; it comes from the power and capability our flawlessly operating and cost-effective systems can deliver to our users to allow them to achieve their goals.

  8. This doesn’t sound attractive if you treat this too literally — taking designs similar to current air-cooled servers and dunking them in oil. But if you think about where the biggest heat generators are — the CPU and GPU — they are candidates for encapsulation into a dedicated unit that can be made to efficiently couple heat to an external closed-loop oil chiller system; or with some engineering, to actually be added/removed from the loop with very little oil leakage.

    What’s exciting is that this immersion approach is possible with relatively inexpensive oil, instead of the inert liquids (like Fluorinert) that were used with supercomputers in the past.

  9. “Passionate gamers have been housing their systems in vegetable oil for years”: it’s mineral oil, not vegetable. Vegetable oil will break down; mineral oil won’t.
