


Dunking servers in a bath of oil sounds like the fastest way to break some very expensive hardware, but not for startup Green Revolution Cooling (GRC), which builds energy-efficient liquid-cooled server systems. Now, according to HPCwire, GRC’s first customer, colocation firm Midas Networks, will implement the technology later this year.

Here’s how it works: GRC fills a rack enclosure with a mineral oil mixture that doesn’t conduct electricity but has 1,300 times the heat capacity of air, and dunks the servers in it. The mixture can be cooled using much less energy, partly because it’s in direct contact with all of the hardware’s components, and a pump circulates the oil to a heat exchanger outside the building.
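That "1,300 times" figure refers to heat capacity per unit volume, and it checks out on the back of an envelope. The sketch below uses typical textbook property values for air and light mineral oil (my assumptions, not GRC's published figures):

```python
# Back-of-envelope check of the "1,300x the heat capacity of air" claim.
# Property values are typical textbook numbers, not GRC's own data.
AIR_DENSITY = 1.2          # kg/m^3, air at room temperature
AIR_SPECIFIC_HEAT = 1005   # J/(kg*K)
OIL_DENSITY = 850          # kg/m^3, typical light mineral oil
OIL_SPECIFIC_HEAT = 1900   # J/(kg*K), typical mineral oil

# Volumetric heat capacity = density * specific heat
air_vol_heat_cap = AIR_DENSITY * AIR_SPECIFIC_HEAT   # ~1,206 J/(m^3*K)
oil_vol_heat_cap = OIL_DENSITY * OIL_SPECIFIC_HEAT   # ~1,615,000 J/(m^3*K)
ratio = oil_vol_heat_cap / air_vol_heat_cap

print(f"Air: {air_vol_heat_cap:,.0f} J/(m^3*K)")
print(f"Oil: {oil_vol_heat_cap:,.0f} J/(m^3*K)")
print(f"Oil holds ~{ratio:,.0f}x more heat per unit volume")  # ~1,339x
```

In other words, a given volume of oil flowing past a hot chip carries away on the order of a thousand times more heat per degree of temperature rise than the same volume of air, which is why the pumps and heat exchanger can run on so much less energy than chillers and fans.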

GRC says its system can cut total data center power consumption nearly in half (45 percent). Cooling, traditionally done with large, expensive chillers, often accounts for between 40 and 60 percent of a data center’s electricity usage, meaning it eats up a lot of the budget. The energy use associated with data centers is expected to double over the next decade, with a Greenpeace report predicting that cloud computing will consume 1.96 trillion kWh of energy by 2020. According to estimates from an MIT and Carnegie Mellon report, Google has been spending about $38 million annually on electricity for data centers.
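The 45 percent figure is plausible given the cooling shares quoted above. As a rough consistency check (with illustrative assumptions of my own, not GRC's math): if cooling is half of facility power and immersion eliminates around 90 percent of that cooling load, the savings land right at 45 percent:

```python
# Rough consistency check of the "45 percent" total-savings claim.
# The cooling share and reduction fraction are illustrative assumptions.
total_power = 1.0          # normalized facility power draw
cooling_share = 0.50       # midpoint of the 40-60% range cited above
cooling_reduction = 0.90   # assume ~90% of the cooling load goes away

cooling_power = total_power * cooling_share          # 0.50
it_power = total_power - cooling_power               # 0.50
new_total = it_power + cooling_power * (1 - cooling_reduction)  # 0.55

savings = 1 - new_total
print(f"Total facility power savings: {savings:.0%}")  # 45%
```

The real-world number would shift with the actual cooling share and how much of the chiller, fan, and CRAC load the oil bath displaces, but the arithmetic shows the claim isn't outlandish.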

GRC’s setup is not as easy as simply pumping oil through an enclosure. The enclosure is tipped on its back, and the servers slide in vertically. Christiaan Best, the co-president of Green Revolution Cooling, says in this YouTube video that the fans and optical drives are also removed, and hard drives must be sealed. That means the wet architecture diverges from the “rack ‘em and stack ‘em” approach that has kept IT managers gainfully employed all these years.

Until server manufacturers and vendors give the official go-ahead (remember, some sell their own cooling systems), demand could be muted for a third party selling liquid cooling. Server manufacturers design their systems to operate in air-cooled server rooms, and without IBM’s or HP’s blessing, for example, IT departments will be reluctant to subject their hardware to potentially warranty-voiding modifications.

As Pedro explained in this article for GigaOM Pro (subscription required), liquid cooling has been used in a variety of systems for years, but has mainly been fodder for computer enthusiasts or used in specific applications in data centers. IBM has developed rack doors chilled with water to cool servers, and other companies have affixed “water blocks” to processors and other heat-generating components of a server in order to siphon off heat. Over the past year, GRC and other startups have emerged looking to get IT managers to embrace liquid cooling as a standard method for removing the heat generated by servers.

Another startup that’s been winning praise for its wet architecture is four-year-old Iceotope, based in Sheffield, UK. The company came out of stealth mode in November 2009 and has been demonstrating a liquid-cooled server setup that it says has the potential to cut data center cooling costs by up to 93 percent. Iceotope dunks server motherboards into modules filled with an “inert liquid” that doesn’t short out the delicate electronics, and the company has said it will sell its systems commercially sometime in the second half of 2010.

For more research on green data centers check out:

Are Liquid Cooled Servers Coming to a Data Center Near You?

Will Software Or Sensors Win In Data Center Efficiency?

Report: Green Data Center Design Strategies

