Dunking servers in liquid as a way to cool them is one of the more unconventional ways to reduce the energy consumption of data centers. But the latest Internet giant to eye the splashy tech is Google (s GOOG), which, according to Data Center Knowledge, has patented a design for a liquid-cooled server sandwich: two motherboards wrapped around a liquid-cooled heat sink.
The idea behind liquid cooling for servers is that liquids are much better at transferring heat than air. Given that about half of a data center's energy consumption goes toward powering cooling technology (largely giant, inefficient chillers), finding more efficient cooling options will be critical to reducing the carbon footprint of the Internet. Even using natural outside air to cool data centers is a more efficient, and increasingly attractive, option than the standard chillers.
Liquid-cooled servers can offer huge potential energy efficiencies and financial savings. According to this GigaOM Pro report (subscription required) on liquid-cooled servers, U.K.-based startup Iceotope says its technology can cut data center cooling costs by a whopping 93 percent. Another startup, Green Revolution Cooling, says its system can cut total data center power consumption nearly in half (by 45 percent).
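As rough arithmetic, those two claims line up with each other: if cooling accounts for about half of a facility's load and liquid cooling eliminates roughly 93 percent of it, total consumption drops by about 46 percent, close to Green Revolution Cooling's 45 percent figure. Here is a minimal back-of-the-envelope sketch; the 1,000 kW facility load is a hypothetical number chosen only for illustration:

```python
# Back-of-the-envelope check of the savings claims above.
TOTAL_LOAD_KW = 1000.0   # hypothetical total data center draw
COOLING_SHARE = 0.5      # roughly half of consumption goes to cooling
COOLING_CUT = 0.93       # Iceotope's claimed cooling-cost reduction

cooling_kw = TOTAL_LOAD_KW * COOLING_SHARE        # load spent on cooling
saved_kw = cooling_kw * COOLING_CUT               # load eliminated by liquid cooling
total_reduction = saved_kw / TOTAL_LOAD_KW        # as a share of the whole facility

print(f"Cooling load: {cooling_kw:.0f} kW")
print(f"Saved by liquid cooling: {saved_kw:.0f} kW")
print(f"Total facility reduction: {total_reduction:.1%}")
```

The result, about a 46.5 percent cut in total facility power, sits right next to the 45 percent reduction Green Revolution Cooling claims for its own system.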
Because liquid-cooled servers are such a new idea, vendors have come up with a wide array of designs; Google's server sandwich is just the latest. IBM (s IBM) supplies rack doors chilled with water to cool servers. Other companies affix "water blocks" to processors and other heat-generating components of a server to siphon off heat. Iceotope dunks entire server motherboards into modules filled with an "inert liquid" that doesn't short out the delicate electronics.
There are still a variety of hurdles for the nascent liquid-cooled server market. The technology started out several years ago as a way for DIY-computing enthusiasts to squeeze extra performance out of their machines, and has only recently begun to move into a more commercial phase. As we laid out in this GigaOM Pro report, obstacles for the market include voided warranties on traditional servers, the need to rethink data center layouts, and getting IT managers comfortable with the idea of liquids in the traditionally liquid-free data center environment.
Who knows whether Google will actually implement its liquid-cooled server tech; several of its server designs, like the wave-powered data center, seem more theoretical than ready to build. But Google has been particularly focused on reducing its data center energy consumption and has turned to a variety of novel technologies to meet that goal.
Last year Google revealed that it had eliminated the standard centralized backup power supply in favor of a battery-per-server approach. And Google has been investigating how servers can be redesigned, and enhanced with software, so that they are as energy efficient under light load (say, when web services are pinging them less frequently) as they are at maximum use, like during the livestream of Obama's inauguration (see Google: Servers Should Be More Like People, GigaOM Pro).
Image courtesy of clayirving’s Flickr feed Creative Commons.