As ethereal as the terms “web services” and “cloud computing” sound, there’s nothing lightweight about the power and cooling required to make the Internet run. It takes data centers, plain and simple, each running 24/7 and housing thousands of servers and data storage systems, to satisfy our growing appetite to tweet all day long, watch the Olympics streamed online, and Skype our friends across the globe. If you’re thinking it’s a recipe for sky-high power bills and added greenhouse emissions, you’re right. Let’s take a look at how some of the biggest web firms are handling IT infrastructure growth while bringing technology and innovative data center design principles to bear on lowering energy costs and reducing carbon emissions.
While Google fiercely guards its technical secrets, it has pulled back the curtain enough to reveal how it tackles energy efficiency in the data center. It started with the surprise announcement that the company uses custom servers with built-in batteries, eliminating the energy loss caused by traditional uninterruptible power supply (UPS) systems, and in recent months Google has revealed a virtual blueprint for operating a green data center.
Part of this blueprint involves “containerized” data centers. Rather than arranging server racks like rows of dominoes in one large enclosed space, Google packs servers into shipping containers, which are easier to cool and maintain. Other environmentally friendly methods include air-side economizers (free cooling) and reusing or recycling 100 percent of its servers. For an extreme example of Google’s approach, look no further than its completely chiller-less facility in Belgium. During the week or so each year when temperatures rise above the safe zone, work is offloaded to Google’s other data centers and the servers are shut down.
Facebook is breaking away from leased data center space as an IT growth strategy and is now building its own computing facility in Prineville, Oregon. Expected to reach completion in 2011, the new data center will rely on free cooling, evaporative cooling and a proprietary UPS to help Facebook reach its goal of a super-low power usage effectiveness (PUE) rating of 1.15; the industry average hovers around the 2.0 mark.
In a novel cost-saving twist, the facility will also recapture server waste heat and use it to warm offices when the weather turns cold. Like Google, Facebook is pursuing custom, energy-efficient servers that are tailored to the computing task at hand and avoid the power-sapping overhead of industry-standard servers.
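PUE is a simple ratio: total facility power divided by the power that actually reaches IT equipment, so Facebook’s 1.15 target means only 15 percent overhead goes to cooling, power conversion and the like. A minimal sketch of the arithmetic (the wattage figures are hypothetical, chosen only to illustrate the two ratings mentioned above):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power divided by the
    power delivered to IT equipment. 1.0 is perfect; lower is better."""
    return total_facility_kw / it_equipment_kw

# Hypothetical figures for illustration only.
print(round(pue(10_000, 8_700), 2))  # 1.15 -- Facebook's Prineville target
print(round(pue(10_000, 5_000), 2))  # 2.0  -- roughly the industry average
```

Put another way, a 2.0-PUE facility burns a full extra watt of overhead for every watt of computing, while a 1.15 facility burns only 0.15.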
Because it is part of News Corp, it’s difficult to disentangle MySpace’s IT infrastructure from that of its parent company. Nonetheless, some intriguing details have emerged. MySpace has invested in solid-state storage from Fusion-io, helping the company boost performance and save $120,000 in energy costs a year. The move cut the power and cooling costs of running some of its hard drive-based systems by 99 percent and helped the company massively downsize its server footprint.
Chicken coops sound as far removed from cutting-edge data centers as you can get, but Yahoo is banking on elements of this low-tech design to maximize its free-cooling potential in Lockport, NY and reach a PUE of 1.1. Situated near Buffalo, the $150 million complex will draw from the region’s hydroelectric sources to power its servers.
It’s a combo that attracted the U.S. Department of Energy’s attention and helped Yahoo capture a nice chunk of stimulus funds meant to spur advancements in data center energy efficiency. Apart from its new build, Yahoo has emerged as a proponent of cold-air containment and containers in recent years.
Microsoft is adding data center capacity at a brisk pace as it ramps up its Web- and cloud-based services. In Northlake, Ill., near Chicago, the software giant employs shipping containers in conjunction with a more traditional data center design and uses hot/cold air containment to lower cooling costs. In Dublin, Ireland, the site of another Live Services and Azure cloud computing center, Microsoft is using the region’s climate to forgo mechanical cooling as much as 95 percent of the time.
Image is an artist’s rendering of Facebook’s data center.