Data centers’ ability to suck up inordinate amounts of electricity is turning them into the Hummers of the computing world. And much like Hummers, their power-guzzling ways mean they are becoming increasingly costly to run. We’ve already covered companies’ efforts to reduce heat, increase server utilization and build green data centers. Now Andrew Hopper, head of the Cambridge University Computing Lab, is working on a solution that could reduce the demand data centers place on the grid.
Hopper’s vision combines cloud computing and renewable energy: He wants to take electrical transmission costs out of the equation by placing a data center directly at the site of a renewable energy source and using fiber-optic cable to link it to the entity that uses it. Hopper is also a co-founder of Level 5 Networks, which was bought by 10 Gigabit chip maker SolarFlare.
Virtualization and fast Ethernet, the technologies that enable services such as Amazon’s EC2, make Hopper’s idea feasible. The ability to separate software from hardware through virtualization is what allows computing clouds to exist. Those clouds let companies, developers and anyone else with access to the resource ship their computing jobs over to Amazon’s servers, no matter where in the world those servers are located.
The challenge is figuring out how to build software that can monitor electrical generation, prioritize compute jobs and then figure out when and where to send those jobs based on whether the wind is blowing or the sun is shining. Hopper believes it would make sense to attach data centers, possibly in a container (as shown in the image), directly to a renewable energy source. The source could be located in the middle of a desert, on a platform attached to an ocean wind turbine, or anywhere else where power could be easily generated.
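A scheduler like the one described above could be sketched as follows. This is a minimal illustration, not anything from Hopper's work: the site names, power readings and job figures are all hypothetical, and real generation data would come from monitoring the turbines or panels themselves.

```python
from dataclasses import dataclass, field
import heapq

@dataclass
class Site:
    name: str
    available_kw: float  # current renewable output at this site (hypothetical reading)

@dataclass(order=True)
class Job:
    priority: int                        # lower number = more urgent
    name: str = field(compare=False)
    demand_kw: float = field(compare=False)

def schedule(jobs, sites):
    """Greedily assign the most urgent jobs to the sites with the most spare power."""
    heapq.heapify(jobs)  # min-heap ordered by priority
    placements = []
    while jobs:
        job = heapq.heappop(jobs)
        # Send the job wherever the wind is blowing or the sun is shining hardest.
        best = max(sites, key=lambda s: s.available_kw)
        if best.available_kw < job.demand_kw:
            placements.append((job.name, None))  # defer: no site has enough power now
        else:
            best.available_kw -= job.demand_kw
            placements.append((job.name, best.name))
    return placements

sites = [Site("desert-solar", 120.0), Site("offshore-wind", 80.0)]
jobs = [Job(1, "indexing", 50.0), Job(2, "backup", 100.0), Job(3, "analytics", 90.0)]
print(schedule(jobs, sites))
```

In this toy run, the urgent indexing job lands on the solar site, while the two larger jobs are deferred until generation picks up, which is the essence of prioritizing compute against intermittent supply.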
One of the issues with renewable energy is that the places where it’s most abundant, such as winds blowing across the ocean or sunlight in the desert, are remote and thus expensive to connect to the electrical grid. An example is Texas’ $4.9 billion plan to bring wind energy generated in the barren western part of the state to the more populous center. Bringing the data center to the power solves that problem, as long as the site can be reached with far cheaper fiber-optic cable. Servicing such remote data centers remains an issue, but having multiple ones around the world offers redundancy.
But there are still computing problems to work out if this vision is to materialize in the next decade or so. “If it turns out you’re chasing the energy and copying a lot of data, then that’s less attractive,” says Hopper. “But with good caching, and if you’re only moving the data once or twice, it might work. You could design software similar to old-fashioned job scheduling on a mainframe. Back then the scarcity was the computing and today it’s energy.”
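Hopper’s caching point can be sketched as a placement rule: run a job where its data already sits if that site has power, and only when it doesn’t, pay the transfer cost once and cache the data at the new site so the next run is free. The site names and kWh costs below are illustrative assumptions, not figures from Hopper.

```python
def place_job(dataset, data_location, site_power, transfer_cost_kwh, run_cost_kwh):
    """Return the site a job should run at, moving its data at most once.

    data_location maps dataset name -> site currently caching it;
    site_power maps site name -> energy currently available (kWh, hypothetical).
    """
    home = data_location[dataset]
    if site_power[home] >= run_cost_kwh:
        return home  # no copy needed: the data is already cached here
    # Chase the energy: only sites that can afford the move plus the run qualify.
    candidates = [s for s, p in site_power.items()
                  if s != home and p >= run_cost_kwh + transfer_cost_kwh]
    if not candidates:
        return None  # defer until some site has surplus generation
    target = max(candidates, key=lambda s: site_power[s])
    data_location[dataset] = target  # cache the copy so future runs move nothing
    return target

locations = {"logs": "offshore-wind"}
power = {"offshore-wind": 5.0, "desert-solar": 50.0}
print(place_job("logs", locations, power, transfer_cost_kwh=10.0, run_cost_kwh=20.0))
```

Here the becalmed wind site can’t run the job, so the data is copied once to the solar site and cached there; a repeat of the same job would then run without any further copying, which is the “moving the data once or twice” case Hopper calls workable.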
As data centers take up more and more energy, Hopper’s ideas may help the computing industry solve one of its fastest-growing problems.
Image of a theoretical modular data center installation courtesy of Sun Microsystems