Indexing the world’s information and making it accessible takes a lot of people, a lot of machines and a lot of energy.
I was talking to a good friend recently and reported some hearsay that powering a server over its useful life now costs more than the server itself. I found that amazing, but his response was even more astounding. “Well, we should put them in poor people’s houses to give them heat,” he quipped.
It sounds dumb at first, but really, it’s pure genius. If that much energy is being used, and half of that energy is used for cooling, we could put those servers to work as electric heaters. The “host families” could also get some broadband access, and institutions would save on data center build-outs. It’s a shame that our culture and the technical practicalities of distributed computing make the idea impractical.
But it got me thinking. How much energy really is burned in those big data centers? What follows is guesstimation and inference based on popular opinion and, er, Google search returns. (I may appear to pick on Google, but it’s just because it happens to be a convenient example…)
- Google is rumored to have anywhere between half a million and 1 million machines in data centers around the world. I am assuming it is the largest single-purpose commercial installation that we know about. (Let’s not think about the government’s data demands for now.)
- Each machine consumes about 500 watts of energy, including cooling systems.
- Energy overhead for networking and other support structures is nominal, so I’ll ignore it for my guesstimate.
So, let’s take the worst case here:
1,000,000 machines drawing 500 watts each = 500 megawatts, or half a gigawatt of continuous power.
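The arithmetic above can be sketched in a few lines. This is a minimal back-of-the-envelope check, using only the post’s own assumed figures (1,000,000 machines, 500 watts each including cooling), not measured data:

```python
# Guesstimate inputs from the post, not measured data.
machines = 1_000_000          # rumored upper bound on Google's fleet
watts_per_machine = 500       # assumed draw per machine, cooling included

total_watts = machines * watts_per_machine
total_gigawatts = total_watts / 1e9

print(total_gigawatts)        # 0.5 — half a gigawatt of continuous draw
```

Note that watts already measure a rate of energy use, so "half a gigawatt" is a continuous draw, not an amount consumed per hour.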
Wow. That’s a lot. In Google’s own words, that’s about half the power a city the size of San Francisco needs.
That poses a worrying thought: Information on the web is increasing exponentially, and Google will increase its capacity to meet demand. Even doubling that energy use would require the kind of power produced by a mid-size nuclear reactor. Now, I’m sure my calculations are a little over-dramatic, but the quantities are the right order of magnitude, and kind of astonishing when you think about it. And Google is not the only one. The cloud computing craze has to be powered somehow, and the cloud’s power will come from a huge collection of these data centers.
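The scaling worry can be made concrete with the same guesstimate numbers. Doubling half a gigawatt gives one gigawatt, which is roughly the electrical output of a typical large nuclear reactor; sustained over a year, the current figure alone works out to a few terawatt-hours:

```python
# Starting point from the guesstimate above (an assumption, not a measurement).
current_gw = 0.5

# Doubling capacity to meet demand growth:
doubled_gw = current_gw * 2            # 1.0 GW, roughly one reactor's output

# Energy if that draw is sustained for a whole year, in terawatt-hours:
hours_per_year = 24 * 365
annual_twh = current_gw * hours_per_year / 1000

print(doubled_gw, round(annual_twh, 2))   # 1.0 4.38
```

About 4.4 terawatt-hours a year from a single company’s fleet, if the rumored figures hold.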
We could “Google” less and index only some of the world’s information, but heading back to ignorance doesn’t seem to be the right path to me. Instead, we need to rethink the “faster, better by throwing in more power” mentality of processor design and think around the physics of current computing and energy supply. It’s no easy feat. Perhaps we also need algorithmic innovation (Green PageRank, anyone?). Google is stepping up to the mark with its investment programs in sustainable clean energy and by building closer to energy supplies.
Collectively, chip designers, programmers, users, policy makers and academics will have to create a gestalt of contributions that run leaner, cleaner and cooler. Perhaps they should look to Google’s lead as a start.
Thanks to Josh Aller, James McBride, Saul Griffith and Aaron Huslage for their assistance.