Data centers are becoming a bigger and bigger draw on energy resources. According to a report from the Environmental Protection Agency last year, they accounted for only 1.5 percent of the energy consumed in the U.S. But an interesting story by James Niccolai of IDG suggests that figure is growing fast.
Niccolai attended a sort of utility summit to address the growing appetite for power among data centers. PG&E, which hosted the meeting, said demand for power from data centers in its region has grown to between 400 and 500 megawatts at any given moment, up from between 50 and 75 megawatts only a year and a half ago. A PG&E official noted,
“We had tremendous growth in data center capacity in the dot-com boom that never got filled. I can tell you that that capacity is now full to the gills, and they are asking us for more power.”
Utilities could build more plants to generate more power, but they, somewhat shrewdly, see that it’s cheaper to get the data centers to run more efficiently. And data centers, apparently, have traditionally been very inefficient. Solutions range from the high-tech (virtualization software) to common sense (using outside air to cool data centers).
Joe Skorupa, a Gartner analyst at the event, said end users themselves can employ other power-saving measures. He suggested getting rid of Gigabit Ethernet to the desktop, which sucks more power than lower-capacity Ethernet and often isn’t really needed by users. Ditching sophisticated “eight-line color display VoIP phones” is another good way to cut power consumption, Skorupa pointed out. The lesson: if you don’t really need it, it’s costing you money in energy use.