With data center energy costs on course to outweigh the cost of the gear inside them, tech industry executives are beginning to realize the magnitude of the problem. At the “Building the Green Data Center” panel at the Always On Conference in Palo Alto, Calif., Dave Edwards, a research analyst with Morgan Stanley, said: “Energy costs are expected to exceed the cost of equipment itself, and it’s approaching 30 to 40% of IT costs.”
Data center electricity use tops $2.7 billion annually in the U.S. and $7.3 billion worldwide, and the costs are only rising. “If you were to implement all the data center efficiencies that could be reasonably achieved by 2015, (it would save the) equivalent of the annual electricity consumption of 1.8 million homes,” said Stephen Trainor, VP of Strategy at Level 3.
The other panelists on stage — Steve Sams, IBM’s VP of Global Site and Facilities; Mike Rigodanzo, HP’s SVP of Technology Services; and Subodh Bapat, an engineer with Sun Microsystems — agreed.
“Our customers tell us that power, cooling and energy efficiency is bubbling up to the top of their priority list,” said HP’s Rigodanzo.
Panelists pointed out that while a lot of attention is being paid to making chips consume less power, those developments won’t be nearly enough. Other costs, like cooling, are escalating out of control. Network devices also stay in a state of constant alert, drawing power around the clock. Meters, sensors, and virtualization technology could make it possible to measure where power is being used inefficiently and to better manage power-hogging equipment.
Another option for a more energy-efficient data center would be to design servers rugged enough to withstand constantly high operating temperatures, so they don’t take as much energy to cool. But, as HP’s Rigodanzo pointed out to me after the panel, the cost of building such “rugged” hardware could easily end up outweighing the savings in cooling costs.