As compute demand increases, demand for power in data centers is soaring. To help IT professionals halt the spread of watt-consuming servers, the industry needs to develop software that can communicate the ways in which the various layers of the data center perform and interact. They need a binary version of Cesar Millan — a data center whisperer.

Speaking at a panel held Wednesday night in Austin, Texas, several folks from the large server shops and a distinguished engineer who runs a data center for IBM spoke about the challenges of keeping power consumption down in a world where computing demand is going up. (For a truly in-depth look at this topic, check out our GigaOM Pro report — subscription required.) The panel went beyond just power and cooling (thank goodness) to focus on how companies are increasingly viewing power consumption in the data center as a whole, rather than merely as the sum of the data center’s processors.

IBM’s Scott Winters said he cut his energy costs by 30 percent over three years while increasing his computing capacity by 50 percent and his storage by 150 percent. He did this in two primary ways: by virtualizing his data center to create a pool of shared resources that are used on demand, and by running monitoring software that tells him what’s happening on his servers.
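Winters didn’t spell out how virtualization produced those savings, but the underlying intuition is a packing problem: consolidate virtual machine workloads onto as few physical hosts as possible so the idle machines can be powered down. The Python sketch below is a minimal, hypothetical illustration of that idea; the capacities and loads are made-up numbers, not anything Winters described.

```python
# Hypothetical sketch of why virtualization saves power: packing VM loads
# onto as few hosts as possible so idle machines can be powered down.
# Capacities and loads are illustrative assumptions.

def consolidate(vm_loads, host_capacity=1.0):
    """First-fit-decreasing bin packing: returns a list of hosts,
    each a list of the VM loads placed on it."""
    hosts = []
    for load in sorted(vm_loads, reverse=True):
        for host in hosts:
            if sum(host) + load <= host_capacity:
                host.append(load)
                break
        else:
            hosts.append([load])  # no existing host fits; power one on
    return hosts

if __name__ == "__main__":
    vms = [0.5, 0.2, 0.4, 0.1, 0.3, 0.2]  # utilization as a fraction of one host
    hosts = consolidate(vms)
    print(f"{len(vms)} VMs fit on {len(hosts)} hosts:", hosts)
```

With these numbers, six virtual machines fit on two hosts, so four physical servers could sit powered off until demand returns.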

“My data center was whispering secrets, and now I have a way to understand them,” Winters said. He credited his IBM software, and the linking of that software to the physical infrastructure, with helping him reach such an understanding, especially in regard to managing power consumption. It’s a strategy that HP has embraced with its products; there are also several startups pushing data center sensor networks that allow the data center’s server hardware and its physical infrastructure, such as the chillers and air conditioners, to communicate.
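As a rough sketch of what that kind of sensor-to-server conversation might look like, the hypothetical Python below pairs each rack’s reported power draw with the inlet temperature from a facility sensor and flags racks whose cooling isn’t keeping up. The data structures and threshold are illustrative assumptions; real deployments would pull these readings from the servers’ management controllers and from the facility vendor’s (often proprietary) sensor interface.

```python
# Hypothetical sketch: correlating facility sensor readings with server
# telemetry so the two layers can "talk." Names and numbers are
# illustrative, not a real vendor API.

from dataclasses import dataclass

@dataclass
class ServerReading:
    rack: str
    watts: float          # power draw reported by the rack's servers

@dataclass
class FacilityReading:
    rack: str
    inlet_temp_c: float   # cold-aisle inlet temperature from a facility sensor

def flag_hot_racks(servers, facility, max_inlet_c=27.0):
    """Pair each rack's IT load with its inlet temperature and flag
    racks whose cooling isn't keeping up with their power draw."""
    temps = {f.rack: f.inlet_temp_c for f in facility}
    flagged = []
    for s in servers:
        temp = temps.get(s.rack)
        if temp is not None and temp > max_inlet_c:
            flagged.append((s.rack, s.watts, temp))
    return flagged

if __name__ == "__main__":
    servers = [ServerReading("A1", 4200.0), ServerReading("A2", 6100.0)]
    facility = [FacilityReading("A1", 24.5), FacilityReading("A2", 29.1)]
    for rack, watts, temp in flag_hot_racks(servers, facility):
        print(f"Rack {rack}: {watts:.0f} W at {temp:.1f} C inlet; check cooling")
```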

But as the facilities and IT infrastructure merge (the jobs of the facilities manager and the IT manager are also on a path to merge, according to members of the panel), standards are needed. The folks building the physical infrastructure typically use proprietary software in their products and sensors, and getting that sensor network to talk to your servers can require a big programming effort. Once folks can manage their physical infrastructure and their hardware, the next step is to tie the physical and hardware layers to the application layer. That’s a big dream, and we’re still far off. But given the demand for computing and the constraints on providing the power to meet that demand, it’s an issue that panels like Wednesday night’s will help solve.
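Tying the application layer to the power layer is still a dream, but one way to picture it is a power-aware admission gate: deferrable work is held back whenever the facility’s current draw leaves too little headroom under a power budget. The following Python sketch is purely illustrative; the budget, the telemetry stand-in, and the per-job estimates are all assumptions, not any vendor’s actual API.

```python
# Hypothetical sketch: a power-aware admission gate at the application layer.
# The power reading and job queue are simulated; a real system would read
# live telemetry from the facility's sensor network.

POWER_BUDGET_WATTS = 50_000.0  # assumed facility-level power budget

def current_facility_watts():
    """Stand-in for a live reading from the physical-infrastructure layer."""
    return 46_000.0

def dispatch(jobs, estimated_watts_per_job=2_000.0):
    """Start jobs only while their estimated draw fits under the budget;
    defer the rest until headroom returns."""
    headroom = POWER_BUDGET_WATTS - current_facility_watts()
    started, deferred = [], []
    for job in jobs:
        if estimated_watts_per_job <= headroom:
            headroom -= estimated_watts_per_job
            started.append(job)
        else:
            deferred.append(job)
    return started, deferred

if __name__ == "__main__":
    started, deferred = dispatch(["nightly-etl", "report-gen", "reindex"])
    print("started:", started)
    print("deferred until headroom returns:", deferred)
```

The point of the sketch is the decision, not the numbers: once the physical layer can report its draw in a standard way, applications can make exactly this kind of call.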
