Weekly Update

Measuring IT Energy, from Virtual Environment to Application

Imagine if companies could track the energy usage of a single web user uploading video content to YouTube, or the amount of energy consumed by the click of a mouse to store photos in the cloud. It might sound like a pipe dream, but Sentilla CTO Joe Polastre thinks that eventually the rapidly growing industry around data center and IT energy efficiency could provide that kind of transparency.

This morning, Sentilla unveiled new features of the company’s Energy Manager software that track performance and energy use across both virtual and dedicated server environments. Virtualization has been the main trend in data center efficiency to date, with the benefits of consolidating workloads onto fewer servers pretty self-evident. But comparing relative computing resources and energy efficiency of virtual versus dedicated servers has been a challenge. Sentilla’s new capabilities could be quite useful for companies thinking of moving to virtualization, as a tool to calculate the relative costs and benefits of doing so versus sticking with a dedicated, single-server environment.

Sentilla’s “virtual meter” technology measures the server’s activity against how much power it uses at different levels of utilization. The process yields energy use data with 95 percent or better accuracy. That style of energy tracking makes “metering” virtual machines fairly simple, Polastre said. That’s useful not only for planning optimal virtualization strategies, but also for squeezing operational efficiencies out of virtualized environments by finding and eliminating unused virtual server space and other similar tasks.
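To make the idea concrete, here is a minimal sketch of how a "virtual meter" of this kind might work in principle: model the host's power draw as a function of utilization, then apportion the resulting energy across virtual machines by their share of that utilization. All of the names, numbers, and the linear power model below are assumptions for illustration, not Sentilla's actual method.

```python
# Hypothetical "virtual meter" sketch: estimate host power from utilization,
# then split the energy across VMs by their utilization share.

def host_power_watts(utilization, idle_w=120.0, peak_w=280.0):
    """Assumed linear power model between idle and peak draw."""
    return idle_w + (peak_w - idle_w) * utilization

def per_vm_energy(vm_utils, interval_hours=1.0):
    """Return each VM's estimated energy (watt-hours) over the interval.

    vm_utils maps VM name -> fraction of host CPU used (0.0 to 1.0).
    """
    total_util = sum(vm_utils.values())
    power = host_power_watts(min(total_util, 1.0))
    energy_wh = power * interval_hours
    if total_util == 0:
        # Idle host: attribute the baseline draw evenly
        shares = {vm: 1.0 / len(vm_utils) for vm in vm_utils}
    else:
        shares = {vm: u / total_util for vm, u in vm_utils.items()}
    return {vm: energy_wh * s for vm, s in shares.items()}

print(per_vm_energy({"web": 0.30, "db": 0.20, "batch": 0.10}))
```

One design point this exposes: even an "idle" server draws substantial power, so how that baseline is attributed to VMs materially changes the per-VM numbers — which is part of why comparing virtual and dedicated environments is hard.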

Even so, there are some things Polastre wishes Sentilla’s technology could do that it can’t — yet. “You can certainly calculate what’s going on at the endpoints. What’s going on in the middle is kind of out of your control,” he explained. In other words, Sentilla’s Energy Manager can calculate computing and energy efficiency at the server and networking equipment level, but it can’t tell how much energy is being used by an individual application being run on different IT environments. That kind of “watts-per-click” capability could track how much energy is used by each piece of hardware an application uses as it processes a request sent by a desktop computer to a server in some far-away data center.

In order to eventually accomplish this, Polastre envisions something called “application dependency mapping,” which would seek to ask each application a series of questions such as what storage it hits and what network traffic flow is involved. From that stream of data could come a calculation of how much energy and IT capacity is used on some kind of “per-click” basis. “In all honesty, we’re not doing this today,” he told me. “But this is where we’re headed.”
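A "per-click" figure of the sort Polastre describes could, in principle, fall out of such a dependency map: record which components a request touched and sum an energy cost for each hop. The sketch below is purely illustrative — every component name and energy cost is invented, and nothing here reflects how Sentilla would actually implement the idea.

```python
# Hypothetical per-request energy tally from a dependency map.
# Component names and per-request joule costs are invented for illustration.

COMPONENT_JOULES = {
    "load_balancer": 0.5,
    "web_server": 2.0,
    "app_server": 5.0,
    "database": 8.0,
    "storage_array": 3.0,
}

# A request's dependency trace: (component, number of times it was hit)
request_trace = [
    ("load_balancer", 1),
    ("web_server", 1),
    ("app_server", 1),
    ("database", 3),      # e.g. three queries for one page view
    ("storage_array", 1),
]

def joules_per_click(trace, costs):
    """Sum energy across every hop in the request's dependency chain."""
    return sum(costs[name] * hits for name, hits in trace)

print(joules_per_click(request_trace, COMPONENT_JOULES))  # 34.5
```

The hard part, as Polastre notes, is not the arithmetic but getting trustworthy per-hop numbers for "what's going on in the middle" — the network and shared infrastructure between the endpoints.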

Right now companies are simply struggling to make sense of the different energy and IT equations arising from the move to virtualization. That makes the prospect of even more complex, application-specific "kilowatt-per-query" energy calculations a daunting one, said Katie Broderick, a senior research analyst at IDC.

Still, it would be useful to resolve questions about energy efficiency of different IT architecture options — one of them being cloud computing. Pike Research has predicted the move to the cloud will yield a 38-percent decrease in IT energy expenditures by 2020. But as we’ve discussed recently, cloud computing can be a net energy loser if the costs of transporting lots of data across long, energy-hungry networks are taken into account.

That fact raises the question of which uses of cloud computing are more energy efficient and which aren’t. It’s pretty easy to predict broad categories of tasks that might be less energy efficient when performed in the cloud — streaming video data from a remote data center versus running it from a DVD on a laptop is one example Pike Research cited. Getting specific information on an application-by-application basis, however, is beyond the capabilities of green IT technology today. Could new technologies like the application dependency mapping Polastre envisions yield answers to that question? And, can it be carried out at a low enough cost and high enough level of simplicity to make it worthwhile? I’m curious to hear your thoughts — feel free to contact me and share your ideas.

Related Research: How to Make Cloud Computing Greener

Question of the week

How can the IT industry move from monitoring energy usage across different computing environments to monitoring it for every application?