
Summary: Imagine if companies could track the energy consumed by a lone click of a mouse to store photos in the cloud. It might sound like a pipe dream, but it could one day be reality.

[Image: cloud compute center at Lawrence Berkeley National Laboratory]

Imagine if companies could track the energy usage of a single web user uploading video to YouTube, or the amount of energy consumed by a lone click of a mouse to store photos in the cloud. It might sound like a pipe dream, but Sentilla CTO Joe Polastre thinks the rapidly growing industry around data center and IT energy efficiency could eventually provide that kind of transparency.

That kind of bite-sized data about energy consumption and computing could really help when it comes to resolving questions about the energy efficiency of cloud computing. Researchers at the University of Melbourne have already been working to single out energy consumption per application — from processing to storage to software as a service — in an effort to shed light on whether cloud computing is more energy efficient for all applications (hint: it's not).

Polastre sees an entirely new frontier in this field, which he calls “application dependency mapping,” and which I go into in more depth in an article for GigaOM Pro (subscription required). To us, energy-per-click is an easier way to imagine it.
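
To make that concrete, here is a back-of-envelope sketch (with invented numbers, not Sentilla's method) of how an energy-per-click figure might be derived:

```python
# Back-of-envelope "energy per click" estimate. All numbers here are
# invented for illustration; they are not Sentilla's figures.

server_power_watts = 250.0   # assumed average draw of one server
interval_seconds = 3600.0    # one hour of operation
requests_served = 90_000     # assumed clicks handled in that hour

# Energy over the interval (joules = watts x seconds), spread across clicks.
energy_joules = server_power_watts * interval_seconds
joules_per_click = energy_joules / requests_served

print(f"{joules_per_click:.1f} joules per click")  # -> 10.0 joules per click
```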

Sentilla itself is working on software that, while it doesn't deliver energy-per-click, looks more deeply into tracking the energy consumption of servers that are virtualized — basically, more and more software packed onto fewer servers — as well as servers that have not been virtualized. The idea is to compare a collection of servers that have benefited from products like VMware's virtualization software to standard servers in terms of how much computing power-per-watt you're getting out of each. The software also helps form recommendations about steps to take to squeeze more computing and energy efficiency out of computing environments.
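
Here is a minimal sketch of that kind of power-per-watt comparison (hypothetical workloads and power figures, not Sentilla's software):

```python
# Compare virtualized and standard servers on computing work delivered
# per watt. The workload and power figures below are assumptions.

def work_per_watt(work_units: float, avg_power_watts: float) -> float:
    """Computing work delivered per watt of average power draw."""
    return work_units / avg_power_watts

# Same consolidated workload: 10 standard servers vs. 3 virtualized hosts.
standard = work_per_watt(work_units=1_000.0, avg_power_watts=10 * 300.0)
virtualized = work_per_watt(work_units=1_000.0, avg_power_watts=3 * 350.0)

print(f"standard:    {standard:.2f} units/W")    # 0.33 units/W
print(f"virtualized: {virtualized:.2f} units/W") # 0.95 units/W
```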

Virtualization has been the main trend in data center efficiency to date, and the benefits of consolidating workloads onto fewer servers are pretty self-evident. But a way to drill into those efficiency benefits would help all the industries involved. According to IDC, deployment of virtual machines outpaced deployment of physical servers last year, and virtualized servers now make up some 20 percent of all physical servers and account for 60 percent of IT workloads.

All of this work on energy consumption is really about cost. Sentilla's tool could be quite useful in helping customers calculate the relative costs and economic benefits of going virtual versus sticking with standard servers. Sentilla says it already has customers using its new software to manage freshly virtualized servers, with results showing an 18 percent drop in overall IT costs — not bad, considering the gains presumed to have already come from moving to a condensed, virtualized environment.

Right now, Sentilla’s new software can track relative energy use at the general IT level, but it can’t yet measure per-application energy use across environments. Polastre thinks, though, that the ability to ask each application a series of energy-related questions is someday on its way.

More transparency will be crucial to cutting — or just maintaining — the energy consumption of the Internet. Pike Research has predicted that the trend of companies moving services to the cloud will yield a 38 percent decrease in IT energy expenditures by 2020. But as we’ve covered recently, cloud computing can be a net energy loser once the cost of transporting large amounts of data across long, energy-hungry networks is taken into account.
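
A toy comparison (every number invented for illustration) shows how network transport can flip the sign of those savings:

```python
# Toy model of cloud vs. local energy for one task. Every figure below
# is an assumption made up for illustration.

JOULES_PER_GB_TRANSPORT = 50_000.0  # assumed end-to-end network cost per GB
LOCAL_TASK_JOULES = 80_000.0        # assumed energy to run the task locally
CLOUD_TASK_JOULES = 30_000.0        # assumed energy in an efficient data center

def net_cloud_savings(data_gb: float) -> float:
    """Energy saved (J) by moving the task to the cloud, net of transport."""
    compute_savings = LOCAL_TASK_JOULES - CLOUD_TASK_JOULES
    return compute_savings - JOULES_PER_GB_TRANSPORT * data_gb

print(f"{net_cloud_savings(0.5):+,.0f} J")  # +25,000 J: cloud comes out ahead
print(f"{net_cloud_savings(2.0):+,.0f} J")  # -50,000 J: the network eats the gains
```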

One thing to remember, too: at the end of the day, the costs involved in developing and implementing this kind of per-click energy tracking might not pencil out. As Polastre explained to me, energy savings from virtualization are outweighed by the additional savings from the avoided costs of managing servers, maintaining software licenses and the like. Energy mainly matters when it comes to data center capacity constraints, he said — cutting wasted watts lets data centers expand IT capacity within their existing walls and avoid building or buying new data center space.

For more research on green data centers, check out GigaOM Pro (subscription required).

Image courtesy of Lawrence Berkeley National Laboratory via Creative Commons license.


  1. I have found one web page that provides an estimate of the energy used per user to access the site. The site is about an experimental Marvell ARM-based rack server project.

    At the bottom of the page it shows the energy usage, i.e. “This page took 24.69 joules of energy to render (zen4 : 9.89 secs : 4 sessions : 0.83 amps)”

  2. That’s cool – can you post the link?

    1. http://www.linux-arm.org/Main/LinuxArmOrg

      It seems the site is down, so I have emailed Nick.Stevenson@ARM.COM, the site admin.

    2. Also check out work done at Microsoft on the same topic

      http://research.microsoft.com/en-us/projects/joulemeter/default.aspx

      Joulemeter estimates the energy usage of a VM, computer, or software by measuring the hardware resources (CPU, disk, memory, screen etc) being used and converting the resource usage to actual power usage based on automatically learned realistic power models.
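
      A minimal sketch of that kind of resource-based power model (invented coefficients, not Joulemeter's actual code):

      ```python
      # Linear power model driven by resource counters, in the spirit of
      # what Joulemeter describes. All coefficients are invented.

      IDLE_WATTS = 60.0   # assumed baseline draw of the machine
      CPU_COEFF = 1.2     # assumed watts per percent CPU utilization
      DISK_COEFF = 0.004  # assumed watts per disk I/O per second

      def estimated_power(cpu_util_pct: float, disk_iops: float) -> float:
          """Estimate instantaneous power draw from resource counters."""
          return IDLE_WATTS + CPU_COEFF * cpu_util_pct + DISK_COEFF * disk_iops

      print(f"{estimated_power(cpu_util_pct=40.0, disk_iops=500.0):.1f} W")  # 110.0 W
      ```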

  3. IT Comes First For Data Center Efficiency: Cleantech News and Analysis « Tuesday, February 22, 2011

    [...] innovations as solid state drives and ARM chips for servers, cloud-based storage and physical asset management in virtualized environments will be held alongside discussions on the facilities side, like ASHRAE’s latest cooling guidance [...]
