Microsoft’s energy management tool, Hohm, which launched this week, is a clear play to help consumers save energy. Log into the Hohm web site, enter your ZIP code and other details about your residence, and the service predicts your home energy use (or links to your historical energy use via your utility) and suggests ways to curb it.
But Hohm is also Microsoft’s first consumer-facing web service hosted entirely on Azure, the company’s cloud computing platform — and Azure boasts some cutting-edge energy-saving features of its own. Even if Hohm never manages to convince consumers to cut their energy consumption, the way it’s hosted could represent the future of more energy-efficient computing.
We’ll forgive you if you thought Azure was just an outdated color in a Crayola box. Microsoft announced the cloud computing platform more than six months ago, and while details remain scarce, it’s clear the platform is aimed at companies that want to deploy large web services and host them in a cloud computing model. But Craig Mundie, Microsoft’s chief research and strategy officer, explained to us in an interview this week that Azure is expected to be more efficient than standard web hosting and offer better power utilization, partly because the cloud takes advantage of on-demand scalable computing, growing and shrinking the amount of computing (and thus power) applied to a particular task. In addition, the servers will feature efficient hardware designs and make better use of power management software, Mundie said.
In the past, Microsoft would have dedicated a couple of data centers to house Hohm, said Mundie. In that model, even in the middle of the night the web service would grind away in case a visitor showed up, using more energy than needed. But with Azure’s setup, the servers behave more like a car engine that can deactivate individual cylinders. “When they go uphill, we’ll turn them on, and when we’re coasting downhill, we’ll turn them off,” said Mundie.
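The cylinder analogy boils down to simple capacity math. The sketch below is purely illustrative — it is not Azure’s actual scaling logic, and the function name, per-server capacity, and traffic figures are all assumptions — but it shows the basic idea of powering on only as many servers as the current load requires.

```python
import math

def servers_needed(requests_per_sec: float,
                   capacity_per_server: float = 100.0,
                   minimum: int = 1) -> int:
    """Return how many servers to keep powered on for the current load.

    Assumed numbers: each server handles ~100 requests/sec, and at least
    one server stays on so the service can answer a stray visitor.
    """
    return max(minimum, math.ceil(requests_per_sec / capacity_per_server))

# Overnight trickle: "coasting downhill," so nearly everything is off.
assert servers_needed(5) == 1
# Daytime peak: "going uphill," so more servers spin up.
assert servers_needed(950) == 10
```

In the old dedicated-data-center model, the equivalent function would return a constant fleet size no matter the hour, which is exactly the waste Mundie describes.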
It might seem like an obvious way to build a system. But right now, servers in general aren’t very “energy proportional” — their power draw doesn’t scale down with light use, so a barely loaded server still consumes a large share of its peak power. As Google suggested recently in its mini-book about data centers (GigaOM Pro subscription required), if servers were a little more like humans, which have evolved to use energy only when needed, they would consume significantly less of it. Google, like Microsoft, is thinking about ways to design servers and their power management software so that they can mimic this type of dynamic power consumption.
The trend of turning to cloud computing as a more efficient way to use computing power will only grow, too. At the Structure 2009 conference, held by our sister site GigaOM, Paul Sagan, CEO of content delivery network Akamai, said that cloud computing is a much more efficient and green way to compute, a sentiment echoed by executives from web companies, telecom firms and Internet infrastructure makers.
However the networked computing world is tweaked, it ultimately needs to focus on using as little energy as possible. According to the Environmental Protection Agency, data centers currently consume 1.5 percent of total U.S. electricity, and that figure is poised to more than double by 2011. In a broadband-connected world in which our media, commerce, communications and work lives are increasingly hosted in the cloud, energy conservation in IT will become extremely important. And who knows? Hohm’s legacy as the first external web service hosted on the energy-efficient Azure platform could stretch beyond its practical intention of helping reduce global energy consumption.
This article also appeared on BusinessWeek.com.