Another study out this week has found that companies that adopt cloud computing can reduce the energy consumption of their IT and save money on energy bills. The report, created by research firm Verdantix and sponsored by AT&T (s T), estimates that cloud computing could shave $12.3 billion off companies' energy bills. That translates into carbon emission savings of 85.7 million metric tons per year by 2020.
The Verdantix report isn’t the first one to deliver such a finding. Last year Pike Research found that cloud computing could lead to a 38 percent reduction in worldwide data center energy use by 2020, compared to what the growth of data center energy consumption would be without cloud computing. Another study from Microsoft (s MSFT), Accenture and WSP Environment and Energy last year found that moving business applications to the cloud could cut the associated per-user carbon footprint by 30 percent for large, already-efficient companies and as much as 90 percent for the smallest and least efficient businesses.
All of that is good news. Cloud computing is one of the most disruptive Internet infrastructure shifts of recent years. Web companies have embraced it as a way to buy flexible, lower-cost, on-demand computing power from companies like Amazon (s AMZN), and these cloud services generally replace computing that would otherwise have been done on companies' own in-house resources.
However, it’s always good to take these studies with a grain of salt. There’s a reason AT&T and Microsoft are looking into the energy efficiency of cloud computing: they sell cloud computing services.
Other studies have also found that cloud computing isn't always the most energy-efficient computing option, and in certain instances the cloud can be more energy intensive than traditional in-office computing. A report from University of Melbourne researcher Rod Tucker and his team, which I wrote about for GigaOM Pro (subscription required), found that cloud computing can indeed save energy when it simply consolidates servers. But across three different applications of cloud computing (storage, software and processing), the energy efficiency savings are negated in some scenarios.
For example, one instance where the cloud isn't more efficient, according to Tucker's research, is data storage. Tucker found that once stored files are downloaded and accessed frequently enough (more than one download per hour for a public cloud storage service), those energy efficiency gains are erased.
There's enough research out there by now showing that cloud computing is, overall, more energy efficient than traditional in-house computing. That's great news for Internet companies and cloud computing providers. The energy consumption of the Internet, data centers and our always-on connected devices will only continue to grow, so efficiency will only become more important.
Image courtesy of The Planet.