Data center energy use has become a hot topic this year. While Derrick Harris and I have bandied about our thoughts on the need for clean power, energy efficiency is still top of mind for most data center operators. There are plenty of opportunities to cut data center energy consumption: cooling systems, low-power servers, efficient power generation and distribution, uninterruptible power supplies, solid-state storage… The list is extensive — and also grounded in IT hardware and machinery that keeps computing facilities up and running.
But a new aspect of data center energy use is getting increasing attention lately: software. If the code running on all the servers in a data center were inherently more energy-efficient and governed by its own energy-aware logic, IT managers could have another, less costly option to consider for reducing data center energy use: fewer servers. But how do developers code for energy efficiency when the impulse is to build programs that are ever bigger, better and faster?
Tools at Long Last
Companies like Intel have been promoting energy-efficient coding techniques during the past few years. The problem is that efforts to draw attention to the issue often take the form of research findings and best practices tucked deep in the developer areas of web sites (like Intel’s own) instead of sharing the spotlight with the latest and greatest tech. But that’s changing, now that new tools are arriving on the scene. For instance, Google collaborated with researchers at the University of Michigan and Northwestern University last year to develop PowerTutor, which allows Google phone software developers to “see the impact of design changes on power efficiency.”
Now, Intel is working to bring its insights out of white papers and into practical use by providing developers with visibility into the energy intensiveness of software applications with its Energy Checker SDK. According to the company, the SDK “provides the tools to measure both software productivity and hardware platform energy consumption to facilitate reporting energy efficiency metrics.” Those metrics take the form of counters — software gauges, so to speak — that can be used to correlate how much “useful work” is performed with a set amount of energy, and vice versa. It’s early days yet, but so far, Intel software engineer Jamel Tayeb has tinkered with Energy Checker to measure the energy consumed by a program via a command line, and even to improve the accuracy of the SDK’s own energy measurements.
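To make the counter idea concrete, here is a minimal Python sketch of how such a correlation works in principle. The names and numbers are illustrative assumptions, not Energy Checker’s actual API: an application increments a “useful work” counter, an energy meter increments a joules counter, and the two are combined into an efficiency metric.

```python
# Hypothetical sketch of the counter idea behind tools like Energy Checker:
# the application counts "useful work" units, a (simulated) meter counts
# joules, and the two are correlated into an efficiency metric.
# All names here are illustrative, not the SDK's real API.

class Counter:
    """A simple monotonically increasing gauge."""
    def __init__(self, name, unit):
        self.name, self.unit, self.value = name, unit, 0.0

    def add(self, amount):
        self.value += amount

work = Counter("emails_processed", "messages")   # application-side counter
energy = Counter("platform_energy", "joules")    # meter-side counter

# Simulate a workload: each batch does some work and draws some energy.
for batch in range(10):
    work.add(500)        # 500 messages handled in this batch
    energy.add(120.0)    # 120 J drawn while handling them

efficiency = work.value / energy.value   # useful work per joule
print(f"{efficiency:.2f} {work.unit} per joule")   # → 4.17 messages per joule
```

The point of the exercise is that “useful work” is defined by the application itself (messages handled, pages served), which is what lets the same energy figure be judged against business output rather than raw CPU time.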
Over at Microsoft Research, a team is readying the release of an energy profiling tool called Joulemeter that can be used to generate an energy report encompassing an entire computer system, a specific software application and even a virtual machine. It’s a capability that can have a profound impact on how firms focus their software development efforts.
Programming is a balancing act, explains Jie Liu of the Joulemeter team at Microsoft Research. “Software developers constantly make decisions during software development to optimize for speed, memory size, storage, etc.,” he states. For developers, tools like Joulemeter and Intel’s Energy Checker open the door to new levels of code optimization. Efficiency will no longer be measured only by an application’s toll on the processor or its memory footprint. Soon, coders will be able to take energy consumption into account and even assign it a monetary value (as in the case of Intel’s SDK).
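Assigning a monetary value to a measured energy figure is simple arithmetic once the joules are known. The sketch below shows the conversion; the electricity rate is an assumed example, not a quoted price.

```python
# Illustrative arithmetic only: turning a measured energy figure into a
# dollar cost, as a tool built on energy counters might report it.
# The rate below is an assumed example, not a quoted price.

JOULES_PER_KWH = 3_600_000    # 1 kWh = 3.6 million joules
RATE_USD_PER_KWH = 0.10       # assumed electricity rate

def energy_cost_usd(joules: float) -> float:
    """Convert an energy measurement in joules to an electricity cost."""
    return joules / JOULES_PER_KWH * RATE_USD_PER_KWH

# A server-side job measured at 7.2 MJ (2 kWh) over its run:
print(f"${energy_cost_usd(7_200_000):.2f}")   # → $0.20
```

Trivial for one job, but multiplied across every run of every application in a data center, this is the number that turns an engineering metric into a line item a CFO can read.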
Energy Measurement in Dollars
If you read between the lines of Intel’s Energy Checker intro page, the business opportunities for data center operators and software developers might be too good to pass up. The company hits all the right notes in appealing to software makers and ISVs that cater to the corporate IT crowd. Correlating business productivity and energy consumption is no longer a mystery with Energy Checker.
Similarly, Microsoft’s Liu extols the budget-friendly aspects of energy-aware software methodologies. He argues that “a software-based solution is very attractive both because it can be low-cost and because it can provide user-friendly granularities of energy consumption measures.” Low-cost and ease of use? Yes, please.
Both software offerings can help data center operators tackle the issue of overprovisioning: devoting more IT resources to a task than it actually needs. In short, businesses are often paying for way too much hardware and related energy and cooling costs for the amount of work performed by the software they run. With an accurate portrait of how much energy is required to run software within acceptable performance levels, data center operators can do a better job of “right sizing” their server infrastructures. This can include server consolidation strategies like employing virtualization, which allows servers to pull double (or triple, or more) duty. As a bonus, IT shops can free up some precious watts and delay or outright prevent costly new data center expansions or builds.
Facebook made waves recently when it introduced HipHop, a homegrown PHP “source code transformer” (don’t call it a compiler) that dramatically improves speed. A fast and responsive web site is critical to web-based companies, but there’s another benefit. “With HipHop, we’ve reduced the CPU usage on our web servers on average by about 50 percent, depending on the page,” says Haiping Zhao, a Facebook engineer. Multiply that by hundreds or thousands of servers, and the energy-saving opportunities are abundantly clear. The capital expenditure benefits are equally hard to ignore. As Zhao so succinctly adds, “Less CPU means fewer servers, which means less overhead.”
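The back-of-the-envelope math behind “less CPU means fewer servers” is worth spelling out. The numbers below are illustrative assumptions, not Facebook’s figures, but they show how a 50 percent CPU reduction flows through to server count and power draw.

```python
import math

# Back-of-the-envelope consolidation math for a HipHop-style 50% CPU
# reduction. All figures are illustrative assumptions, not Facebook's.

def servers_needed(total_cpu_demand: float, per_server_capacity: float) -> int:
    """Round up: you can't provision a fraction of a server."""
    return math.ceil(total_cpu_demand / per_server_capacity)

demand = 1000.0    # aggregate CPU units the site needs today
capacity = 10.0    # CPU units one server supplies

before = servers_needed(demand, capacity)         # 100 servers
after = servers_needed(demand * 0.5, capacity)    # 50 servers after a 50% cut

watts_per_server = 300    # assumed average draw per server
saved_kw = (before - after) * watts_per_server / 1000
print(before, after, f"{saved_kw} kW saved")   # → 100 50 15.0 kW saved
```

The rounding matters at small scale, but at hundreds or thousands of servers the halved CPU demand translates almost directly into halved hardware, energy and cooling costs.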
Impact on the Software Business
With the visibility provided by tools like Joulemeter and Intel’s SDK, energy-efficient coding techniques become a little more attainable for the development community at large. What is today a feature may soon become a core piece of company decision making when it comes to software tools deployed inside the data center. Here’s how to get ahead of this trend:
- All software can potentially benefit from energy-assessment tools, but automated IT management tools are most likely to be impacted in the near term. Developers targeting this market should be educating themselves about how to balance performance and energy efficiency. Energy Checker, for instance, makes it possible to generate energy statistics, which in turn can be used to help inform automated data center workload management tools. In such a scenario, IT shops can optimize not only for performance but also for energy savings.
- Big energy savings and lower infrastructure costs may help tilt purchasing decisions or long-term licensing contracts in a green software maker’s favor. For software makers, the arrival of software energy metrics can help them carve out a competitive advantage. Apart from software’s feature set and capabilities, another big factor influencing IT purchasing decisions is hardware requirements. Lower those requirements, and the energy costs to fulfill them, and it can prove a compelling motivator that sets a software firm’s product apart.
- Developers who not only can write cleaner, leaner code, but also understand other, more advanced efficient coding strategies will have a competitive advantage in the job market. With the prospect of improved performance and lower energy costs, companies may soon be able to add energy efficiency to their internal development teams’ priorities or weigh a feature’s utility against the energy costs required to support it. Developers who are prepared to deliver on these demands will be versed in software energy assessment tools and will be able to account for the energy impact of their coding techniques, strategies and optimizations. Expect this tactic to begin spreading from a few code optimization die-hards to the developer community at large in the coming years.
- Data center operators can expect more IT flexibility in the coming years. The advent of energy-aware workload management can help data center operators maximize their data processing capabilities, even under tight power constraints. Fed by real-time energy metrics and backed with historical data, IT shops will be able to weigh the energy cost of specific workloads, which, depending on business priorities and performance goals, could result in money- and energy-saving workload re-prioritizations.
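As a rough illustration of what energy-aware workload management looks like in code, the sketch below packs workloads onto as few servers as possible so the rest can be powered down. It is a minimal first-fit consolidation pass under assumed inputs; real schedulers also weigh performance goals, historical data and business priorities.

```python
# A minimal sketch of energy-aware workload placement: pack workloads onto
# as few servers as possible (first-fit) so idle servers can be powered
# down. Real schedulers weigh performance goals too; this shows only the
# energy-driven consolidation step.

def first_fit(workloads, server_capacity):
    """Assign each workload to the first server with room, opening a new
    server only when none fits. Returns the per-server load totals."""
    servers = []
    for load in sorted(workloads, reverse=True):   # biggest first packs tighter
        for i, used in enumerate(servers):
            if used + load <= server_capacity:
                servers[i] += load
                break
        else:
            servers.append(load)   # no existing server fits: power one on
    return servers

loads = [0.6, 0.3, 0.5, 0.2, 0.4]   # CPU fractions, illustrative only
placement = first_fit(loads, server_capacity=1.0)
print(len(placement), "servers active instead of", len(loads))
```

Here five workloads consolidate onto two fully utilized servers; the other three machines can be idled or powered off, which is exactly the watt-freeing outcome the bullet points above describe.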