This article originally appeared in the cleantech section of GigaOM Pro, our premium research subscription service (subscription required).
If one looked back at the last sixty years and told a story about the electrical grid, it would be a tale of monumental growth: electricity use in the U.S. is about 13 times what it was in the 1950s, and Americans now consume about 20 percent of all energy worldwide.
But in the last decade that growth has slowed to just under one percent per year. The story of the next sixty years will be less about demand growth and more about creating a stable grid in a world of increasing amounts of renewable energy.
The issue with renewable energy integration comes down to the fact that wind and solar generation is intermittent: you can't simply ramp a generator up or down as needed. As a result, utilities are increasingly looking to their customers to help balance the grid through programs like demand response and frequency regulation, which pay customers to use less or more power at the precise time that operators need to adjust demand.
The opportunity for data centers
It's this scenario that has some people wondering if data centers might have a role to play in helping utilities maintain grid stability. It often surprises people to learn that data centers are built with about 20 percent extra capacity and, on top of that, can have utilization rates as low as 10 percent. They are designed to absorb spikes in usage, like when Michael Jackson died and sites from Twitter to MSNBC struggled to stay available.
Much is written about how companies like Apple and Facebook have gone to states like North Carolina and Oregon, respectively, to access cheap, abundant power for their data center buildouts. And it's precisely because data centers are such power hogs that they make attractive targets to shed and take power as needed to help balance the grid. Facebook's Oregon data center has a capacity of 28 megawatts out of a total regional grid capacity of 720 megawatts, making it the largest commercial user of power in the region.
When Power Assure launched its data center energy management software company in 2007, it always had the long-term vision of saving its customers money, not just through better visualization and automation of their server hardware, but through generating revenue via smart grid incentive programs. It reasoned that if data center operators could analyze their computing loads and energy consumption, they could find the flexibility to take and shed power loads as needed.
“We always had it in our roadmap that we would move into the smart grid, but it was unclear how that would take effect. The market had not really gelled,” Power Assure’s CEO Brad Wurtz told me.
Five years and $30 million in equity raised later, Wurtz feels the market has matured to the point where data centers can begin to feel comfortable with the proposition of assisting the grid, both because data centers have better visibility into their own utilization patterns and because the smart grid has improved communication systems between utilities and their customers. Power Assure says it has ongoing trials with customers participating in load shifting and automated demand response.
Talking dollars and cents
Wurtz and I ran through the numbers for PJM, the regional transmission organization (RTO) for 13 states from New Jersey to Illinois. PJM is tasked with coordinating the electrical grid for over 60 million people as well as integrating new sources of renewable energy.
In 2011, for example, PJM was willing to pay $116,339 per megawatt for load shedding, but an even higher rate, $173,074 per megawatt, to be available to deliver load shifting on an as-needed basis. With all demand response, the less notice a customer receives, the greater the compensation, because what RTOs need is a quick response to shifts in demand.
For a 30 megawatt data center that was well managed with 20 percent excess capacity, generating a few million dollars through grid incentives becomes pretty doable. Wurtz noted that one new client has built into its data center the ability to adjust its power consumption every 15 minutes to be able to optimize its energy pricing.
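The back-of-envelope arithmetic behind that claim can be sketched as follows. This is my own rough illustration using the 2011 PJM rates quoted above; the 20 percent flexibility assumption and the function names are mine, not Power Assure's, and real program payments depend on contract terms and performance.

```python
# Rough estimate of annual grid-incentive revenue for a data center,
# using 2011 PJM rates quoted above. Illustrative only.

LOAD_SHED_RATE = 116_339    # $/MW per year, PJM 2011, load shedding
LOAD_SHIFT_RATE = 173_074   # $/MW per year, PJM 2011, on-call load shifting


def flexible_capacity_mw(total_mw: float, excess_fraction: float = 0.20) -> float:
    """Capacity a well-managed facility could offer back to the grid,
    assuming its excess headroom is fully available (an optimistic case)."""
    return total_mw * excess_fraction


def annual_revenue(flex_mw: float, rate_per_mw: float) -> float:
    """Yearly payment for keeping flex_mw of load available at a given rate."""
    return flex_mw * rate_per_mw


flex = flexible_capacity_mw(30)                  # 30 MW facility -> 6.0 MW flexible
shift_revenue = annual_revenue(flex, LOAD_SHIFT_RATE)
print(f"${shift_revenue:,.0f} per year")         # prints "$1,038,444 per year"
```

At the load-shifting rate alone, 6 megawatts of flexibility is worth roughly a million dollars a year; stacking programs, or offering more capacity during low-utilization periods, is what pushes the figure toward the "few million" Wurtz describes.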
Demand response is an easier and less well compensated way for data centers to earn money because it typically gives utility customers at least a day's notice that they will need to shed load. But some are looking further ahead, toward frequency regulation, the second-by-second balancing of the grid to maintain 60 Hz, which is better compensated because of how quickly data centers would need to respond.
The question is always: is it worth the risk? The fear at the heart of every data center manager is downtime. No one gets fired for spending too much money on hardware and power consumption. People do get fired if a site goes down, even if the reason is that servers had to be shed to meet the needs of the grid.
But, in the end, risk exists on a spectrum: there's downtime risk for every data center, no matter how prepared it is, as the Michael Jackson episode showed. And the better every data center knows its utilization patterns, both historically and second by second, the easier it becomes to identify opportunities to help out the grid and make some cash in the process.
Image courtesy of Google.