A lot of the interest in the smart grid industry tends to swirl around consumers and home energy management. It can be fun to picture consumers one day buying gadgets at big box retailers that will let them micromanage their home heating, cooling and lighting, and it’s an especially attractive venture option for investors and entrepreneurs who cut their teeth on Internet and telecom startups. But utility execs say they are looking very hard at other types of software platforms, tools and applications that can automatically manage functions much deeper in the power grid, using distributed computing and as little human interaction as possible. That presents a largely untapped opportunity for startups and incumbent players alike.
“The amount of data that is coming in with the smart grid will overwhelm us,” said Andy Tang, PG&E’s senior director of the Smart Energy Web, who spoke at our Smart Grid Bunker Event last week. Warren Weiss, managing director at Foundation Capital, said at the event that the smart grid will create 3,000 times more data on a daily basis that utilities will have to manage. In an article last October, analyst Jack Danahy estimated that if 140 million smart meters, which update energy info every 15 minutes, are installed over the next 10 years in the U.S., they could produce a massive 100 petabytes of data. Just 1 PB (or 1 quadrillion bytes of information) is equivalent to the amount of data contained in 20 million four-drawer filing cabinets filled with text.
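As a rough sanity check of those numbers, the estimate can be worked backward in a few lines. The meter count, reading interval and timeframe come straight from the figures above; the bytes-per-reading result is simply what the 100 PB claim implies, not a number from the article:

```python
# Back-of-envelope check of the smart meter data estimate.
# Inputs are the article's figures: 140M meters, 15-minute readings, 10 years.
METERS = 140_000_000
READINGS_PER_DAY = 24 * 60 // 15   # 96 interval readings per meter per day
DAYS = 365 * 10

total_readings = METERS * READINGS_PER_DAY * DAYS
print(f"Readings over a decade: {total_readings:,}")

# Work backward from the 100 PB estimate to the implied record size.
PETABYTE = 10**15
implied_bytes = 100 * PETABYTE / total_readings
print(f"Implied size per reading: ~{implied_bytes:.0f} bytes")
```

Roughly 49 trillion readings in all, which pencils out to about 2 KB per reading once metadata, backups and derived data are presumably folded in.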
That amount of data means that “eventually the power grid will move to a distributed computing model,” said Tang, one where automated decisions are made at the edge of the network but with some sort of supervisory control layer. A distributed computing model is quite different from the centralized mainframe model and point-to-point system that utilities largely have in place today, Tang explained. Or, as Jeff Taft, Accenture’s smart grid chief architect, put it to Earth2Tech back in October: “most utilities are still living in the days of batch processing.” Ultimately the process of delivering a fully automated “super-grid” will take a good decade, Tang said, so expect to see a variety of middle steps where different layers of human interaction are still maintained for some time.
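The edge-plus-supervisor model Tang describes can be sketched in miniature. This is purely illustrative — the article names no protocols or APIs, and the under-frequency load-shedding scenario and all thresholds below are invented for the example:

```python
# Illustrative sketch of "automated decisions at the edge, with a
# supervisory control layer." All names and numbers are hypothetical.

class EdgeController:
    """A device on the grid edge that decides locally and instantly,
    rather than waiting on a central system."""
    def __init__(self, shed_threshold_hz=59.95):
        # Policy knob the supervisory layer is allowed to tune.
        self.shed_threshold_hz = shed_threshold_hz

    def on_frequency_reading(self, grid_hz):
        # Under-frequency means demand exceeds supply: shed load now
        # and report afterward, instead of phoning home for permission.
        if grid_hz < self.shed_threshold_hz:
            return "shed_load"
        return "normal"

class Supervisor:
    """The supervisory layer: adjusts policy, doesn't micromanage."""
    def retune(self, controller, new_threshold_hz):
        controller.shed_threshold_hz = new_threshold_hz

ctrl = EdgeController()
action = ctrl.on_frequency_reading(59.90)   # decided locally: "shed_load"
Supervisor().retune(ctrl, 59.80)            # supervisory policy change
action_after = ctrl.on_frequency_reading(59.90)   # now "normal"
```

The point of the pattern is latency and autonomy: the edge device never blocks on the center, and the center steers behavior by changing policy rather than issuing commands — the opposite of the batch-processing model Taft describes.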
From the perspective of the IT world, a human-less network seems like an old idea. Distributed computing at the edge of broadband networks makes sure that networked services are rendered speedily to the end user — web users have little tolerance when it comes to choppy web video or glitchy online voice services like Skype. And in the Internet world, distributed computing is taking on new shapes, potentially providing a lens for what utilities’ computing services will look like decades from now. Foundation’s Weiss said at our event last week that he even sees utilities one day creating their own private computing clouds that will work as the backbone for their smart grids.
In the shorter term, this switch to a distributed computing model for utilities will create an opportunity ripe for innovation for entrepreneurs, investors and existing IT firms. Utilities are actively looking for applications and a software layer that can help a utility manage all the new data — from renewable energy sources, from electric cars, from demand response programs — and drive devices at the edges of the network, said Tang. “I’ve seen a lot of PowerPoint presentations on this idea, but not a lot of products,” Tang said.
From a venture capital perspective, Weiss said he is looking at the next generation of customer information management systems, outage management systems, billing systems — including the prepaid cellular-style model — and data analytics that can mine usage data and better predict energy consumption patterns. Those types of innovations could come from existing smart grid software firms like eMeter or Ecologic Analytics, IT firms that are focused on the smart grid like IBM and Accenture, incumbent meter makers like Itron, or new startups that have just emerged.
Smart thermostat software maker EcoFactor is an example of one of these new types of startups, having built a service on the ideas of automation, distributed computing and dynamically managing devices with smart algorithms. While EcoFactor bills itself as a consumer-focused service — automatically managing consumers’ connected thermostats and shaving heating and cooling loads without affecting the consumer’s comfort — its service will no doubt be incredibly valuable from a utility perspective. As John Steinberg, CEO of EcoFactor and a participant in last week’s event, put it, connected consumer thermostats can act as a form of distributed grid load management, and not surprisingly EcoFactor’s first customer is Texas utility Oncor, which is utilizing the startup’s technology for a demand response program.
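The grid-level idea here — small automated adjustments at many endpoints adding up to meaningful load shedding — can be sketched in a few lines. EcoFactor’s actual algorithms are proprietary, so everything below, from the function name to the 3-degree offset, is an invented illustration:

```python
# Hypothetical sketch of thermostat-based demand response.
# Not EcoFactor's algorithm; all names and numbers are invented.

def demand_response_setpoint(normal_setpoint_f, event_active, offset_f=3.0):
    """During a demand response event, nudge the cooling setpoint up a
    few degrees; the small per-home sacrifice is barely felt, but across
    thousands of homes it becomes a large grid-level load reduction."""
    if event_active:
        return normal_setpoint_f + offset_f
    return normal_setpoint_f

# A fleet of connected thermostats acting as distributed load management:
home_setpoints = [72.0, 74.0, 70.0]
during_event = [demand_response_setpoint(sp, event_active=True)
                for sp in home_setpoints]
# Each home gives up a little cooling; the utility sheds load in aggregate.
```

The consumer never touches anything — which is exactly the machine-to-machine point Steinberg makes below.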
EcoFactor’s service can run over the Internet, bypassing the smart meter — a benefit, as most utility AMI networks don’t currently provide enough data to run that type of service. The product is also an example of how — even at the consumer level — the smart grid could move toward a machine-to-machine communications model. As Steinberg pointed out, he spent about a month taking notes on his solar PV system’s performance in a spreadsheet and cross-checking it against the meter — but after that month, he just lost interest. Steinberg says he’s the rule, not the exception; the majority of consumers don’t need to — or even want to — interact with their energy consumption.
Of course, some consumers may want to be involved in their home energy management, but the size of that market remains to be seen. What is becoming increasingly clear is that the architecture of the smart grid will one day resemble the guts of broadband networks even more closely than it does today. From a utility’s perspective, that could cause a good deal of trepidation and worry over its new role and how to embrace this new technology. PG&E is one of the more progressive utilities, and it operates in a market with incentives that promote energy efficiency. In many regions of the U.S., it will be an uphill battle to get utilities to embrace the distributed computing IT model. From an entrepreneur’s perspective, however, disruptive change is always a good thing. As Weiss put it: “It is time for utilities to come out of the darkness and into the light.”