Why the Smart Grid Needs to Ditch Its Dated Architecture, Now

The term “batch processing” was coined back in the 1950s, in the days of mainframe computers: A computer operator would feed a batch of punch cards into the computer, which would process the information during a scheduled window and, hopefully, deliver the needed results back the next morning. Compare that to today, when most computer processing is done through real-time and “event-driven” processing — data is fed in and the computer spits each bit of information back out within seconds. The broadband networks connected to our computers have been built out around this idea of real-time computing.
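The difference between the two styles can be sketched in a few lines of Python. Everything here is hypothetical — the readings, function names and values are just an illustration of batch vs. event-driven handling of the same data, not anyone's actual system:

```python
from datetime import datetime

# Hypothetical smart-meter readings: (timestamp, kWh) pairs taken every 15 minutes.
readings = [(datetime(2010, 1, 1, 0, 15 * i % 60), 0.4 + 0.01 * i) for i in range(6)]

def batch_process(all_readings):
    """Batch style: wait until the whole set has arrived, then total it."""
    return sum(kwh for _, kwh in all_readings)

def event_driven_update(reading, running_total):
    """Event-driven style: update state the moment each reading lands."""
    _, kwh = reading
    return running_total + kwh

# Event-driven: the total is current after every single reading.
total = 0.0
for r in readings:
    total = event_driven_update(r, total)

# Both styles end at the same answer; the difference is when you can see it.
assert abs(total - batch_process(readings)) < 1e-9
```

The point of the contrast: both approaches produce the same final number, but the batch version tells you nothing until the whole batch arrives, while the event-driven version has an up-to-date answer at every step.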

But when it comes to the power grid, and the future smart grid, most utilities are still living in the days of batch processing, says Jeff Taft, Accenture’s smart grid chief architect (Accenture sells services and software that can bridge the gap between these two systems — more on that below). While batch processing might be a good fit for some smart grid applications — such as standard monthly billing and data-heavy, non-time-sensitive processes — a smart grid based on batch processing would mean a power grid that’s a lot less intelligent than many of our modern communication networks and computing systems, and one that could stifle innovation in the power industry.

For example, many utilities do not plan to build networks that will support real-time energy consumption feedback for customers. I touched on that subject earlier this year in a post entitled “Why the Smart Grid Won’t Have the Innovations of the Internet Any Time Soon.” After interviews with a handful of utilities, it was clear to me that while most utilities are building networks that can collect data from a smart meter every 15 minutes, they’re planning to bring that data to the utility processing center only every 24 hours. That’s pretty much the idea behind old-skool batch processing.
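The cadence the utilities described works out to a surprising amount of stale data. A back-of-the-envelope sketch, using only the two figures from the interviews (15-minute reads, 24-hour uploads) — the variable names are mine:

```python
READ_INTERVAL_MIN = 15   # meters record a reading every 15 minutes
UPLOAD_INTERVAL_HR = 24  # but the data reaches the utility only once a day

# How many readings pile up at each meter between uploads.
reads_per_batch = (UPLOAD_INTERVAL_HR * 60) // READ_INTERVAL_MIN  # 96 readings

# Worst case, a reading taken just after an upload sits for a full cycle
# before the utility (or the customer) can see it.
worst_case_delay_hr = UPLOAD_INTERVAL_HR  # 24 hours
```

In other words, each meter accumulates 96 readings per day that nobody can act on until the nightly batch lands — which is exactly the punch-card pattern, just with radios instead of card decks.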

As I noted in my previous article, non-real-time processing of consumer energy data could be less effective at changing consumer energy consumption behavior, since the consequences of different choices — running an appliance at peak vs. off-peak hours, for example — won’t be apparent until a day later. Without real-time processing, utilities could miss out on the innovations of real-time-centric applications (see my comparison to GPS networks, which needed real-time data for the killer app — turn-by-turn driving directions — to emerge). And ultimately, batch-oriented systems could be less intelligent, less able to react quickly to specific events, and less advanced than what current computing technology can offer.

There are two main reasons why utilities have built, and continue to maintain, this type of architecture. First, processing information in batches, over less robust networks, keeps a utility’s costs down. Second, utility systems just haven’t needed this type of intelligence in the past, so it’s still unclear where it would and would not make sense for a utility to make the extra investment in real-time processing. But the fact is that most utilities are setting up their future smart grid networks this way, meaning the technology could be around for a long time to come — utility infrastructure lasts for 30-50 years in some cases.

Fortunately, new software upgrades and services could allow this old-style infrastructure to process data closer to real time. In particular, Accenture has been paying attention to this problem, and as Taft explained to us, the consulting and services company, which has about 300 utility clients, offers a product called INDE (short for Intelligent Network Data Enterprise), which delivers software, database services and a system to bridge the gap between batch processing and real-time processing for utilities.

David Haak, Accenture’s global lead for smart grid assets and services strategy, described INDE as “the central nervous system” for a smart grid project. Haak and Taft say that Xcel Energy has been using its INDE system for its Boulder Smart Grid City project, and has been able to process energy information closer to real time across the network. Before it installed INDE, Xcel Energy wasn’t able to accept the individual inputs from a vast number of devices, from meters to in-home energy devices, explained Taft.

While Accenture’s INDE won’t solve the problem of the batch-processing mentality at utilities, it will help with the transition. In the end, it’s all about finding the most efficient way to process a whole lot of information about energy, and building systems that can quickly react to, and interact with, the millions of points on the network (that’s us). To make sure real-time processing plays a much bigger part in the smart grid in the near term, utilities need to increasingly listen to companies like Accenture that have strong roots in the information technology industry and can share a lot of lessons learned from computing and the architecture of the Internet.
