Southern California Edison has installed Teradata data warehouse appliances to store and serve data from customers’ smart grid meters, Teradata said in a press release this morning. According to information provided to me by a Teradata representative, the deployment includes two Teradata Active Enterprise Data Warehouse systems — one for production and one for disaster recovery — each capable of storing 50TB before compression. It’s probably not the last such deal we’ll hear of in the coming months and years, as smart grid operators must do something to store and analyze the huge streams of data flowing in from customer endpoints.
According to a December report from Pike Research, the market for smart-grid data analytics will reach $4.2 billion by 2015, which means lots of business for companies like Teradata that can help power companies store, manage and analyze that data. On top of the 100TB of combined capacity, for example, Southern California Edison is also leveraging Teradata’s Utilities Logical Data Model, which lets users analyze data from a variety of sources, including data that hasn’t made its way into the data warehouse, or never will. Beyond Teradata, vendors such as IBM, Oracle, EMC and SAP all have products to handle both storage and analytics, as do a variety of utility-industry-specific vendors. As will be discussed at April’s Green:Net conference, the smart grid is fast becoming a reality, which means operators are looking for technology solutions.
Aside from data warehouses, some energy companies are supplementing their smart grid data strategies by utilizing Hadoop. Unlike data warehouses, which generally house structured (or relational) data that is critical to a company for one reason or another, Hadoop is designed to inexpensively store and process large amounts of unstructured data from devices, sensors and other non-traditional sources. That’s a natural fit for smart grid operators, which might want to analyze these types of data without taking up precious data warehouse space. In fact, many data warehouse vendors, including Teradata, have technology partnerships with Cloudera to enable bi-directional data access and analysis between data warehouses and Hadoop clusters.
Southern California Edison isn’t using Hadoop, but others are. As my colleague Katie Fehrenbacher reported in 2009, the Tennessee Valley Authority is using Hadoop to store and process “data about the reliability of electricity on the power grid using phasor measurement unit (PMU) devices.” Additionally, General Electric is currently seeking to hire a “computational intelligence researcher” with Hadoop experience to help “enable new business and product opportunities in diverse areas such as Smart Grid, finance, and services.”
The smart grid is just one reason for the advent of the big data era, though, which has companies of all types trying new approaches to gain insights from their constantly growing volumes of data. In fact, many of Teradata’s smaller competitors — including Netezza, Greenplum, Sybase and Vertica — have been acquired lately in the name of capitalizing on customers’ data woes, and Hadoop is getting attention from a wide variety of industries outside its web roots. We’ll talk about all this and more at our upcoming Structure Big Data conference March 23 in New York City.
Image courtesy of Flickr user Beige Alert.