The Internet of Things brings with it a massive amount of data as device proliferation extends far beyond our smartphones to our homes, cars and entire cities. For now, the use of that data is confined largely to the application that generated it. For example, the Nest learning thermostat aggregates its data, runs analytics and makes decisions within its own siloed system.
But in the future all of that data is unlikely to remain siloed, if only for the most basic reason that many players see gold in data the way those in the 19th century saw gold in oil. While today data remains largely available only to the system collecting it, that's unlikely to always be the case. To that end, startups like Terbine have raised initial funding to become intermediaries connecting buyers and sellers of sensor data worldwide. The vision brings to mind a market where the more valuable the data, the more costly it is; essentially, let a market determine the value of data.
There are, of course, many barriers to this sort of idea. Cisco released a paper in 2012 called “Unlocking Value in the Fragmented World of Big Data Analytics: How Information Infomediaries Will Create a New Data Ecosystem.” From the paper:
“As things stand now, the data ecosystem is highly fragmented. Between those who create data and those who could potentially extract value from it sits a labyrinth fraught with complexity, disparity, and miscommunication. If analytics are to be the new “refinery,” some of that fragmentation will need to be addressed with greater connectivity, trust, and efficiency.”
Connecting data sellers with buyers is a significant hurdle. There are primary issues related to the volume of unstructured versus structured data. Additionally, common formats for encoding data, which would let buyers and sellers exchange it in an easily readable form, have yet to emerge. Simple tagging of data with information like location of origin or time of day will have to be made uniform. And privacy concerns are likely to mount as companies begin to sell data generated from sensors either owned by consumers or deployed in public spaces.
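To make the tagging problem concrete, here is a minimal sketch in Python of what a uniformly tagged sensor record might look like. The field names (sensor_id, location, timestamp, and so on) are purely illustrative assumptions, not drawn from any actual industry standard or from Terbine's platform:

```python
import json
from datetime import datetime, timezone

def make_sensor_record(sensor_id, value, unit, lat, lon):
    """Wrap a raw sensor reading in a uniformly tagged record.

    A hypothetical schema: every reading carries its origin
    (sensor id and location) and a UTC timestamp, so any buyer
    can interpret it without knowing the seller's internal format.
    """
    return {
        "sensor_id": sensor_id,
        "value": value,
        "unit": unit,
        "location": {"lat": lat, "lon": lon},
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

# Example: a thermostat reading tagged with place and time of origin.
record = make_sensor_record("thermostat-42", 21.5, "celsius", 36.17, -115.14)
print(json.dumps(record, indent=2))
```

Until buyers and sellers agree on something like this, every exchange of sensor data requires bespoke translation between formats.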
In practical terms, the market will also favor intermediaries or data buyers that find a way to leverage the data in real time. This will require continued declines in processor costs and robust high-speed broadband to allow for the quick transfer of data. If the idea that milliseconds matter in data and decision-making sounds like a bit much, one doesn't have to look further than Wall Street and electronic trading to see that companies will pay to get data faster and to have the analytics and processing power in place to automate decisions.
These examples may sound nefarious, even borderline invasive of privacy, but the near-term applications are likely to go beyond using data and analytics for simple profit. Municipalities will be able to integrate sensor data on traffic flow to reduce jams. Data from wearables could be crunched by doctors and hospitals to prevent adverse outcomes or to adjust medication. Globally, sensor data should provide insights into tracking patterns in climate change or infectious disease. On the business end, the initial gains are likely to come in operational efficiencies as inventory is tracked and customer behavior is better understood.
Still, integrating outside data, meaning data not generated from internal company systems, is the next evolutionary step. CIOs I've met with have told me they are excited about this possibility: seeing whether outside data could be leveraged to improve outcomes. What's needed is a simple, easy place to even visualize the breadth of data available, never mind put it into a common format and deliver it in real time. This is a huge task, but as Terbine founder David Knight told Gigaom recently, "What I really like is being involved with things people say can't be done." Never say never.