Summary:

To take full advantage of big data, businesses must think about how to use those mountains of data as they come into the network, not store them and hope to gather insights weeks or even months later. To do this, we need new tools.

Most enterprises think they know how promising their data is. The truth is, they don’t realize just how much value is hidden in the massive amounts of data they sit on — even as more data rolls on in. And because of this, the best insights – the ones that can be harnessed for transformative change – are at risk of getting buried in today’s data avalanche.

Analyze your data in the now.

Back when I was working on my PhD, I worked in the lab of a major telecommunications operator. My job was to run algorithms on sensor-generated data to identify valuable trends and clues about network performance.

The best part of the day was when the FedEx truck arrived and I could get my hands on boxes of storage drives holding mountains of months-old network data generated by those sensors, just waiting to be analyzed. Talk about timely insights being dead on arrival!

My employer had no idea what insights lay buried inside those drives. And yet collecting, storing and sending the data out for analysis was the only option they had at the time. At that point I realized the model for data analysis had to change on a fundamental level, especially if data was going to continue its exponential growth curve. Businesses needed to analyze data as the avalanche roared in, and it was going to take some sturdy tools to do it.

The smartphones and tablets we rely on today contain a wealth of information about us: our preferences, our habits, our behavior. And phones are just one kind of connected machine. There are also cars, for example, which now come equipped with an array of sensors to gauge everything from driving styles to road conditions and wear-and-tear, all in the interest of making driving safer and more enjoyable. Meanwhile, cities are deploying wireless sensors in stoplights for improved traffic surveillance. In disaster-prone regions, bridges and buildings can even evaluate their own stress points.

This phenomenon gives us an extraordinary opportunity, one that no civilization has had before: to know the now. If businesses act fast enough, they can distill that knowledge into timely, intelligent, data-driven insights for more agile operational and business processes. For example, an auto collision warning that pops up three weeks after the crash itself is useless. It’s the immediacy of insight, translated just as quickly into action, that safeguards us against disaster.

We need tools for real-time analysis

So now that the ability to gather such immediate data from a variety of devices and places exists, it’s imperative we put it to work to our advantage. How can businesses parse data in a timely manner to identify trends, glean new insights into customer behavior, and respond immediately to changing market dynamics or customer habits? How can we best take divergent sources of data and dynamically fuse them together so people, machines and processes make optimal responses at any given moment in time?

To save this data from a premature death, and to catapult it into a driving force for a data-driven global economy, both the enterprise and the analytics architecture must rise to the occasion. Enterprises need a new approach to analytics in which contextually aware applications are based on specific use cases, built on a new data processing stack and backed by a new economic model.
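
To make that fusion idea concrete, here is a minimal sketch in Python. It is not any particular vendor’s implementation; the two data sources, their field names (vehicles, rain_mm) and the alert rule are all invented for illustration. The point is simply that once records from siloed sources are merged on a shared time key, a single rule can react to the combined context.

```python
import itertools
from collections import defaultdict

# Two hypothetical, normally siloed sources, keyed by a coarse
# time bucket (minute of day). All fields are illustrative.
traffic = [
    {"minute": 0, "vehicles": 42},
    {"minute": 1, "vehicles": 57},
    {"minute": 2, "vehicles": 130},
]
weather = [
    {"minute": 0, "rain_mm": 0.0},
    {"minute": 1, "rain_mm": 1.2},
    {"minute": 2, "rain_mm": 6.5},
]

def fuse(*streams):
    """Merge records that share a time bucket into one contextual record."""
    fused = defaultdict(dict)
    for record in itertools.chain(*streams):
        fused[record["minute"]].update(record)
    return [fused[minute] for minute in sorted(fused)]

for snapshot in fuse(traffic, weather):
    # A fused record lets one rule react to combined context:
    # heavy traffic during heavy rain might warrant a rerouting alert.
    if snapshot["vehicles"] > 100 and snapshot["rain_mm"] > 5.0:
        print(f"minute {snapshot['minute']}: congestion in the rain, alert issued")
```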

As it stands today, big data analytics technology comprises many disparate toolsets and technologies. What’s missing is a foundational architecture to support all these individual tools and technologies: a complete, holistic stack that can help organizations get from data ingestion to data decisions in one fell swoop. This new architecture must recognize that a sensor-rich world creates data continuously, and that in order to take immediate action, the analysis must also be done continuously, rather than after the fact, once the data is stored away. This new architecture must also combine a variety of data sources instead of keeping them in silos. And it must elastically scale to the petabytes of structured and unstructured data that are now generated on a nonstop basis.
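
What continuous analysis looks like in code can be sketched in a few lines, again purely as an assumption-laden illustration rather than a real architecture: the stream below is a random-number stand-in for a live sensor feed, and the window size and threshold are arbitrary. The essential difference from the store-then-analyze model is that each reading is scored the moment it arrives, against only a small rolling window of recent history, so an anomaly triggers action immediately instead of months later.

```python
import random
import statistics
from collections import deque

def sensor_stream(n=500):
    """Stand-in for a live feed. In production this would be a consumer
    on a message bus, not a random-number generator."""
    for _ in range(n):
        spike = 30.0 if random.random() < 0.01 else 0.0  # rare fault
        yield random.gauss(50.0, 2.0) + spike

WINDOW = 60       # keep only the most recent readings in memory
THRESHOLD = 4.0   # flag readings this many standard deviations out

window = deque(maxlen=WINDOW)  # old readings fall off automatically
for reading in sensor_stream():
    if len(window) >= 30:  # wait for enough history to be meaningful
        mean = statistics.fmean(window)
        stdev = statistics.pstdev(window)
        if stdev > 0 and abs(reading - mean) / stdev > THRESHOLD:
            print(f"anomaly now: {reading:.1f} (window mean {mean:.1f})")
    window.append(reading)
```

The bounded window (the deque’s maxlen) is what lets this loop run forever against a nonstop feed; preserving that property across petabytes and many machines is exactly what the full-scale stack described above has to do.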

Equally important is the need for a new economic model for data processing. Today, enterprise customers spend tens of millions of dollars upfront on data projects, the majority of which goes toward capturing and storing the data. They must then wait a year or more to start seeing value from their data assets.

Our data-rich world therefore needs a new paradigm where enterprises first spend on analytics — not storage — with an agile, iterative approach that proves out the value of a particular idea in the first days and weeks of deployment. Once proven, this use case is swiftly rolled out as an application that any business manager can use to make decisions. This business value-led approach to big data can then be scaled across other functional areas of the business and power data-driven decision-making across the enterprise.

Once enterprises embrace this new approach, big data’s vast potential will no longer be crushed by its own weight. If data is at risk of being lost in the avalanche, our analytics platforms should serve as first responders to the emergency.

Anukool Lakhina is the CEO of Guavus.

Comments

  1. Add to that the need for everyone to be online, adding to and using the data pool in real time. As things currently stand the infrastructure can’t support it, with many condemned to dial-up or satellite or ADSL with asymmetric speeds often less than a megabit per second. We can see the future, but it isn’t here yet. With mobile and optic fibre we can bring it on?

  2. Anukool, good insight! With the explosion of big data, companies are faced with data challenges in three different areas. First, you know the type of results you want from your data but they are computationally difficult to obtain. Second, you know the questions to ask but struggle with the answers and need to do data mining to help find them. And third is the area of data exploration, where you need to reveal the unknowns and look through the data for patterns and hidden relationships. The open source HPCC Systems big data processing platform can help companies with these challenges by making it quick and simple to derive insights from massive data sets. Designed by data scientists, it is a complete integrated solution from data ingestion and data processing to data delivery. More info at http://hpccsystems.com

  3. The first question has to be about what the end goal is for analysing the data. Too many people talk about blindly piping data into “analysis” systems to see what might come out, but I’m skeptical that works. There has to be an end goal so you know not only what you’re looking for in the end, but also what data to collect and store in the first place.

  4. David,

    Truer words have never been spoken. We hear too many pundits talk about the technology, yet they never define the vague goals of “deep analysis” and “data insights.” The most important aspect of data is its VALUE. If it has no value, more of it doesn’t help. Businesses need to start by asking questions, and let the questions guide the investment in infrastructure or analysis – IF it is needed.

