
Summary:

Ron Bodkin of Think Big Analytics discusses the best and worst practices for adopting big data technologies and actually getting results. Companies must beware of dangerous decisions, charlatans and disastrous missteps.

Big data has reached a critical stage. The market is poised to grow to more than $50 billion by 2017, but more than 55 percent of big data projects fail.

With so much opportunity coupled with hype and misinformation, we are in the midst of the big data Wild West. A standoff is coming between those who understand what big data is, the ones investing to collect, store and harvest it, and those who are buying snake oil and don't understand how big data can impact their business.

The good

Big data allows you to fail. This may sound counterintuitive, but when it comes to big data, there’s good failure and bad failure.

Confused? Here’s an explanation: Big data, in its raw form, allows for a “test and learn” approach. Companies have to create many small “failures” by developing hypotheses and testing them against the data. This allows enterprises to develop a truly coherent strategic approach grounded in data.

These “failures,” part of the process of producing sound, unbiased analysis, create tremendous opportunities for companies in a number of areas: customer recommendations, risk measurement, device failure prediction and streamlined logistics, to name a few.
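
To make the “test and learn” loop concrete, here is a minimal Python sketch of one such cheap, controlled “failure”: checking a single hypothesis (say, that a new recommendation widget lifts conversion) against observed counts. The counts and the 0.05 threshold are illustrative assumptions, not figures from this article.

```python
# A minimal sketch of one "test and learn" iteration: check whether a
# hypothetical change (e.g., a new recommendation widget) lifts conversion.
# All counts and the 0.05 threshold are illustrative assumptions.
from scipy.stats import norm

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - norm.cdf(abs(z)))
    return p_b - p_a, p_value

lift, p = two_proportion_z_test(conv_a=480, n_a=10_000,   # control group
                                conv_b=535, n_b=10_000)   # variant group
if p < 0.05:
    print(f"Ship it: lift of {lift:.2%} (p = {p:.3f})")
else:
    print(f"A cheap, useful failure: no reliable lift (p = {p:.3f})")
```

Either outcome is useful: a significant lift ships, and a null result is a small, inexpensive failure that sharpens the next hypothesis.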

In some of the best cases, businesses are developing new products from their insights, creating new revenue streams and even transforming their culture to be data-driven. To reach this stage of evolution, businesses must attack scalability and cost containment, develop agile analytics and insights, and optimize their operations with automated predictive analytics at scale. Only then can they move to the final stage of the process, where big data capabilities transform the business and drive new products built on the data.

Speed is the key to success in the early stages of big data implementation. The faster you can complete projects and build organizational expertise in using data in this new way, the sooner you can create value and move to a more sophisticated stage of adoption.

Quantcast is a perfect example of this four-stage transformation. It began as a free audience-measurement service that directly measured website traffic. As it gained traction and grew from thousands to billions of events per day, Quantcast quickly needed to scale beyond traditional database technologies. As the business expanded, it became essential to provide higher-quality insights and analytics to add value and deepen customer relationships beyond audience measurement alone.
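
Some back-of-envelope arithmetic shows why that scale breaks a single traditional database server. The event volume and record size below are assumptions chosen only for illustration; the article says only “billions of events per day.”

```python
# Back-of-envelope arithmetic for "billions of events per day."
# Both numbers below are assumptions chosen only for illustration.
events_per_day = 2_000_000_000      # 2 billion events
bytes_per_event = 200               # a small logged record

writes_per_sec = events_per_day / (24 * 60 * 60)
tb_per_day = events_per_day * bytes_per_event / 1e12

print(f"~{writes_per_sec:,.0f} sustained writes per second")  # ~23,148
print(f"~{tb_per_day:.1f} TB of new data per day")            # ~0.4 TB
```

Sustained five-figure write rates and well over a hundred terabytes of new data a year are the kind of load that pushed webscale companies toward distributed systems.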

Quantcast quickly saw the need to invest in data science to address complex challenges in counting unique people and identifying demographics and interests among a sea of anonymous activity. Subsequently, Quantcast used this environment to test a number of products and offerings, including a highly successful scalable advertising solution, called Lookalikes, that allows advertisers to target new consumers who are highly similar to their already-engaged ones. Quantcast is now earning more than $100 million in annual revenue, and its dexterity allowed the company to react as big data opportunities developed.
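
Quantcast’s Lookalikes implementation is proprietary, so the following is only a generic Python sketch of the underlying idea: represent users as behavioral feature vectors and score prospects by their similarity to an advertiser’s already-engaged customers. The features and numbers are invented.

```python
# Generic sketch of lookalike modeling (not Quantcast's actual method):
# score unconverted users by how similar their behavior vectors are to
# an advertiser's converters.
import numpy as np

def lookalike_scores(converters, candidates):
    """Cosine similarity of each candidate to the mean converter profile."""
    seed = converters.mean(axis=0)                  # "centroid" profile
    seed = seed / np.linalg.norm(seed)
    norms = np.linalg.norm(candidates, axis=1)
    return candidates @ seed / np.where(norms == 0, 1, norms)

# Toy feature vectors: rows are users, columns are behavioral signals
# (e.g., site-category visit counts). All values are made up.
converters = np.array([[5., 0., 2.], [4., 1., 3.]])
candidates = np.array([[4., 0., 2.], [0., 6., 0.], [1., 1., 1.]])

scores = lookalike_scores(converters, candidates)
best_first = np.argsort(-scores)    # target the most similar users first
print(scores.round(3), best_first)
```

Production systems would use far richer features and models, but the principle is the same: rank the anonymous audience by resemblance to known converters.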

Quantcast, Google, Facebook and LinkedIn are pioneers in big data and have gone through these stages of evolution. As big data moves beyond the webscale pioneers, other enterprises are beginning to embrace it to create value and build competency. For example, large IT suppliers are blending detailed technical product data with business data to optimize their operations with predictive analytics, enabling capabilities such as proactive servicing of customer devices.
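
As a hypothetical illustration of that proactive-servicing idea, the sketch below trains a simple classifier on synthetic device telemetry to flag units likely to fail. The feature names, data and model choice are all assumptions; nothing here comes from the suppliers alluded to above.

```python
# Hypothetical sketch of "proactive servicing": flag devices likely to fail
# from telemetry. Features, data and model choice are invented assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1_000
# Columns: temperature (C), errors logged last week, device age (months)
X = np.column_stack([
    rng.normal(60, 10, n),
    rng.poisson(2, n).astype(float),
    rng.uniform(0, 48, n),
])
# Synthetic labels: hotter, error-prone, older devices fail more often
risk = 0.05 * (X[:, 0] - 60) + 0.6 * X[:, 1] + 0.04 * X[:, 2] - 2.5
y = (rng.random(n) < 1 / (1 + np.exp(-risk))).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)

# Score two hypothetical devices; service the high-risk one before it fails
new_devices = np.array([[85.0, 9.0, 40.0], [55.0, 0.0, 6.0]])
print(model.predict_proba(new_devices)[:, 1])  # estimated failure probability
```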


Ron Bodkin at Structure: Data 2013. Source: Albert Chau itsmebert.com

The bad

Unfortunately, many enterprises are still in kick-the-tires mode, exploring big data only from a cost-containment or storage-scalability point of view. Some may be looking at “agile analytics,” the ability to work flexibly with data, unconstrained by traditional limits on developer resources and database capacity.

But this means that many, if not most, companies exploring big data are missing opportunities to improve their business and provide better service to customers. They are also potentially missing out on the chance to develop new products based on data rather than just intuition. They are reaching big data plateaus — achieving the ability to store data, but not extracting additional value.

Big data requires an investment of people and resources — the human layer to make sense of all the technology. Traditional methods of cost savings often require staff reductions. With big data, the opposite is true. In order to achieve big data breakthroughs, companies need to invest in technology and people. Larger enterprises that are unable to move quickly and make these investments will allow smaller, nimbler competitors to gain a competitive edge.

The ugly

It doesn’t get any uglier than a majority failure rate — that aforementioned 55 percent. Why the failure? One thought is that in the Wild West of data, there are hustlers and charlatans who promise the world but don’t produce results. These vendors realize there’s a lot of hype around big data and behave accordingly. Many legacy consultants and systems integrators have positioned themselves as experts despite their lack of qualifications.

Likewise, many established product vendors are marketing last-generation environments as “big data.” Many of these vendors run their workloads on a single computer with SAS models; this is not big data. Others focus on data munging and reporting, or on extracting, transforming and loading data against small relational databases. These vendors often sell pre-built proprietary software that doesn’t leverage open source standards such as Apache Hadoop.
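
For contrast, here is what leveraging Hadoop can look like at its simplest: a Hadoop Streaming job, sketched in Python, that counts events per user across arbitrarily many log files in parallel. The tab-separated log format is an assumption for illustration.

```python
# job.py: a minimal Hadoop Streaming job (hypothetical log format) that
# counts events per user. Hadoop runs the mapper over input splits in
# parallel, sorts by key, then streams grouped lines into the reducer.
import sys

def mapper():
    # Assumed input line: "timestamp<TAB>user_id<TAB>event"
    for line in sys.stdin:
        fields = line.rstrip("\n").split("\t")
        if len(fields) == 3:
            print(f"{fields[1]}\t1")

def reducer():
    # Input arrives sorted by key, so counts can be accumulated in a stream
    current, total = None, 0
    for line in sys.stdin:
        user, count = line.rstrip("\n").split("\t")
        if user != current and current is not None:
            print(f"{current}\t{total}")
            total = 0
        current = user
        total += int(count)
    if current is not None:
        print(f"{current}\t{total}")

if __name__ == "__main__":
    mapper() if sys.argv[1] == "map" else reducer()
```

A job like this would typically be submitted with the hadoop-streaming jar, passing “python job.py map” as the -mapper and “python job.py reduce” as the -reducer. The point is that the same script scales from one log file to a whole cluster, which a single-machine SAS setup cannot do.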

We are at a crucial point in the trajectory of big data, one that demands consistent results to ensure continued growth. When companies are fooled into thinking inadequate technologies or techniques are big data, their business will suffer as their analytics fall short of what is necessary. If their projects fail or produce misinformation, these enterprises will lose ground to competitors who do understand how to use big data technologies properly.

Ron Bodkin is co-founder and CEO of Think Big Analytics. He will be speaking Thursday, Sept. 19, at our Structure: Europe conference in London.

Feature image courtesy of Shutterstock user Antonio Gravante.

  1. Like it or hate it, big data is the future. We are in the very early stages; once we start getting insights from big data, it will be fun.

    http://fakevalley.com/breakthrough-coffee-and-sugar-are-bought-together/

    1. Immanuel Nwachukwu Tuesday, September 17, 2013

      Coffee and sugar are bought together… so what? Haven’t we already ‘guessed’ this correctly for ages? How are researchers running terabytes of data proclaiming this as a breakthrough?

    2. Immanuel Nwachukwu Tuesday, September 17, 2013

      Never mind… I think that was a joke.

  2. Reblogged this on SoLoMo Law and commented:
    Great thoughts about the promise of Big Data. Just be sure you know where your data comes from, and how others are using the data they are getting from you. Do right by your users and they will do right by you.

  3. Hey Ron, nicely written. We all know that big data is the future! Thanks for sharing your thoughts :-)

  4. Big data will rule the world in the twenty-first century and is destined to be as important as money. Any laxity in ensuring its purity and keeping it noise-free may snowball into a decision-making process prone to corruption and a lack of integrity.

  5. Ron,
    You highlight some great points on the realities of large data analytics projects. There is huge potential for value, but getting to it requires real effort and thought. We are actually hosting an event next Monday very much along these lines. We will have leaders from data analytics companies sharing real world case studies – the good, the bad, and the ugly – from customer implementations. Please feel free to join if you are interested: http://dataanalyticsb2bforum.eventbrite.com/

    -Matt

  6. I agree with most of this. It would seem that using big data, for many companies, should and perhaps must involve the old-fashioned approach known as the “scientific method.”

    For business. Sure man, though I must admit I ripped this info off from the tricksters over at Wikipedia:

    Empirical evidence (also empirical data, sense experience, empirical knowledge, or the a posteriori)… and that math I never learned, the algorithm.

