
Summary:

All sorts of firms today are storing data in the belief that they will be able to leverage it in powerful ways. But if they don’t know what they’re doing, their data lakes can turn into landfills.

Raymie Stata, Bhaskar Ghosh (LinkedIn), Adam Fuchs (Sqrrl Data) and Ron Bodkin (Think Big Analytics) at Structure:Europe 2013
Photo: Anna Gordon/GigaOM

We hear about “big data” everywhere these days and, for companies, it’s assumed to be a good thing: more knowledge, more analytics, more scale. But, in reality, big data pools can become an expensive, incoherent mess that no one is sure what to do with.

These “data landfills” are a problem, especially for companies that are dipping their toes into big data for the first time. At GigaOM’s Structure:Europe event on Thursday, four data infrastructure experts described the problem and how to avoid it.

Raymie Stata, a former Yahoo CTO who is now CEO of Altiscale, explained that there is a lack of good metadata tools for Hadoop that can identify data lakes and flag which ones are useful to various parts of a business.

Stata said that too often it’s only a firm’s operations team that sees the raw data, depriving product teams of a chance to see rich signals they could use to respond to customers. Meanwhile, the data continues to pile up, creating costs, and leaving firms to ask questions like:

“Where did this come from? What was it derived from? What’s the retention policy? Who can get rid of it?”
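To make those questions concrete, here is a minimal sketch in Python of the kind of per-dataset metadata record such tooling would need to keep. The field names and defaults are purely illustrative assumptions, not the schema of any real Hadoop catalog product:

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical metadata record for one data set landing in Hadoop.
# It tries to answer the questions above: where the data came from,
# what it was derived from, who owns it, and how long to keep it.
@dataclass
class DatasetRecord:
    path: str                                          # where the data lives, e.g. an HDFS path
    source: str                                        # where it came from
    derived_from: list = field(default_factory=list)   # upstream data sets it was built from
    owner: str = "unknown"                             # who can answer questions about it (or delete it)
    retention_days: int = 365                          # retention policy
    created: date = date.today()

    def is_expired(self) -> bool:
        """True once the data set has outlived its retention policy."""
        return (date.today() - self.created).days > self.retention_days


# Example: a clickstream extract a product team could discover and reuse
# instead of leaving it to pile up unseen in the operations team's cluster.
clicks = DatasetRecord(
    path="/data/raw/clickstream/2013-10-17",
    source="web front-end logs",
    derived_from=["/data/raw/access-logs/2013-10-17"],
    owner="ops-team",
    retention_days=90,
)
```

Even a catalog this simple would let a firm answer the provenance and retention questions above, and purge expired data instead of paying to store it indefinitely.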

According to Adam Fuchs, the CTO at Sqrrl Data, the problem stems in part from the fact that most big data tools are built for the developer community, not for the customer-facing parts of a business. This means that sales and other units in a firm don’t have a practical way to understand the data their company controls, or how to turn it into business opportunities.

The good news is that the tide may be turning. Ron Bodkin, the CEO of Think Big Analytics, said the siloed view of data is waning:

“The traditional approach of analytics and warehousing — those days are numbered,” said Bodkin, adding that it’s not acceptable anymore for developers to keep data in a “glass house” and only hand it over once it’s been perfectly curated.

The better approach is to treat data in a more agile manner, and create teams that consist of both an IT person and a business executive. Together they can scan the data for opportunities that will yield a quick and concrete success — like producing a sale, or discovering and purging a malicious botnet. This type of success can, in turn, change the conversation about data and inspire more teams to start brainstorming.

Bodkin added that, historically, IT departments haven’t regarded themselves as obligated to add value to a firm, but that today they should be as committed to experimenting and finding revenue opportunities as everyone else.

Some companies, especially those where big data is a core competency, already know this. One example is LinkedIn:

“Within the LinkedIn ecosystem, we have done an innovative thing for salespeople and marketing,” said Bhaskar Ghosh, LinkedIn’s Senior Director of Engineering and Data Infrastructure, explaining that the company’s engineers supply “interest graphs” that make it easier for the sales force to identify and sell to customers.
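LinkedIn’s actual interest-graph infrastructure was not described on the panel, but the idea can be sketched as a simple graph of members and the topics they engage with, which a sales tool can query to rank prospects. The names and data below are entirely made up for illustration:

```python
from collections import defaultdict

# Toy "interest graph": edges connect members to topics they engage with.
# Purely illustrative; not LinkedIn's real data model.
interest_edges = [
    ("alice", "hadoop"),
    ("alice", "data-warehousing"),
    ("bob", "hadoop"),
    ("carol", "mobile-advertising"),
]

topics_by_member = defaultdict(set)
for member, topic in interest_edges:
    topics_by_member[member].add(topic)

def prospects_for(product_topics, graph):
    """Rank members by how many of the product's topics they engage with."""
    scores = {m: len(ts & product_topics) for m, ts in graph.items()}
    return sorted((m for m, s in scores.items() if s), key=scores.get, reverse=True)

# A salesperson pitching a Hadoop-related product would surface alice, then bob.
print(prospects_for({"hadoop", "data-warehousing"}, topics_by_member))
```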

The panel was moderated by Tim Moreton, the CTO of Acunu, who, like Bodkin, stressed the importance of getting “quick wins” with data as a way to ensure that data lakes don’t turn into landfills.

Check out the rest of our Structure:Europe 2013 coverage here.
