By now, so much has been said about Hadoop that the discussion itself could probably qualify as a data set for petabyte-scale analysis. And if you analyzed the sentiment, you’d see a very clear “good news, bad news” story emerge. The platform brings tremendous value in handling massive and diverse amounts of data, but its complexity makes Hadoop challenging for all but the most seasoned (and scarce) technologists and data scientists.
For the subset of companies whose core operations involve data engineering, this may not be a problem. But the rest of the corporate world has been stuck on the outside looking in, envious of the benefits but spooked by the intricacy. Worse yet, some firms have suffered through poorly understood, do-it-yourself Hadoop implementations that fail to leverage the investment and can even produce unreliable results.
Teradata knows these challenges well. That’s why, in his keynote address at the Hadoop Summit, Scott Gnau, president of Teradata Labs, announced the release of Teradata Portfolio for Hadoop, an offering designed to take the complexity and cost out of deploying and managing Hadoop. Teradata Portfolio for Hadoop is a single source for all things Hadoop, removing the engineering and operations guesswork for non-programmers, business analysts, and business consumers looking to extract value and competitive advantage from their data across a range of commercial environments.
With this news, which Scott explains further on his blog, “main street” organizations now have easy, intuitive, and flexible options for using Hadoop.