As Hadoop moves towards establishing itself as a key data-management platform for the enterprise, there is a new set of challenges it must meet to be a true contender in the field. Virtually all research and forecasts point towards huge market growth in the big data domain, and the Hadoop ecosystem is now considered the foundation for that growth.
However, as key Hadoop vendors point out, it normally takes about a decade for a new market to consolidate and mature, and Hadoop is no exception. It has been around since 2005, and one area of this rapidly growing ecosystem that remains lackluster is security.
The problem stems from Hadoop’s origins: it is a platform designed and developed primarily for data specialists. Hadoop works best when used by small, cohesive teams of experts working in isolated environments on data sets explicitly assigned to them and under their control.
But now that Hadoop is increasingly serving as a backbone for enterprise data management, those implicit assumptions no longer hold. In fact, this very realization has driven the evolution of Hadoop from its initial design, based on the MapReduce batch file-processing paradigm, to its more recent incarnation, which broadens its scope to accommodate other data-processing approaches.