In a new Forrester report titled “2011 Top 10 IaaS Cloud Predictions For I&O Leaders,” authors James Staten and Lauren E. Nelson highlight widespread convergence of big data products and cloud computing. Specifically, the authors advise infrastructure and operations (I&O) leaders to encourage their data analysts to get hip to cloud-based analytics tools if they aren’t already, and to seriously consider the bottom-line value of making their organizational data available to the public as a cloud resource. This is pretty progressive advice coming from a large analyst firm, and some evidence suggests it’s right on.
The two trends — cloud computing and big data — already appear to be coming together. A third-quarter 2010 Forrester survey has cloud adoption picking up significantly in 2011, particularly among large enterprises and high-tech companies. These are prime targets for cloud-based data tools, as well, because they likely have large amounts of data to analyze, some of which might actually be valuable to other organizations. In fact, these are often the same types of organizations cited as early adopters of Hadoop, NoSQL and other products targeting big data.
Staten and Nelson highlight a number of tools, including Amazon Elastic MapReduce, Splunk and GoodData, that can help these types of organizations, or any other organization so inclined, get started with cloud-based analytics tools and reap the rewards of early adoption. As the authors point out, cloud-based offerings are often simple, low-cost ways to get familiar with advanced analytics, frequently in a customized manner. Specifically, they offer this advice to I&O teams:
Here’s another empowerment opportunity. If your information and knowledge management professionals aren’t considering these solutions, introduce them to the concept and demonstrate your proactive willingness to help them get started. If cost is a significant barrier to BI, consider following the 12,000 enterprises, including British Telecom, USDA, Tata Communications, and QED Financial Systems, that have already downloaded Jaspersoft, an open source BI solution, as a starting point.
Jaspersoft, coincidentally, just announced technology integrations with just about every noteworthy big data product on the market, including Hadoop and a large number of NoSQL databases.
Staten and Nelson also advise organizations to consider whether they could make money from their data by making it publicly available, à la the Associated Press, Dun & Bradstreet and Esri. Although placing any proprietary data into a public data marketplace (Windows Azure DataMarket or InfoChimps, for example) is a decision beyond the realm of IT, IT teams can play an important role in the process by developing self-service cloud portals to access the data, as well as by ensuring proper security protocols, permissions and the like are in place.
I’m not convinced that many organizations, especially large businesses, are keen on giving away any advantage derived from their analytical efforts, but I think the proposition could be compelling. Especially if a data set has already served its primary purpose, there might be relatively little harm in making it available to others that might be interested in it for entirely different purposes, and that might be willing to pay for it. I’m thinking specifically of a conversation I had with IBM last year, in which CTO for Emerging Internet Technologies David Boloker told me about an IBM analysis of 10 years’ worth of patents that could be a valuable resource for other organizations.
The rest of the report’s predictions focus on more traditional cloud computing concerns such as delivery models, standards and security, but it’s the advice on taking data analysis to the next level that steals the show. It’s great to see even analyst firms with largely prudent clients pushing them not only to step up their data efforts, but to do so in the cloud.