Cloud computing — where mega-data centers serve up webmail, search results, unified communications, or computing and storage for a fee — is top of mind for enterprise CIOs these days. Ultimately, however, the future of cloud adoption will depend less on the technology involved and more on strategic and economic factors.
On the one hand, Nick Carr, author of “The Big Switch,” posits that all computing will move to the cloud, just as electricity — another essential utility — did a century ago. As Carr explains, enterprises in the early industrial age grew productivity by utilizing locally generated mechanical power from steam engines and waterwheels, delivered over the local area networks of the time: belts and gears. As the age of electricity dawned, these were upgraded to premises-based electrical generators — so-called dynamos — which then “moved to the cloud” as power generation shifted to hyper-scale, location-independent, metered, pay-per-use, on-demand services: electric utilities. Carr’s prediction appears to be unfolding, given that some of the largest cloud service providers have already surpassed the billion-dollar milestone.
On the other hand, at least one tech industry luminary has called the cloud “complete gibberish,” and one well-respected consulting group is claiming that the cloud has risen to a “peak of inflated expectations” while another has found real adoption of infrastructure-as-a-service to be lower than expected. And Carr himself does admit that “all historical models and analogies have their limits.”
So will enterprise cloud computing represent The Big Switch, a dimmer switch or a little niche? At the GigaOM Network’s annual Structure conference in June, I’ll be moderating a distinguished panel that will look at this issue in depth, consisting of Will Forrest, a partner at McKinsey & Company and author of the provocative report “Clearing the Air on Cloud Computing”; James Staten, principal analyst at Forrester Research and specialist in public and private cloud computing; and John Hagel, director and co-chairman of Deloitte Consulting’s Center for the Edge, author and World Economic Forum fellow. We may not all agree, but the discussion should be enlightening, since ultimately, enterprise decisions and architectures will be based on a few key factors:
Drivers and Barriers: These can include enhanced flexibility and agility from on-demand, scalable resources; reduced total cost via optimal hybrid solutions; accelerated time-to-market via ready-to-ware applications and innovation platforms; application architecture constraints and requirements; security and more.
Development and Deployment Options: Solutions can be built in-house from scratch; use pre-built software on owned, dedicated infrastructure; go all cloud; be outsourced; or combine such approaches.
Metrics and Models: A true apples-to-apples comparison of financials, risk, compliance, customer experience and competitiveness is tricky. Among the factors to account for are sunk costs and depreciated assets; migration costs; the marginal capital investments or ancillary costs required to implement and transition to robust solutions; and power, cooling, space, management, administration, certification and training. Stair-step effects — where new capacity is provisioned in large blocks that are at first underutilized, or where the need for one additional quantum of compute capacity drives construction of an entire new data center — complicate things further.
Trends: Differences between enterprise and cloud cost structures will shift over time based on competitive intensity, scale economies and learning curve effects, as well as technology and best practices diffusion. There is a widespread belief that larger cloud providers have dramatic scale economies, but these may be illusory or unsustainable, since the same building blocks — servers, storage, automation tools, even containerized data centers — and potentially the same options (such as optimal site location) may be available to both enterprises and cloud providers.
Demand: Variable and unpredictable customer demand — due to macroeconomic factors, bullwhip effects in the supply chain, and fads and floods — impacts the total cost/benefit equation, while the bottom-line benefits of pay-per-use services vary based on demand curve differences.
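The interplay between stair-step provisioning and metered, pay-per-use pricing can be sketched numerically. The figures below — demand levels, block size, block cost and unit price — are purely illustrative assumptions, not data from any provider:

```python
import math

# Hypothetical monthly demand, in compute units (illustrative numbers only).
demand = [40, 55, 70, 120, 60, 45, 200, 80, 50, 65, 150, 90]

BLOCK_SIZE = 100   # in-house capacity is added in blocks ("stair-step")
BLOCK_COST = 1000  # monthly cost per provisioned block, used or not
UNIT_PRICE = 12    # pay-per-use price per compute unit consumed

# In-house: must provision for the peak, in whole blocks, all year.
blocks = math.ceil(max(demand) / BLOCK_SIZE)
in_house_total = blocks * BLOCK_COST * len(demand)

# Pay-per-use: pay only for the units actually consumed each month.
cloud_total = UNIT_PRICE * sum(demand)

print(f"In-house (peak-provisioned): {in_house_total}")  # 24000
print(f"Pay-per-use (metered):       {cloud_total}")     # 12300
```

With spiky demand, peak-provisioned in-house capacity sits idle most months and pay-per-use wins; rerun the same arithmetic with a flat demand curve and the gap narrows or reverses — which is exactly why demand variability belongs in any make-versus-buy comparison.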
Consider all of these factors together. Nick Carr’s predictions will likely be realized in cases where a public cloud offers compelling cost advantages, enhanced flexibility, improved user experience and reduced risk. On the other hand, if there is a high degree of technology diffusion for cloud enablers such as service automation management, limited cost differentials between “make” and “buy,” and relatively flat demand, one might project a preference for internal solutions, comprising virtualized, automated, standardized enterprise data centers.
It may be overly simplistic to conclude that IT will recapitulate the path of the last century of electricity; its evolution is likely to be far more nuanced. Which is why it’s important to understand the types of models that have already been proven in the competitive marketplace of evolving cloud offers — and the underlying factors that have caused these successes — in order to see more clearly what the future may hold. Hope to see you at Structure 2010.