It’s been about three years since Amazon made its risky bet on delivering computing and storage via the cloud. It started by offering commitment-free, pay-as-you-go storage, enabling startups to start scaling their businesses without significant investment in capital equipment. It later added compute cycles to its services and today has a host of other offerings, including a content delivery network.
Both the hype and the perceived value around cloud computing have expanded since that first shot was fired, but enterprises remain cautious. What’s becoming clear is that the best way to get them to join the cloud revolution is to introduce private or internal clouds for corporate IT and then gradually merge or offload data from those private corporate clouds into public ones. In other words, for cloud vendors, the big opportunities in the cloud space are in helping enterprise customers deploy their own internal clouds (as Elastra and the larger vendors do), helping them manage multiple clouds, and figuring out how to transfer data between internal and external clouds.
Earlier this week I moderated a panel at the South by Southwest Interactive festival in Austin, Texas, that included Werner Vogels, CTO of Amazon Web Services. He said some of his enterprise clients are using the cloud for testing and developing software, while others, such as Eli Lilly, use it for high-performance computing. But many, he acknowledged, still need convincing.
Yousef Khalidi, a distinguished engineer at Microsoft helping to build the software giant’s Azure cloud platform, had a cautious assessment of how enterprises are likely to view the cloud: He thinks corporate IT will want to keep its cloud under corporate control, and that as a result hybrid clouds, or “cloudbursting,” in which certain jobs move from internal servers to an external cloud, will be the preferred model.
Most of the large companies that have enterprise customers agree that while security and regulatory compliance issues can be dealt with, there are legal hurdles that require a company to know where its data is physically stored.
“Solving security is a trust issue that can be surmounted, but the legal issues around location cannot be,” said Scott McClellan, VP and chief technologist of scalable computing at HP, during an interview. “There are also items, such as corporate data for financial results during a quiet period, that aren’t going to leave the enterprise walls.”
Inside those walls, HP and companies such as Elastra, Sun Microsystems and IBM are pitching highly virtualized and automated environments that mimic the agility of the public clouds. However, everyone I’ve spoken with at enterprise-oriented companies believes their customers should start turning over some computing tasks to external clouds, be they infrastructure services such as Amazon’s or Rackspace’s new Cloud Servers business, Sun’s planned cloud, or platforms such as Microsoft’s Azure.
Many believe enterprise customers will source their computing to multiple clouds, both to avoid vendor lock-in and because some clouds will be optimized for certain types of computing tasks. That’s why tools to manage multiple clouds will be important. RightScale, Aptana, Sun and others are all trying to help manage multiple clouds.
And once an enterprise sends data out to external clouds, it will need to find ways to manage, secure and actually deliver that data from inside the corporation to a cloud. Some providers, like Rackspace and Voxel, are banking on customers using their hosting and cloud products as a way to keep the data inside the same company (and maybe the same data center). Vogels says Amazon uses VPNs with enterprise clients, while startups such as Aspera are creating private highways to deliver data between clouds.
So while enterprises reach for the clouds, they’ll keep some of their IT firmly on the ground. That gives plenty of companies opportunities to deliver and manage data as it moves between cloud and corporate data center.
This article also appeared on BusinessWeek.com.