Google’s (s goog) core philosophy about opening up access to the world’s information is the reason behind the company’s pro-net neutrality stand, the building of its own fiber network and its search for protocols for moving information between cloud providers. Google discussed its information liberation efforts at the search giant’s Atmosphere event held today as part of its efforts to push enterprise adoption of cloud computing.
One of the biggest issues holding back cloud computing is the lack of protocols and a standard vocabulary around moving data from one cloud to another, said Vint Cerf, Google’s chief Internet evangelist. “Inter-cloud interaction is still in the formative stage,” he said. Cerf spoke on a panel at the all-day event held at Google’s headquarters with some 400 attendees representing corporate America. I’m watching via webcast, and my colleague Liz is contributing notes and photos from the event in person.
Cerf likened the cloud today to corporate networks back in the pre-Internet days, explaining that the Internet emerged as the way to bring disparate networks together and to enable folks to move data from one network to another. That’s an especially interesting comparison given that both IBM (s ibm) and HP (s hpq) have said they view the cloud as open insofar as there are already existing protocols such as TCP/IP and HTTP to move data between different clouds.
Cerf didn’t sound satisfied by this, and I don’t imagine he should be, given the security needs of data moving between clouds and the amount of bandwidth such transfers can require. Surely for sharing such large amounts of sensitive data, different protocols that are open and standardized might make sense. Look at companies like Aspera, which offers a proprietary protocol to shift huge volumes of information between data centers.
Cerf also called for ways to move information between clouds that preserve the metadata that makes the data itself useful. If you think of the data as a piece of meat (say, salami), the metadata is the additional ingredients and bread that determine whether the salami ends up as a muffuletta or an Italian sub as it travels from one cloud to another.
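To make the salami analogy concrete, here is a minimal sketch (not any actual inter-cloud protocol — the envelope format, field names, and sample metadata are all hypothetical) of the idea that data and its metadata should travel together, so the receiving cloud can still interpret what it got:

```python
import json

# Hypothetical envelope format: the payload (the "salami") travels in the
# same bundle as the metadata (the "bread and condiments") that gives it
# meaning. Field names here are invented for illustration.
def pack(payload, metadata):
    """Bundle data and metadata into one portable JSON envelope."""
    return json.dumps({"payload": payload, "metadata": metadata})

def unpack(envelope):
    """Recover both the data and its metadata on the receiving cloud."""
    obj = json.loads(envelope)
    return obj["payload"], obj["metadata"]

# Moving a record between clouds this way keeps its context intact.
record = "customer order #1138"
meta = {"content-type": "text/plain", "owner": "acme-corp"}
envelope = pack(record, meta)
data, restored_meta = unpack(envelope)
```

The point of the sketch is only that the metadata survives the round trip; a real inter-cloud standard would also have to agree on a shared vocabulary for what those metadata fields mean.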
In addition to protocols for moving information between clouds, Cerf said the industry needs to keep paying attention to IPv6 as the number of devices connected to the web increases, and it also needs to develop protocols to send information via broadcast, rather than sending everything via a one-to-one unicast connection. Other protocols or standards should focus on authentication and knowing who a person is on the web. Interestingly, none of the Googlers mentioned protocols for protecting data privacy or anonymity.
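A back-of-envelope sketch (my numbers, not from the panel) shows why both points matter — IPv4’s address pool is tiny next to IPv6’s, and unicast delivery costs grow linearly with the audience while broadcast does not:

```python
# 1. IPv6: the address space has to grow with the number of connected devices.
ipv4_addresses = 2 ** 32     # ~4.3 billion addresses, already under strain
ipv6_addresses = 2 ** 128    # vastly larger; enough for the device explosion

# 2. Unicast vs. broadcast: a toy cost model of transmissions on the wire.
def unicast_cost(n_subscribers):
    """One-to-one delivery: one transmission per subscriber."""
    return n_subscribers

def broadcast_cost(n_subscribers):
    """One-to-many delivery: a single transmission reaches the whole group."""
    return 1
```

For a million subscribers, the unicast model sends a million copies of the same bytes; a broadcast-style protocol would put one copy on the wire, which is the efficiency Cerf was pointing at.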
The panel also touched on non-cloud issues such as the importance of net neutrality, with Cerf reiterating that Google isn’t calling for every packet to be treated the same, but rather making sure the owners of the pipe don’t behave anticompetitively toward content flowing over their pipes. Prioritizing the flow of information for legitimate network management reasons is fine, but blocking content to stifle competition isn’t.
It wasn’t just Cerf speaking. Alan Eustace, SVP of engineering & research, and Jeff Huber, SVP of engineering, also shed light on Google’s plans. When asked about native or platform-specific apps vs. the browser, Huber said, “The app model doesn’t scale well across different devices, and that’s why the browser and HTML 5 is important.”
He also discussed a new native client that Google is developing that allows web apps to run at native speeds while keeping those apps partitioned off from all of the resources of the hardware, saying it will “raise the bar on what a web app will do.”
Finally Cerf and Huber explained why Google is building out its experimental fiber network to bring 1-gigabit-per-second speeds to 50,000-500,000 Americans. Simply put, Google needs data. “What does [a fiber network buildout] take technologically, and what does it cost not only to deliver it but to maintain it?” Cerf said. “Our business model isn’t to replicate that all over the world, but to understand it.” Later he added that Google might be able to bring new knowledge to the table, something that could help drive innovation in broadband (GigaOM Pro sub req’d).
Google’s search for data and the trek to catalogue information can obviously disrupt entire industries, but it’s clear that the company is stepping up to take a greater role when it comes to cloud computing and hosted applications. We’ve seen its effort to get enterprise customers on board, and it sounds like we’re going to see it attempt to drive standards as well. As it does so, Huber offered a reminder to those in the Atmosphere audience: “The more fundamental or structural thing is our commitment to openness…it’s your data, not something we’re trying to capture and keep.”
I’m not sure that’s a message that’s getting through to some people so far, especially given Google’s reliance on proprietary code for its Google Apps platform to enable programs to scale across its infrastructure. But perhaps for enterprise customers, Google is saying enough of the right things to drive interest and eventual adoption. We’ll have to see.