
To determine the future of computing, it may be worth taking a look at the past — if only to find out that there are no simple answers.

Bryan Cantrill of Joyent at Structure 2014

It may seem like everyone and their mother is using Amazon’s public cloud these days, but Joyent CTO Bryan Cantrill recently had a conversation with a big enterprise company where the mere mention of Amazon resulted in stone-faced silence. Cantrill initially didn’t understand this stark response, and then he remembered the business that company was in: retail.

Cantrill recalled this episode at Gigaom’s Structure conference in San Francisco on Wednesday, where he gave a presentation about the future of computing. The key takeaway: there is no all-encompassing drive toward the cloud. Instead, plenty of companies prefer to keep their data on-premises, while others mix and match.

Turns out that’s not exactly a new trend. “If you look at the history of computation, it is one of oscillation,” Cantrill said, elaborating that computation has been going back and forth between centralization and decentralization for the last fifty or so years, from the vision of utility computing in the sixties to federated systems to giant mainframes to small PCs running Linux.

Cantrill predicted that this oscillation will continue with regard to cloud versus on-premises computing. “We are pointed towards a heterogeneous future,” he said, adding that decentralization can be good for disrupting pricing, but that companies increasingly also want control over their data rather than handing it over to their competitors or other parties. “If you have too much freedom, it’s called anarchy,” he quipped.

Photo by Jakub Mosur

