Bursting the Cloud Bubble: 5 Reasons It's Not Just Hype

With all the hype about cloud computing, it's easy to label it as the latest fad, especially when everyone whose application talks to the Internet is trying to rebrand themselves as a cloud. But the long view shows that this really is an important change, one of several major shifts in computing that have taken place over the last 40 years, each of them driven by costs and shortages.

Once upon a time, computing was expensive. As a result, programmers carried their stacks of punched cards into basements late at night, and ran them on the mainframe. The CPU was always busy; humans were cheap.

When computing became cheap, bandwidth and storage remained expensive. The CPU was idle, but the links were full. This gave us the PC and client-server architectures. A wide range of clients on a variety of networking protocols kept things complicated, and WAN prices meant most network traffic was local.

Eventually, we settled on browsers, HTTP and TCP/IP. This was web computing, with a simple, standard edge and a tiered core. Client-side broadband access and persistent storage were relatively cheap. (Don’t believe they’re cheap? Go into an enterprise and you’ll find their networks and storage systems have plenty of extra capacity. The same is true for the Internet — if you ignore the impact of spam and P2P traffic.)

Now here’s the cloud. It’s driven by five big things, none of which are hype, and all of which are changing the way we compute.

  1. Power and cooling are expensive. Today, it costs far more to run computers than it does to buy them in the first place. To save on power, we’re building data centers near dams; for cooling, we’re considering using decommissioned ships. This is about economics and engineering.
  2. Demand is global. Storage itself may be cheap, but data processing at scale is hard to do. With millions of consumers using a service, putting data next to computing is the only way to satisfy them.
  3. Computing is ubiquitous. We’ve lost our desktop affinity. Most of the devices in the world that can access the Internet aren’t desktops; they’re cell phones. Keeping applications and content on a desktop isn’t just old-fashioned — it’s inconvenient.
  4. Applications are built from massive, smart parts. Clouds give developers building blocks they couldn't build themselves, from storage to authentication to friend feeds to CRM interfaces, letting coders stand on the shoulders of giants (see the short sketch after this list).
  5. Clouds let us experiment. By removing the cost of staging an environment, a cloud lets a company try new things faster. This is also true of virtualization in general, but by billing on demand, the cloud lets anyone experiment.
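
To make point 4 concrete, here is a minimal sketch (not from the original post) of what one of those building blocks looks like in practice: a few lines of Python against a hosted storage service, here assumed to be Amazon S3 via the boto3 library. The bucket and object names are invented for illustration; the point is that durability, replication and capacity planning are someone else's problem.

```python
# A minimal sketch of using a cloud storage building block (Amazon S3 via
# boto3). Bucket and key names are hypothetical; credentials are assumed to
# come from the environment or an attached IAM role.
import boto3

s3 = boto3.client("s3")

# Store a file without owning any disks, RAID arrays, or backup jobs.
s3.put_object(
    Bucket="example-reports",
    Key="reports/summary.txt",
    Body=b"Quarterly summary goes here.",
)

# Read it back from anywhere with network access.
obj = s3.get_object(Bucket="example-reports", Key="reports/summary.txt")
print(obj["Body"].read().decode("utf-8"))
```

The same pattern applies to the other building blocks mentioned above: authentication, feeds and CRM interfaces are consumed as services behind an API rather than built and operated in-house.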

This truly is a fundamental change in computing, even if the term has been diluted by marketing agendas. We have to be careful not to throw the innovation baby out with the cloud hype bathwater.
