
Summary:

The future of cloud computing is the availability of more computing power at a much lower cost.

Photo: Gigaom Illustration/Steve Jurvetson

Cloud providers Google, Amazon Web Services (AWS) and Microsoft are doing some spring-cleaning, and it’s out with the old, in with the new when it comes to pricing their services. The latest cuts make it clear there’s a new business model driving cloud, one every bit as exponential in its improvement — with order-of-magnitude reductions in pricing — as Moore’s Law has been for computing.

If you need a refresher, Moore’s Law is “the observation that, over the history of computing hardware, the number of transistors on integrated circuits doubles approximately every two years.” I propose my own version, Bezos’s law. Named for Amazon CEO Jeff Bezos, it is the observation that, over the history of cloud, the price of a unit of computing power is reduced by 50 percent approximately every three years.

I’ll show the math below, but if Bezos’s law reflects reality, the only conclusion is that most enterprises should dump their data centers and move to the public cloud, thus saving money. Running your own data center does capture some savings over time from buying hardware subject to Moore’s Law, but it also carries the fixed costs of maintenance, electrical power, cooling, buildings and labor. In the end, I’ll show how cloud prices are reduced by about 20 percent per year, cutting your bill in half every three years.
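To make the compounding explicit (this is simple arithmetic, not an AWS figure), a steady 20 percent annual cut leaves roughly half the price after three years:

(1 - 0.20)^3 = 0.8^3 = 0.512 ≈ 1/2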

How we got here

Google was first to announce “deep” cuts in on-demand instance pricing across the board. To make the point that lower cloud pricing was long overdue, Google’s Urs Hölzle showed in March just how far cloud pricing has lagged Moore’s Law: over the past five years, hardware costs decreased by 20 to 30 percent annually, but public cloud prices fell by just 6 to 8 percent annually:

Slide from Urs Hölzle’s keynote at Google Cloud Live, March 25, 2014

Having watched AWS announce, by my count, 43 price cuts during the past eight years, I find the claim of a mere 6 to 8 percent annual drop for public cloud hard to square with the record. (It would take 43 cuts averaging only about 2 percent each to produce an 8 percent trend line.)

Nevertheless, applying a Moore’s Law-style approach to capture the rate of change for cloud, one would hold the compute unit constant and express the gains as a lower price. Thus, Bezos’s law is the observation that, over the history of cloud, the price of a unit of computing power is reduced by X percent approximately every Y years.
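Written as a formula (my notation, not something the providers publish), the price after t years is roughly:

P(t) = P0 * (1 - X)^(t / Y)

where P0 is the starting price, X is the fractional reduction and Y is the period in years; Bezos’s law claims X = 0.5 and Y = 3.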

A bit of digging on the Amazon Web Services blog turns up EC2 pricing from May 29, 2008, which can be used to pin down the percentage (X) and time period (Y). Comparing that 2008 data with Amazon EC2 Spot Instance pricing on April 1, 2014 shows that, over six years, prices for similar compute instance types declined by roughly 16 percent per year for medium instances and 20 percent per year for extra-large instances. Assuming a straight line, the pricing would have tracked as follows:

AWS cloud price reduction (at a 20% annual cut)

Year            Price    Price after 20% cut    Comment
2008            $0.800   $0.640
2009            $0.640   $0.512
2010            $0.512   $0.410
2011            $0.410   $0.328                 3 years, 50% reduction
2012            $0.328   $0.262
2013            $0.262   $0.210
2014            $0.210                          3 years, 50% reduction from 2011
April 1, 2014   $0.210                          6 years, 75% reduction from 2008

For the AWS public cloud, X = 50 percent when Y = 3 years, supporting my claim: Bezos’s law is the observation that, over the history of cloud, the price of a unit of computing power is reduced by 50 percent approximately every three years.
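If you want to check the table yourself, here is a minimal sketch in Python (my own illustration, not anything AWS publishes); it simply applies the assumed 20 percent annual cut to the $0.80 starting price from 2008.

# Reproduce the straight-line pricing table above: start from the 2008 price
# and apply an assumed 20 percent cut each year.
start_price = 0.800        # $/hour for the 2008 extra-large instance (from the table)
annual_reduction = 0.20    # assumed yearly price cut

price = start_price
for year in range(2008, 2015):
    print(f"{year}: ${price:.3f}")
    price *= 1 - annual_reduction

# After 3 years the price is 0.8**3, about 51% of the original (Bezos's law: ~50% every 3 years);
# after 6 years it is 0.8**6, about 26%, i.e. roughly the 75 percent reduction shown for 2008-2014.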

What’s next

Clearly, cloud, as opposed to building or maintaining a data center, is a much better economic delivery approach for most companies.

And how can an enterprise data center possibly keep up with the hyper-competitive innovation from Amazon, IBM, Google and Microsoft? Enterprising tech pros know how this is going to play out. They’re way ahead in asking: “Why should we continue to saddle our company with a huge cost anchor called a data center or private cloud?”

It looks as though being a cloud provider isn’t going to be like a retail business when it comes to profits, but it may be too early to tell. It’s a bit like the x86 server business IBM recently sold to Lenovo. There will likely be innovation above the core cloud platform for a long time, which might alter the profitability outlook.

Opinions aside, the math doesn’t lie. It’s not a question of if we’re moving to the cloud but how — and when.

Greg O’Connor is CEO of AppZero, which specializes in migrating enterprise software applications to and from cloud computing services. Follow him on Twitter @gregoryjoconnor.

Feature image illustration adapted from Steve Jurvetson/Wikimedia Commons

  1. It would only make sense for enterprise users to move to a cloud provider if they still had economies of scale (minus the vendor’s profit) to attain. And even for smaller enterprises it might still not make sense, as not all other factors are equal (not every enterprise user can afford Gbit Internet connections to their servers, yet they get that for free when their servers are on their premises).

    The cost of cloud computing should track Moore’s law, as the cost of computing is related to the cost of hardware. However, the cost of electricity does not follow Moore’s law, nor does the cost of Internet bandwidth (at least not for small users at the mercy of monopolies or duopolies).

  2. Reblogged this on @stevebanfield and commented:
    50% price reduction every three years — definitely an interesting idea to map against future computing needs for any business, and product planning requirements in datacenter and online service build outs

  3. Cloud Insider Saturday, April 19, 2014

    You can bet that with GCE and AWS going at each other, trying to outdo each other’s low prices, the ROI case for on-premise gets even weaker. Their cost basis is lower than any enterprise’s, and they appear to be willing to live with thin margins. The writing is on the wall for private cloud proponents.

  4. Since we have no visibility into AWS margins, we have no way of knowing whether these price reductions are the result of marketing, competition or technological efficiency. It’s likely a combination of all three, and, as the drop in hardware prices shows, these same forces apply to enterprise IT. Because of this lack of visibility, we also have no way of knowing whether this trend can continue or whether Amazon will simply have to raise prices in the future.

  5. Biznezz Izgoot Saturday, April 19, 2014

    There are still many who believe the core infrastructure that an enterprise relies upon should be in-house and under corporate control.

    1. Your point is the important one. The concept of the cloud has one very serious flaw: control and security. Any company that would delegate its mainstream and fundamental information to a third-party data center is well deserving of the potential and expected problems that will, in time, occur.

  6. Hi Greg, great perspective and I think there’s something to your argument.

    That said, the cost of processing power isn’t the only variable that enterprises consider before moving to the public cloud. Most enterprises (that we work with day to day) are extremely averse to the risk of security breaches and are willing to ‘invest’ more in data centers within their four walls. For them, the potential risk of a hit to the brand is worth the differential compute cost. Take the most recent Target breach: even though Target’s own data center was the one that was hacked, a VP of data we’ve been working with at a Fortune 100 company thinks the private cloud is more secure than the public cloud. “If only they’d invested more money in their private cloud…”

    There’s no doubt about it: the public cloud is becoming more secure every day, à la the bare-metal providers, the single-tenancy offerings, etc. But until the perception of security changes, cost will be a lesser variable in public cloud adoption by enterprises.

    1. John,

      There are many variables for an enterprise to consider when choosing a platform. In the long run the low-cost producer (the public cloud, in this case) will overcome the short-term objections (security). I am sure there are those who still use security to justify running workloads/apps on mainframes.

      Go to where the puck is going to be.

      The trend line is clear, and innovation and competition will ensure it continues. Citing security concerns to justify a private cloud choice is seeing where the puck is today and rationalizing why not to change. Within the next couple of years, most will view private cloud and hybrid cloud as an expensive transition step toward the end game of public cloud.

      GregO

    2. Cloud Insider Sunday, April 20, 2014

      If only Target had been all-in with AWS, the data breach would probably never have happened. Take it from an insider – AWS is absolutely maniacal about ensuring it is secure. The amount of scrutiny on any potential security breach is unreal. Every service has to go through extensive security reviews, and then external pen testers spend weeks trying to find every little hole. Only after all security tests pass is the service allowed to go live. The amount of security review and testing is far more than I saw during my many years working with enterprise data centers before joining AWS. Sometimes it seems a bit much, but security is the global #1 priority for AWS.

  7. Interesting data to read. Thanks.

  8. In concept, the proposed Bezos law sounds good, but needs a lot more underpinning.

    Cost of a “compute unit” might come down by X% over Y years, but thanks to Moore’s law, the need for compute units will keep increasing. Stated differently, the dimensions of this “compute unit” will keep changing. For example, increasing resolution of cameras in phones will lead to higher data plus higher compute needs for social networks. So the number of compute units needed (from an end-customer perspective) to accomplish a certain, well-defined need (such as watching a movie, or posting b’day pics of a kid) keeps changing over time.

    Changing unit dimensions for cloud computing will have to be taken into consideration when formulating the Bezos’ law. It is unfair to argue that changing unit dimensions will likely affect computing costs identically regardless of whether it is a private or a public cloud because scalability costs are *not* identical.

    My own view is that Moore’s law already applies in various fields (see http://en.wikipedia.org/wiki/Moore's_law), well beyond the number of transistors on a chip. It would be great if someone would analyse the data on cloud computing and see if Moore’s law applies almost as-is in this field as well.

  9. Nice read. As shown in the graph, public cloud prices are still much higher than hardware costs and will need to drop further to become competitive on price with the private cloud. How many years do you think that will take?

    Also, what you are suggesting here goes against another view http://searchcloudcomputing.techtarget.com/feature/Cloud-computing-timeline-illustrates-clouds-past-predicts-its-future which says that Hybrid Cloud is almost as popular as Public Clouds, your comments?

    1. Cloud Insider Monday, April 21, 2014

      Public clouds are already cheaper, as hardware is just one component of the cost of running a data center. Other aspects such as land, buildings, power and personnel salaries are not getting cheaper. Hybrid clouds are just transitional, as it is not feasible to go from private to public clouds immediately. Public clouds are the future. Just look at the increasing number of production workloads being moved to public clouds. It makes the best ROI sense, and companies would anyway prefer having others take care of running data centers so they can focus on their core competencies.

        1. Plenty of colo facilities will rent by the U, the rack, or whatever, and using those with commodity hardware for private clouds makes the best ROI sense. Public clouds still have a long way to go to catch up price-wise.

  10. @giridharlv

    I would echo the thoughts and feedback provided by @Cloud Insider. The capital required for building a data center, paying for the electricity and staffing the operations for the enterprise are generally fixed or sometimes escalating costs.

    Look at the innovation Amazon is doing around energy in this YouTube video from 2010: James Hamilton, “Datacenter Infrastructure Innovation”. https://www.youtube.com/watch?v=kHW-ayt_Urk

    The public cloud trend line being slightly slower than Moore’s law is because the total cost includes many components beyond CPU, disk and networking. Public cloud providers are innovating beyond the hardware: they can manage 100,000+ machines per admin, while enterprise ratios are often closer to 400 to 1,000. The long-term cost advantage of building data centers at scale and calling it a public cloud is something enterprises cannot come close to today. Over time the gap will widen, favoring public cloud providers.

    My view is that today the total cost is less with a public cloud approach than with a do-it-yourself data center approach.

    MySpace, Beanie Babies and smoking cigarettes were all popular at one time as well. There is a difference between a fad and a trend.

    GregO

