Intel blew out its fourth-quarter and full-year 2010 financial results last week, which inspired one technology and business blogger to ask whether Intel’s incredible growth (or microprocessor growth in general) will continue next year and in the decades to follow. Andrew McAfee, a principal research scientist at the Center for Digital Business at the MIT Sloan School of Management and a fellow at Harvard’s Berkman Center for Internet and Society, graphed the trend to answer these questions, and noticed that as computers got cheaper, companies kept spending more on them. McAfee’s blog post (you should read it) draws on the ideas of William Stanley Jevons, who argued that greater energy efficiency leads to higher aggregate consumption rather than less, to argue that the demand for more compute will continue for a long time to come.
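That pattern — unit costs fall, yet total spending rises — is what economists call price-elastic demand. Here is a minimal sketch of the arithmetic; the constant-elasticity demand curve and the elasticity value of 1.5 are my own illustrative assumptions, not figures from McAfee’s post:

```python
# Toy Jevons-style illustration: if demand for compute is price-elastic
# (elasticity > 1), total spending RISES as the unit cost of compute falls.

def spending(unit_cost, elasticity, base_cost=1.0, base_units=100.0):
    """Constant-elasticity demand: units bought scale as (cost ratio)^-elasticity."""
    units = base_units * (unit_cost / base_cost) ** -elasticity
    return unit_cost * units  # total spend = price per unit * units bought

# Unit cost halves each step (a Moore's-Law-style decline), elasticity = 1.5.
for step in range(4):
    cost = 0.5 ** step
    print(f"unit cost = {cost:.3f}  total spend = {spending(cost, 1.5):.1f}")
```

With these assumed numbers, each halving of cost multiplies units consumed by 2^1.5 (about 2.8x), so total spending grows roughly 1.4x per step — the Jevons effect in miniature. With an elasticity below 1, the same code would show spending shrinking instead.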
Jevons recently got big play in a New Yorker article arguing that greater overall energy efficiency won’t help solve our energy usage problems, but will instead open up more ways to use the fruits of low-cost energy, driving overall consumption higher. Perhaps as a society that loves to consume without guilt, we’re just really attuned to his message. Anyhow, in applying it to computation, McAfee states:
Rather than wading into the energy debate, though, I want to wonder out loud if computation is like energy. Both are necessary inputs to many productive activities. Both are consumed by every company in every industry. Both are amplifiers of human ability: energy amplifies or replaces our muscles; computation does the same for our brains and senses. Both come from physical things, yet are themselves ethereal; you can hold a lump of coal or a transistor in your hand, but not a joule or a megaFLOP. And the devices that generate both are getting more efficient over time.
If the analogy is a tight one, if Moore’s Law continues to hold true, and if Jevons was right about energy-intensive processes, then one conclusion seems inescapable: the trends visible in both of the graphs above are going to continue. Computers are going to keep getting cheaper, and aggregate demand for them is going to continue to rise.
I can’t argue with that, but I can argue that the beneficiaries of continued compute demand will change over time. Much as the beneficiaries of cheaper, more abundant energy shifted from railroads and transport companies to makers of home appliances and air conditioners, the beneficiaries of lower-cost computing will shift too.
Intel’s x86 chips may give way to ARM (s armh) or GPUs or something completely different, and I’d also extend the argument to radios and the other silicon that enables connectivity, since I fundamentally believe that computation and connectivity will be forever linked. That means processors and radios will always be in demand (even if the purveyors and even the form factors change), as will the hardware used in networking, be it fiber or base stations.
Related content from GigaOM Pro (subscription req’d):
- Supercomputers and the Search for the Exascale Grail
- Pushing Processors Past Moore’s Law
- Nvidia Arms Itself: Who Has Most to Lose