Is a Post-x86 World "Preposterous"?

Over at CNET, Peter N. Glaskowsky, a technology analyst for The Envisioneering Group, is calling my post, Can Intel Thrive In a Post x86 World?, “preposterous.” He argues that Intel can thrive, and that my idea that we are entering a post-x86 world is wrong. Perhaps “post-x86” isn’t the most elegant way of summing up the trends of mobility and graphics, but I think it works. For the record, I’m not counting Intel out when it comes to driving its x86 chips into newer markets (and I said so in the comments on my post), but I don’t think it’s “preposterous” to have a debate on this topic. Given the thorough response by Glaskowsky, those commenting on his post, and on mine, it’s something we should be talking about. Join in here and at CNET!

10 Responses to “Is a Post-x86 World "Preposterous"?”

  1. Stacey, this is an excellent and timely topic for a debate. But what is a post-x86 world, and what isn’t it? Is the post-x86 world a post-Wintel world? Is it the world that respects Moore’s law for retail pricing (and not just for feature upgrades)? Is it the world where finally the “network is the computer” and computers become lean-and-mean (rather than fatter and richer)? Or is it, as can be poorly imagined, “merely” the death of the x86 instruction set?

    Let me start with the easy one.
    The x86 instruction set – This one will live. And prosper. It might not be the computer in your back pocket. But it is “the computer” as in the “network becomes the computer”.

    However, the post-x86 world is likely to shake up the Wintel business model — if and when the people involved digest the new emerging reality. What is going to be the engine of growth? It has been CPU-hungry software, churned out with regularity by Redmond. But will this take a backseat as consumers (and apps) turn to the “cloud”? Will innovation take a new direction — from being apps-oriented in the past to user-experience-oriented? Microsoft Excel has been called the “killer app” on the PC. What would you vote for if there were such a poll today — Twitter? Facebook? Multitouch on the iPod? Yes, this is an apples-and-oranges comparison. The PC software package created a market in the last decade. Now that market exists, but new ones are being created in the cloud. Will Intel, with deteriorating drive and attention from its Siamese twin (Microsoft), play in the new market? And win?

    Note that both Microsoft and Intel will live to see the lights of the next decade — or more. The only question is, quite like COBOL and mainframes that lived to see Y2K but didn’t occupy center stage in the ’90s, will Wintel recede to the back (read “cloud”) while consumers get excited by new handhelds, running power-optimized cores with a browser, Flash and a modern UI?

    I’ll put my money on the following:
    1. The x86 instruction set survives, for a long time to come (Think server farms).

    2. Intel will do its best to reinvent itself as the computing company for the new decade (2010 to 2020). But it will suffer significantly (as it has, in the recent past, with “consumer” technologies) because it fundamentally lacks a credible software partner and device makers for the consumer market (distinct from the PC market, where Intel has succeeded together with, and because of, a Microsoft and an IBM/HP/Compaq/Dell).

    3. Intel’s silicon pricing policy will be confronted with the new reality, where silicon price erosion will not be offset with the help of exponential growth in software or apps. So Intel will focus on new process technologies to retain its edge in the chip industry. And this will be a new niche — a likely cash cow for Intel.

    4. The consumer market will witness an extremely interesting power tussle (or alliances) between companies having credibility in the consumer space (read Nokia, Samsung, Sony, Apple), user experience (read graphics, low power, quick startup, interface technologies, 3D), ad spends, and content/app services plus delivery infrastructures.

  2. Interesting anecdotes, certainly a good debate to have. It may depend on what embedded buyers want. With x86, there are only a few vendors to purchase chips from. There are more buying options for ARM chips, which could lead to cheaper prices. People don’t care about their cell phone chips, leaving ARM and Intel to battle on chip prices.

    In any case, even if x86 takes over mobile, Intel will need to keep competition alive in the form of ARM. With AMD out, Intel and ARM could co-exist in the space for a while. As for laptops and servers, x86 isn’t going anywhere.

  3. farmerwu

    I think your comments were spot-on. Don’t let him bully you. You are absolutely right to question the way the world is. His argument seems to be that Intel and x86 have faced challenges before, therefore they’ll beat the next set of challenges. It’s silly. His argument is preposterous. There is so much more to the debate than instruction set complexity, which is his focus. To name one, the power consumption of Atom is not a simple matter to fix.

    If Intel is so all-powerful, why have they struggled so long to get into wireless (and we are not talking about Wi-Fi)? They’re great at building processors, but everything they have done in the cellular world has ended badly. If x86 had so much going for it, why did they spend all that money buying the business that became XScale, which they then fobbed off on Marvell at a discount? And what about WiMAX? For years, Intel has been telling the world that WiMAX will overtake cellular networks. Suddenly, he would have us believe that Atom is going to be the driving force behind Intel’s resurgence in mobility. Yes, Atom will work with WiMAX, but it seems to me that Intel is hedging its bets with Atom, implying a decreasing amount of confidence in WiMAX.

    I’m not trying to bash Intel; it’s a great company. But even great companies have their limits. New technology can swamp even the biggest giant. If it didn’t, how many of us would even pay attention? That’s what makes the tech industry exciting and thriving.

  4. Jesse Kopelman

    Stacey, I took your post-x86 comment to mean not that something like ARM would take over the existing x86 markets, but that these existing x86 markets would become increasingly irrelevant (from a user-device perspective) as pretty much everything becomes a cloud device. I think there is quite a bit of potential for that to happen. Ten years ago, a lot of people would have said it would be preposterous to imagine a world where a PC is basically useless without a high-speed Internet connection, and yet here we are. Post-x86 is by no means a foregone conclusion, and Intel is doing the right things to forestall such an eventuality, but it is far from preposterous — especially from a user-device perspective. It is important to make that qualifier. x86 isn’t going anywhere on the server side, but the user-device side is really what you are talking about.

    • Stacey Higginbotham

      Jesse, exactly. The server market won’t abandon the x86 CPU (although it may bring in reinforcements for certain workloads) but focusing on x86’s relevance in the consumer markets as devices go mobile and play more video is a good way of framing the question.

  5. Nicholas

    I’m curious how Apple was able to port OS X to the iPhone, even in a limited manner. Acorn did pretty well using such RISC processors in personal computers as well. Is it a matter of scaling processors rather than any one processor?

    I have stated that the future of personal computing, outside of the need for specialized tasks, is going to be both mobile-centric and built on smaller processor architectures. They will collaborate as needed to scale. What comes to mind when you think about such requirements, Intel or ARM? I bet on ARM years ago.

  6. It is mostly “preposterous,” for many of the reasons he states. The ARM architecture is an embedded architecture with many intentional tradeoffs and limits that reflect its markets. It scales down better than x86, but it scales up poorly, whereas modern incarnations of x86-compatible silicon, and particularly AMD64 architectures, scale up very well.

    As silicon gets increasingly efficient, I suspect the pressure will be more on extracting the maximum amount of computational power for a given transistor and power budget, which will ultimately favor x86. Also, ARM has some quirky design characteristics that make it a pain to port some kinds of x86 code bases. Intel/AMD architectures will never push ARM out of the ultra-low-power market because of optimization tradeoffs, but I expect the delineation between those two markets to creep downward over time.