Updated at the end: The way we use computers is changing, as device makers and users emphasize mobility and incredible graphics. I’ve argued that these trends signal the end of x86 computing, but what I’ve ignored is Intel’s (s INTC) drive to bring its brand of x86 computing to these markets, which are traditionally based on other instruction sets. If it succeeds, we may see Intel inside everything from our mobile phones to our set-top boxes.
Rivals such as ARM (s ARMH), which licenses its intellectual property to a variety of chip firms that build application processors for mobile phones, and Nvidia (s NVDA), which is helping developers write code to run more applications on its graphics processing chips, are showing how little the CPU matters when it comes to popular new computing paradigms such as delivering HD video or controlling battery-powered smartphones.
However, Intel (s INTC) is also expanding into graphics processors and low-power computing for wireless devices and embedded systems with x86-based chips. Atom is a low-power x86 chip that Intel will use to duke it out against ARM in low-power and embedded systems, and Larrabee will be an x86 GPU that will compete with graphics chips from AMD (s AMD) and Nvidia.
Broadpoint AmTech analyst Doug Freedman thinks Intel can do it, especially given the chipmaker’s aggressive push into these new markets — mobile phones, embedded processors and even GPUs. In a note published today upgrading Intel, he said, “We believe long-term fundamentals of multi-year share gains in non-PC markets through Atom-based SoC solutions will enable significant (10 percent or more incremental market opportunity) new revenue opportunities.”
He also points out that Intel’s control of the x86 chips inside computers could hinder GPU rivals once Larrabee comes out, as those GPU makers may not have the appropriate licenses to tie their chips to Intel’s CPUs. But the PC will be just one small (and shrinking) battleground in the fight to keep x86 relevant amid a more mobile, visual and power-sensitive world.
What do you guys think? Can Intel do it?
Update: Peter N. Glaskowsky, a technology analyst for The Envisioneering Group, calls this post preposterous over at C|Net. He argues that Intel can thrive, and that my idea of a post-x86 world is wrong. Perhaps “post-x86” isn’t the most elegant way of summing up the trends of mobility and graphics, but I think it works. I’m also not counting Intel out when it comes to driving its x86 chips into newer markets (and say so in the comments), but I don’t think it’s “preposterous” to have a debate on this topic. Given the thorough response by Glaskowsky, the comments on his post and the comments below, it’s something we should be talking about.