
Why Computing's Future Is Graphic


Two almost contradictory pieces of news came out today that prove that the next wave of computing is visual. Powerful graphics were once the province of heavy industry, used for 3-D or seismic modeling, but in today’s world of digital everything and the coming 3-D web, rich graphics are becoming a need-to-have capability on every machine.

That emphasis on graphics is driving a rise in GPU shipments: more than 111 million GPUs shipped in the third quarter of 2008, up 17.8 percent from the previous quarter, the highest quarter-over-quarter growth rate in six years, according to Jon Peddie Research. That compares with 91 million units shipped during the same period last year. Nvidia (s NVDA), Intel (s INTC) and AMD (s AMD) are the top makers of GPUs.
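For readers keeping score, the 17.8 percent figure is quarter-over-quarter; the unit counts in the report imply a different year-over-year rate. A quick back-of-envelope check, using only the shipment numbers cited above:

```python
# Shipment figures as reported by Jon Peddie Research.
q3_2008 = 111_000_000  # GPUs shipped in Q3 2008
q3_2007 = 91_000_000   # GPUs shipped in Q3 2007

# Year-over-year growth implied by the two unit counts
# (distinct from the 17.8% quarter-over-quarter figure).
year_over_year = (q3_2008 - q3_2007) / q3_2007 * 100
print(f"Year-over-year growth: {year_over_year:.1f}%")  # prints "Year-over-year growth: 22.0%"
```

So shipments grew roughly 22 percent against the same quarter a year earlier, even stronger than the sequential number.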

While the sales of specialty graphics chips are growing, one company that focused solely on making specialized graphics-rendering systems is shifting away from its niche hardware. Silicon Graphics Inc. (s SGIC) said today that it would move away from specialized graphics hardware and build software that can better render graphics on personal computers and servers. SGI, which had filed for bankruptcy in 2006, today launched a series of products aimed at rendering and delivering large and compute-intensive graphics files to x86 servers, mobile phones and laptops.

This shift is partly a response to consumer GPUs cornering SGI's market for high-end graphics, with off-the-shelf PCs now mimicking the functionality of its specialized systems. In the computing market, yesterday’s wants become today’s necessities; graphics are in the middle of that transition.

12 Responses to “Why Computing's Future Is Graphic”

  1. Stacey Higginbotham

    Adam, GPUs are running more mainstream programs today than just hard-core games, making them and their abilities more apparent (and important) to the average user. Notebooks are also on the rise and replacing desktops. The convergence of these two trends means more notebooks containing GPUs are selling. As for my categories such as heavy industry, I should have included gaming. My bad.

  2. What products did Nvidia build their business on? What market did AMD build their graphics business on? For the last *ten years*, what has been the annual spend on hardware Graphics-only solutions for core industries – “heavy industry”, and “seismic modelling” that you point out, or the other traditional ones of “physical and chemical modelling” and “computer games”?

    Perhaps, when talking in your linked articles about how all these apps now run on the GPU instead of the CPU, you should ask Intel for *how many DECADES* this has been true or mostly true already? This is nothing new. It’s not in any way a change from 5+ years ago. As long ago as Windows 3.1 (hint: that was in the early 1990’s) we had hardware acceleration for window-drawing primitives to support these kinds of apps.

    Just looking at the report you reference, the growth came, basically, entirely from Notebooks. Nothing to do with “3D web”. That could be notebooks that previously shipped without GPUs being shipped with GPUs. Or maybe it’s that Notebook sales have increased hugely recently? Maybe it’s a direct side-effect of the fact that Apple saw an amazing 20 per cent market share in notebooks during July and August – and that Apple notebooks *all* have GPUs?

    It might be interesting to, you know, look at what has changed that means that notebooks now have GPUs?

    Or maybe to look at how much of the observed bump comes from Intel changing the habits of a lifetime and starting to ship notebook GPUs that actually support core computer-games graphics features from 2-3 years ago, instead of only supporting ones from 7-10 years ago (which they have been consistently doing until very recently)?

    Or maybe it’s simply that the GPU providers have felt the need to diversify away from the (now saturated) desktop GPU business, and have in the last few years put more of their money into making power-efficient/heat-efficient GPUs which are finally viable in notebooks? Because again this is a recent change.

    Personally, such a massive, *sudden*, leap strikes me as caused more by a single product / family change than anything else.

    So, I’m sorry: this article seems to miss the obvious questions and conclusions entirely. I don’t know what the real causes are, but at least I’m not going around drawing arbitrary conclusions plucked out of thin air that diverge radically from the “most likely” explanations.