
Store this one away in your “Grass is always greener” file: The two companies that make the brains found in today’s computers, Intel and AMD, are both pushing hard to get into graphics, just as the top graphics chip maker, Nvidia, is aiming squarely at the CPU space. It’s not an identity crisis so much as a testament to how important graphics have become in the consumer computing experience — and how much money can be made crunching numbers on the corporate side.

It’s also a sign of the end of the discrete graphics processor, the kind found on a separate card plugged into high-end machines. In order to survive, Nvidia needs to find an end market that values graphics processors for something beyond graphics, or push them into compute-intensive applications in hopes of relegating x86 chips to running the OS and nothing else.

The battle between Intel and AMD has raged for years. When AMD purchased ATI Technologies back in 2006, the plan was to amp up AMD’s processors by integrating a graphics processor and a CPU on a single chip. Project Fusion, as it’s known, was scheduled to start turning out its first chips by late 2009, but since AMD’s CTO just walked, who knows whether Fusion will become as snakebit a project as Barcelona. In the meantime, AMD is settling for integrating a graphics processor alongside the CPU on the motherboard.

Meanwhile, Intel is scratching the graphics itch with Larrabee, a many-core processor designed to compete with Nvidia’s graphics chips. The Larrabee chips are due out in late 2008 or early 2009.

Nvidia’s hop over the fence is a little more novel, and certainly worth noting. It’s no secret that graphics chips can perform a helluva lot of computations to deliver the ultimate in 3D gaming, but that same power can be harnessed for crunching numbers or running simulations. To that end, Nvidia last year launched a technology called CUDA, which lets developers build programs that run on graphics processors using the familiar C programming language instead of more esoteric graphics programming languages.
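To give a flavor of what that means in practice, here’s a minimal sketch of a CUDA program that adds two arrays on the GPU. This is an illustrative example, not code from Nvidia’s materials; the kernel name and sizes are made up, and error checking is omitted.

    // Minimal CUDA sketch: add two float arrays on the GPU.
    // Illustrative only; kernel name and sizes are arbitrary.
    #include <cuda_runtime.h>
    #include <stdio.h>
    #include <stdlib.h>

    // Each GPU thread handles one pair of elements.
    __global__ void vecAdd(const float *a, const float *b, float *c, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            c[i] = a[i] + b[i];
    }

    int main(void) {
        const int n = 1 << 20;                 // one million elements
        const size_t bytes = n * sizeof(float);

        // Ordinary C on the host side.
        float *a = (float *)malloc(bytes);
        float *b = (float *)malloc(bytes);
        float *c = (float *)malloc(bytes);
        for (int i = 0; i < n; i++) { a[i] = 1.0f; b[i] = 2.0f; }

        // Allocate GPU memory and copy the inputs over.
        float *da, *db, *dc;
        cudaMalloc((void **)&da, bytes);
        cudaMalloc((void **)&db, bytes);
        cudaMalloc((void **)&dc, bytes);
        cudaMemcpy(da, a, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(db, b, bytes, cudaMemcpyHostToDevice);

        // Launch enough 256-thread blocks to cover all n elements.
        int threads = 256;
        int blocks = (n + threads - 1) / threads;
        vecAdd<<<blocks, threads>>>(da, db, dc, n);

        // Copy the result back; this also waits for the kernel to finish.
        cudaMemcpy(c, dc, bytes, cudaMemcpyDeviceToHost);
        printf("c[0] = %f\n", c[0]);           // expect 3.0

        cudaFree(da); cudaFree(db); cudaFree(dc);
        free(a); free(b); free(c);
        return 0;
    }

The appeal is plain: a C programmer can spread a loop across thousands of GPU threads without ever touching a shader language.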

With that move, Nvidia put Intel and AMD on notice. Today Nvidia’s CEO fired the first shot by introducing what the company calls “The World’s Most Affordable Vista Premium PC,” a low-cost platform pairing an Nvidia graphics processor with a lower-end CPU from Via Technologies. Nvidia isn’t only snubbing Intel; it’s trying to prove that PC buyers can get by with merely functional CPUs, and that high-performance tasks can be entrusted to a graphics processor.

It’s a bold move, but if it works, Nvidia will have upended several decades of chip design. And it will likely take its success all the way to the bank.

Photo from Nvidia

  1. Perhaps, especially for certain HPC apps, but it’s not quite as trivial as Nvidia would like you to think. As Joe Landman often points out, there are several challenges.

  2. Larrabee will kill Nvidia before Nvidia kills x86.

  3. Can Nvidia Kill the x86 Architecture? » Lilu Drivers Blog Saturday, April 12, 2008

    [...] James Allan Brady: [...]

  4. Hahaha, John, you’re a funny guy. Do some research, friend, and you will see that performance comparisons show a more than 300% increase when tasks that typically run on a CPU (ripping CDs/DVDs, archiving files, etc.) are run on a GPU instead. And this is early in development, so y’know, it can only get better. Shameless Intel fan.

  5. Top Posts « WordPress.com Saturday, April 12, 2008

    [...] Can Nvidia Kill the x86 Architecture? Store this one away in your “Grass is always greener” file: The two companies that make the brains [...]

  6. Interesting strategy by Nvidia. They have always made a strong case for putting more computing power into graphics processing, and we’d agree with that for some time to come (gaming, 3D, etc.).

    The fact that many tasks can be done faster with special-purpose chips is nothing new. (Remember the Intel 8087 math co-processor?)

    It’s not that the x86 market is going away, but I think most would agree the market could use a little disruption. If Nvidia can stir the pot a bit, everyone would win, probably even Intel, thanks to new applications.

  7. Comment on Can Nvidia Kill the x86 Architecture? by Kris Tuttle » Lilu Drivers Blog Sunday, April 13, 2008

    [...] Stacey Higginbotham: [...]

  8. Stacey Higginbotham Monday, April 14, 2008

    You guys are right, the x86 market won’t die, but somehow the headline, “Can Nvidia Marginalize the x86 Architecture?” sounded so lame.

  9. Graphics Processors Grow Up, Go Corporate – GigaOM Friday, May 16, 2008

    [...] Nvidia doesn’t actually want to kill CPUs so much as have its GPUs shoulder some of the load in corporate data centers that are providing [...]

  10. Supercomputers: Now Less Super, More Computer – GigaOM Tuesday, June 17, 2008

    [...] trend in the high performance computing world, with players such as Nvidia bragging about its ability to crunch scientific data faster than general purpose [...]
