Intel's Larrabee Aims to Take on Nvidia and AMD

Last week, Intel offered up a sneak peek of its Larrabee graphics processor, due out in 2009 or 2010 and guaranteed to raise the competitive pressure on graphics chip makers Nvidia and AMD. Unlike Intel's existing integrated graphics chips, Larrabee will be a standalone processor, but don't expect it to be a success.

As computing has demanded faster chips, Intel and other CPU makers have added more cores, a tactic GPU makers have used for years to increase parallel processing. GPUs from Nvidia contain as many as 240 cores, while those from AMD (whose graphics business came from its 2006 acquisition of ATI Technologies) also number in the hundreds. So for parallel workloads, they're faster.

But they're also harder to program, something Nvidia is trying to address with more flexible chips and a programming toolkit called CUDA. Most enterprise and consumer software, however, runs on x86 chips and needs adaptation to take advantage of GPUs. Intel's Larrabee chip has multiple cores but is not a GPU. Intel claims it offers the performance gains and graphics-rendering ability of a GPU, while Larrabee's x86 architecture allows for easier programming.
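To give a sense of what that "adaptation" involves, here is a rough, illustrative sketch (not from the article, and the function names are just examples) of scaling a vector the CUDA way: the ordinary x86 loop has to be rewritten as a kernel in which each GPU thread handles one element, with data explicitly copied to and from the graphics card.

```cuda
#include <cuda_runtime.h>
#include <stdio.h>

// On x86 this would be: for (i = 0; i < n; i++) y[i] = a * x[i] + y[i];
// In CUDA, each thread computes a single element of the result.
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main(void) {
    const int n = 1 << 20;
    size_t bytes = n * sizeof(float);

    // Host (CPU) buffers.
    float *hx = (float *)malloc(bytes);
    float *hy = (float *)malloc(bytes);
    for (int i = 0; i < n; i++) { hx[i] = 1.0f; hy[i] = 2.0f; }

    // Device (GPU) buffers; data must be copied across explicitly.
    float *dx, *dy;
    cudaMalloc(&dx, bytes);
    cudaMalloc(&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // Launch one thread per element, 256 threads per block.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);

    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);
    printf("y[0] = %f\n", hy[0]);

    cudaFree(dx); cudaFree(dy);
    free(hx); free(hy);
    return 0;
}
```

Intel's pitch for Larrabee is that this restructuring, plus the separate host/device memory bookkeeping, goes away when the many-core part speaks x86 like the rest of the software stack.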

It's nearly impossible to judge a chip until you've seen it in action and tracked whether OEMs want to put it in their devices, but my bet is that Intel can't split the difference with a many-cored CPU and expect it to beat a GPU at its own game. Nvidia and AMD are hoping as much, especially Nvidia, which holds the dominant discrete GPU market share (right behind Intel's integrated chips overall) and wages an almost constant PR battle against Intel on this front. As Nvidia's small but fierce marketing team faces off against Intel's Goliath, grab some popcorn, because it'll be a graphics showdown worth watching.

8 Comments

jct

i believe this will just be a new integrated graphics card, only better, since Intel's current integrated graphics lacks even the average power needed for today's common applications. i mean, even Mac switched to Nvidia. OMG!

AndrewZela

Adding more multi-core compute is not what we need.

Current trends in microprocessor evolution indicate that more and more applications will become memory-bound and be unable to benefit from the ample compute resources offered by the next generation of computers.

This increasing disparity between a system’s compute capabilities and the available memory bandwidth needed to fully utilize that compute is the real problem.

Until that is solved, we’re not addressing the real bottleneck for improving application performance.

Jesse Kopelman

There is certainly a good chance that Larrabee will be a failure, but I don't think you understand what it is. It is not a multicore CPU trying to do the work of a GPU; it is an actual GPU that just happens to be x86 compatible. Will it actually be a good GPU from a price/performance perspective? We'll have to wait and see.

Jesse Kopelman

Sorry, I retract the previous comment. Apparently Larrabee is just a multicore CPU trying to be a GPU. Well, that does seem like a recipe for failure. I wonder if Nvidia's Huang knew this all along, making his anti-Larrabee rant less about bravado/marketing and more about plain incredulity.

Comments are closed.