
Remember when CPU clock speeds were the driving force behind new computers? Going from a 500 MHz to a 1 GHz and then a 2 GHz machine meant noticeable improvements. Then chip vendors started adding more cores. But for the way consumers compute today, it’s not about the CPU anymore.

It’s all about graphics processors. Thanks to today’s visually intensive style of computing, a good GPU can do more for the user experience than a fast CPU can. In the data center, certain tasks are moving from commodity CPU boxes to GPUs, which means that over the next year or two, more GPUs will be sold for corporate computing.

That’s why Intel is pushing graphics chips such as Larrabee, while AMD is set to unveil integrated chipsets that combine CPUs with GPUs, the fruit of its 2006 acquisition of ATI. All of this was driven home for me during a trip to Nvidia a few weeks ago, where I saw, side by side, the difference between a computer with a super-fast CPU and a computer with a slower CPU but a high-end GPU.

Of course, the demo was optimized for graphics-intensive programs (I didn’t see any spreadsheets), but the movies, games and transcoding were all impressive, and closer to what I actually use my laptop for these days. And then the Nvidia guys dropped a bomb on me.

All PDF documents now run through the graphics processor, they told me, as do Google Earth and a host of other web applications. The same goes for PowerPoint slides, Word and other parts of Microsoft Office, starting with Office 2007. On Macs, the visual interface to the file system is handled by the GPU, which makes flipping through thousands of photos and movies much easier. On the consumer side, the rise of such graphical interfaces helps people visually navigate ever-increasing amounts of information.

Nvidia and AMD probably have the most to gain from this shift on the consumer side, but Intel won’t be sitting out. It’s on the enterprise side, however, that a GPU might offer far more value when it comes to rapid information processing. GPUs are good for applications that need to crunch a lot of data in parallel; they’re not good for step-by-step processes that require a decision at each step.
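To make that parallel-versus-serial split concrete, here is a minimal sketch in CUDA, Nvidia’s C-based programming model for running general code on GPU cores. Everything in it (the array size, thread counts and the scaling operation) is my own illustration, not something from the demo:

    // scale.cu -- a trivially data-parallel job, the kind a GPU is built for.
    // Each of roughly a million elements is independent, so each GPU thread
    // handles exactly one and never waits on a neighbor's decision.
    #include <cstdio>
    #include <cuda_runtime.h>

    __global__ void scale(float *data, float factor, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;  // this thread's element
        if (i < n) data[i] *= factor;
    }

    int main() {
        const int n = 1 << 20;                          // ~1 million floats
        float *d_data;
        cudaMalloc((void **)&d_data, n * sizeof(float));
        cudaMemset(d_data, 0, n * sizeof(float));

        int threads = 256;
        int blocks = (n + threads - 1) / threads;       // enough blocks to cover n
        scale<<<blocks, threads>>>(d_data, 2.0f, n);    // launch ~1M threads at once
        cudaDeviceSynchronize();

        cudaFree(d_data);
        printf("done\n");
        return 0;
    }

Flip the problem around to a loop in which step N needs the result of step N-1 before it can decide what to do, and all but one of those thousands of threads would sit idle. That kind of work stays on the CPU.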

So Nvidia doesn’t actually want to kill CPUs so much as have its GPUs shoulder some of the load in corporate data centers that provide transcoding services and run database queries and Monte Carlo simulations. This heterogeneous computing environment will be more expensive than Google-like x86 server farms, but certain industries have already shown they will pay for specialized processing. Financial institutions, for example, have deployed servers built on Sun’s Niagara chips or Azul Systems’ many-core boxes, paying a premium for faster high-end computing.
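Monte Carlo simulation is the textbook case for that offload, because every trial is independent: each GPU thread can run its own batch of trials, and only the final tally needs coordinating. The sketch below, my own illustration rather than anything Nvidia showed me, uses the company’s CURAND library for per-thread random numbers and estimates pi by the classic dartboard method:

    // pi.cu -- hypothetical example: estimate pi by throwing random darts
    // at the unit square and counting those that land in the quarter circle.
    #include <cstdio>
    #include <cuda_runtime.h>
    #include <curand_kernel.h>

    __global__ void pi_trials(unsigned long long *hits, int per_thread,
                              unsigned long long seed) {
        int tid = blockIdx.x * blockDim.x + threadIdx.x;
        curandState state;
        curand_init(seed, tid, 0, &state);          // independent RNG per thread

        unsigned long long local = 0;
        for (int t = 0; t < per_thread; ++t) {
            float x = curand_uniform(&state);
            float y = curand_uniform(&state);
            if (x * x + y * y <= 1.0f) ++local;     // dart landed inside the arc
        }
        atomicAdd(hits, local);                     // fold per-thread counts together
    }

    int main() {
        const int threads = 256, blocks = 256, per_thread = 4096;
        unsigned long long *d_hits, h_hits = 0;
        cudaMalloc((void **)&d_hits, sizeof(unsigned long long));
        cudaMemcpy(d_hits, &h_hits, sizeof(h_hits), cudaMemcpyHostToDevice);

        pi_trials<<<blocks, threads>>>(d_hits, per_thread, 1234ULL);
        cudaMemcpy(&h_hits, d_hits, sizeof(h_hits), cudaMemcpyDeviceToHost);
        cudaFree(d_hits);

        double total = (double)threads * blocks * per_thread;
        printf("pi is roughly %f\n", 4.0 * h_hits / total);
        return 0;
    }

A bank pricing derivatives follows the same pattern, simulating millions of independent price paths, which is exactly the sort of processing those financial institutions pay a premium to speed up.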

As large content vendors and even carriers try to deliver media in multiple formats to televisions, personal computers and mobile phones over IP networks, they’ll either have to pay more to store those multiple versions or pay for real-time transcoding, whether in the data center or on the network. The growing delivery of visual media over IP networks and the growing amount of electronic data stored in corporate databases both represent an opportunity that could move GPUs out of their graphics niche.
