
Summary:

Computers that can deliver an exaflop of performance — producing a billion billion calculations per second — aren’t the stuff of science fiction. But according to one researcher, an exascale supercomputer could require 7 GW of power. Clearly that needs to come down.


Computers that can deliver an exaflop of performance — producing a billion billion calculations per second — aren’t the stuff of science fiction. The supercomputer industry wants to hit that mark by 2017 (see Supercomputers and the Search for the Exascale Grail, GigaOM Pro, subscription required). Currently, one of the world’s fastest supercomputers, Jaguar, runs at 2.3 petaflops, or more than 2 million billion calculations per second, while your laptop likely processes around 2 billion calculations a second.
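To put those numbers side by side, here’s a rough back-of-the-envelope comparison in Python (a sketch using only the figures quoted in this post; the laptop number is a ballpark, not a benchmark):

    # Scale comparison using the figures cited above (assumptions, not measurements).
    EXAFLOP = 1e18    # one exaflop: a billion billion (10^18) calculations per second
    JAGUAR  = 2.3e15  # Jaguar: 2.3 petaflops
    LAPTOP  = 2e9     # a typical laptop: roughly 2 billion calculations per second

    print(f"An exascale machine would be ~{EXAFLOP / JAGUAR:,.0f}x faster than Jaguar")  # ~435x
    print(f"...and ~{EXAFLOP / LAPTOP:,.0f}x faster than a laptop")                       # ~500,000,000x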

The biggest hurdle for exascale computing continues to be power. A project manager at the Belgian research institute IMEC is quoted as saying, “Energy is number one. Right now we need 7,000 MW for exascale performance.” That’s 7 GW of power for a single exascale-capable computer; other experts put the figure closer to 1 GW.

As Stacey points out in her report, supercomputers are designed to run at optimal speeds, and like a race car, anything superfluous is stripped out for the sake of better performance. That means the energy-use concerns that have driven the construction of greener data centers have been mostly absent from the supercomputing race. The IMEC project manager says the goal is to cut that 7 GW down to 50 MW, a drop of more than two orders of magnitude.
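For a sense of what that target implies, here’s a simple efficiency calculation (again a sketch, using only the 7 GW and 50 MW figures quoted above):

    # Energy efficiency implied by the quoted power figures (back-of-the-envelope).
    EXAFLOP = 1e18   # target performance: 10^18 calculations per second
    TODAY_W = 7e9    # ~7,000 MW (7 GW), the figure quoted by IMEC
    GOAL_W  = 50e6   # the stated goal: 50 MW

    today_eff = EXAFLOP / TODAY_W  # ~143 million calculations per second per watt
    goal_eff  = EXAFLOP / GOAL_W   # 20 billion calculations per second per watt
    print(f"Required efficiency gain: ~{goal_eff / today_eff:.0f}x")  # ~140x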

IBM VP of Deep Computing David Turek told Stacey that building supercomputers at the exascale level requires a rethinking of their design, because simply making them bigger multiplies the inefficiencies. One way IBM is looking to solve this problem is by using more efficient chips for different workloads, including graphics processors from Nvidia, the Cell processor built by IBM, or even the ARM-based processors found inside cell phones.

More powerful supercomputers could be key to helping solve the world’s pressing problems, like climate change. James Hack, who heads up climate research at Oak Ridge National Laboratory and directs the National Center for Computational Sciences, once told me he thinks more powerful supercomputers for climate research will “improve the fidelity of global climate modeling simulations.”

There’s also a recent effort underway to share computing power for climate research. Both the Department of Energy and Google recently announced compute-sharing projects focused on climate change data. The DOE is donating space on two of the world’s supercomputers for dozens of projects focused on energy innovation. At the same time, Google is donating parallel processing power to help groups in developing countries build environmental maps based on its new Google Earth Engine tool.

For more research on supercomputing, check out GigaOM Pro (subscription required).

Image courtesy of Cray.


