
Sharing Supercomputing for Energy Innovation


Could the equivalent computing power of 135,000 laptops, or 1.7 billion processor hours, put a dent in the fight against climate change? That's what the Department of Energy is hoping. This week, the DOE unveiled a program to donate computing time on two supercomputers (some of the world's fastest and most powerful) to dozens of projects working on energy innovation.
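To get a feel for the scale of the donation, here is a back-of-the-envelope sketch of the laptop comparison using only the figures cited above (1.7 billion processor hours spread across 135,000 laptop-equivalents); the variable names are just for illustration:

```python
# Back-of-the-envelope math on the DOE's donated compute,
# using the figures cited in the post.
total_processor_hours = 1.7e9       # 1.7 billion processor hours
laptop_equivalents = 135_000        # "computing power of 135,000 laptops"

# How long each of those laptops would have to run nonstop
# to deliver its share of the donated hours.
hours_per_laptop = total_processor_hours / laptop_equivalents
years_per_laptop = hours_per_laptop / (24 * 365)

print(f"{hours_per_laptop:,.0f} hours per laptop")   # roughly 12,600 hours
print(f"{years_per_laptop:.1f} years of nonstop compute per laptop")
```

In other words, each of those 135,000 laptops would need to crunch numbers around the clock for well over a year to match the allocation.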

Projects that will utilize the supercomputers come from industry — like Boeing (s ba) and GE (s GE) — as well as universities like Iowa State University and the University of Utah, and agencies like the National Oceanic and Atmospheric Administration (NOAA). DOE Secretary Steven Chu announced 57 research projects in total that would benefit from the program, including projects that look at nuclear technology, solar innovation, battery prototypes, and the use of hydrogen as fuel. (There were quite a few non-energy projects, too, which will focus on solving problems in the fields of health and natural disasters.)

Boeing will use its 45 million supercomputing hours to crunch numbers for developing quieter and safer air-flow technologies, which could lead to better wind turbines and aircraft landing gear. GE will use its 20 million supercomputing hours for a project that looks at more energy-efficient propulsion systems.

The researchers will use two supercomputers: “Jaguar” (a Cray XT5), which is housed at the Oak Ridge National Laboratory, and “Intrepid,” an IBM (s ibm) machine at the Argonne National Laboratory. Jaguar is one of the world’s fastest supercomputers, running at 2.3 petaflops — able to perform more than 2 million billion calculations a second (see Supercomputers and the Search for Exascale Grail on GigaOM Pro, subscription required).

In a separate project, supercomputer maker Cray (s cray) is also building more powerful computers for crunching data on climate change specifically, and this spring it won a contract with the DOE to provide next-generation supercomputing power to NOAA and Oak Ridge. Cray plans on providing a next-gen XT6 supercomputer in the second half of 2010, followed by an even more powerful supercomputer code-named “Baker” in 2011, and more high-performance computing gear in 2012. The XT6 will likely be able to perform around 1 million billion calculations in a second (a petaflop). ZOMG.

Supercomputers have been used to calculate, model and forecast weather patterns, ice melt, sea level rise and other climate-changing elements for years. But the faster and more powerful the computer, the more data it can crunch and the more accurate the predictions and forecasts can be. James Hack, who heads up climate research at ORNL and directs the National Center for Computational Sciences, told me he hopes that more powerful supercomputers for climate research will “improve the fidelity of global climate modeling simulations.”

Supercomputing is just getting faster and more powerful. As Stacey has explained on GigaOM Pro: “The supercomputing industry strives to triple performance every 10 years, which means the industry needs to deliver an exaflop of performance (a billion billion calculations per second) by 2017” (subscription required). Does the answer to the world’s warming problems lie in exascale computing? Given that humans will need to get on the path to reducing carbon emissions significantly by 2020, it very well could.
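The jump from today's machines to exascale is easy to underestimate, so here is a quick sketch of the unit arithmetic behind the figures above (a petaflop is a million billion calculations per second, an exaflop a billion billion); Jaguar's 2.3-petaflop rating is the one cited earlier in the post:

```python
# The units behind the post's figures.
PETAFLOP = 1e15   # a million billion calculations per second
EXAFLOP = 1e18    # a billion billion calculations per second

# Jaguar, the Cray XT5 at Oak Ridge, is rated at 2.3 petaflops.
jaguar_flops = 2.3 * PETAFLOP

# How big a leap exascale represents over today's fastest machine.
speedup_needed = EXAFLOP / jaguar_flops

print(f"Exaflop is {EXAFLOP / PETAFLOP:,.0f}x a petaflop")
print(f"Exascale would be roughly {speedup_needed:,.0f}x Jaguar")
```

That is a three-orders-of-magnitude gap between petascale and exascale, and still a roughly 400-fold leap beyond even Jaguar.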

For more research on supercomputing, check out GigaOM Pro (subscription required).