A supercomputer that will likely be able to perform around 1 million billion calculations per second (a petaflop) will be solely dedicated to climate change research. This week supercomputer maker Cray said it won a $47 million contract with the Department of Energy (DOE) to provide next-generation supercomputing power to the National Oceanic and Atmospheric Administration (NOAA) and the Oak Ridge National Laboratory (ORNL). Cray says the supercomputer it will build — dubbed the Climate Modeling and Research System (CMRS) — will be the world’s most powerful high-performance computer dedicated to climate research.
For the nearly $50 million, which will be funded through the stimulus package, the labs will receive one of Cray’s XT6 supercomputers in the second half of 2010, followed by an even more powerful supercomputer code-named “Baker” in 2011, and more high-performance computing gear in 2012. Cray, which reported fairly weak first-quarter earnings earlier this month, says its XT6 was the first to break the petaflop barrier. While Cray didn’t say how fast the CMRS would be, The Register estimates that $47 million will buy the labs on the order of 1 petaflop.
Cray is no stranger to barrier-breaking machines — it built the Jaguar, the world’s fastest supercomputer, which runs at 2.3 petaflops, performing more than 2 million billion calculations a second (see Supercomputers and the Search for Exascale Grail, subscription required). The Jaguar is used by the Oak Ridge National Laboratory for researching various problems of world importance, including climate change, but the CMRS will be the first computer solely devoted to climate change research.
Supercomputers have been used to calculate, model and forecast weather patterns, ice melt, sea level rise and other climate-changing elements for years. But the faster and more powerful the computer, the more data it can crunch and the more accurate its predictions and forecasts can be. James Hack, who heads up climate research at ORNL and directs the National Center for Computational Sciences, hopes that the new climate supercomputer will “improve the fidelity of global climate modeling simulations.” The UN’s Intergovernmental Panel on Climate Change has recently come under increased scrutiny (warranted or not), and better modeling and prediction data could help shrink both the level of uncertainty and the public backlash.
And supercomputing is just getting faster and more powerful. As Stacey has explained on GigaOM Pro: “The supercomputing industry strives to triple performance every 10 years, which means the industry needs to deliver an exaflop of performance (a billion billion calculations per second) by 2017” (subscription required). Does the answer to the world’s warming problems lie in exascale computing? Given that humans will need to get on the path to reducing carbon emissions significantly by 2020, it probably does.
For more research on supercomputing, check out GigaOM Pro (subscription required).
Image courtesy of Cray.