
Summary:

A supercomputer that will likely be able to perform around 1 million billion calculations per second (a petaflop) will be solely dedicated to fighting climate change, and will be used by the National Oceanic and Atmospheric Administration and the Oak Ridge National Laboratory.

A supercomputer that will likely be able to perform around 1 million billion calculations per second (a petaflop) will be solely dedicated to climate change research. This week supercomputer maker Cray said it won a $47 million contract with the Department of Energy (DOE) to provide next-generation supercomputing power to the National Oceanic and Atmospheric Administration (NOAA) and the Oak Ridge National Laboratory (ORNL). Cray says the supercomputer it will build — dubbed the Climate Modeling and Research System (CMRS) — will be the world’s most powerful high-performance computer dedicated to climate research.

For close to $50 million, which will be funded through the stimulus package, the labs will receive one of Cray’s XT6 supercomputers in the second half of 2010, followed by an even more powerful supercomputer code-named “Baker” in 2011, and more high-performance computing gear in 2012. Cray, which reported pretty weak first-quarter earnings results earlier this month, says its XT6 was the first to break the petaflop barrier. While Cray didn’t say how fast the CMRS would be, the Register predicts that for $47 million, the labs will be able to buy on the order of 1 petaflop.
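As a back-of-envelope check on the Register’s figure, here is a minimal sketch of the price-performance it implies. (This is only a rough upper bound: the $47 million also covers hardware delivered through 2012, not just the initial machine.)

```python
# Rough price-performance implied by the Register's estimate:
# $47 million buying on the order of 1 petaflop (10**15 calculations/s).
CONTRACT_USD = 47e6
PETAFLOP = 1e15  # calculations per second
GIGAFLOP = 1e9

usd_per_gigaflop = CONTRACT_USD / (PETAFLOP / GIGAFLOP)
print(f"~${usd_per_gigaflop:.0f} per gigaflop")  # → ~$47 per gigaflop
```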

Cray is no stranger to barrier-breaking machines — it built the Jaguar, the world’s fastest supercomputer, which runs at 2.3 petaflops and is able to perform more than 2 million billion calculations a second (see Supercomputers and the Search for Exascale Grail, subscription required). The Jaguar is used by the Oak Ridge National Laboratory for researching various problems of world importance, including climate change, but the CMRS will be the most powerful computer solely devoted to climate change.

Supercomputers have been used for years to calculate, model and forecast weather patterns, ice melt, sea level rise and other climate-changing elements. But the faster and more powerful the computer, the more data it can crunch and the more accurate its predictions and forecasts can be. James Hack, who heads up climate research at ORNL and directs the National Center for Computational Sciences, hopes that the new climate supercomputer will “improve the fidelity of global climate modeling simulations.” The UN’s Intergovernmental Panel on Climate Change has recently come under increased scrutiny (warranted or not), and better modeling and prediction data could help shrink both the level of uncertainty and the public backlash.

And supercomputing is just getting faster and more powerful. As Stacey has explained on GigaOM Pro: “The supercomputing industry strives to triple performance every 18 months, which means the industry needs to deliver an exaflop of performance (a billion billion calculations per second) by 2017” (subscription required). Does the answer to the world’s warming problems lie in exascale computing? Given that humans will need to get on the path to reducing carbon emissions significantly by 2020, it probably does.
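One way to sanity-check the petaflop-to-exaflop timeline: going from a petaflop to an exaflop is a thousandfold jump, which in roughly a decade works out to performance tripling about every 18 months. A minimal sketch of that projection (the 2008 starting point — the year the petaflop barrier first fell — is an assumption):

```python
PETAFLOP = 1e15  # calculations per second

def projected_flops(start_year, start_flops, year, tripling_period_years=1.5):
    """Project performance assuming it triples every 18 months."""
    periods = (year - start_year) / tripling_period_years
    return start_flops * 3 ** periods

# Starting from ~1 petaflop in 2008:
for year in (2010, 2013, 2017):
    pf = projected_flops(2008, PETAFLOP, year) / PETAFLOP
    print(f"{year}: ~{pf:,.0f} petaflops")
# → roughly 4, 39 and 729 petaflops respectively,
#   i.e. within striking distance of an exaflop (1,000 petaflops) by 2017.
```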

For more research on supercomputing check out GigaOM Pro (subscription required):

Supercomputers and the Search for the Exascale Grail

Image courtesy of Cray.

By Katie Fehrenbacher

Comments

  1. … so they’ll be able to get their false results faster.
    Thanx to all taxpayer …

  2. Aztecwarrior25 Friday, May 21, 2010

    Let’s call it The ManBearPig 2000

  3. DocForesight Friday, May 21, 2010

    It all comes back to “GI-GO”. If the assumptions are faulty or the “unknown-unknowns” remain as unpredictable and elusive as they’ve been, then the UN IPCC climate models and reports – like AR4 (justifiably under scrutiny, scorn and ridicule for their use of “grey literature”) – will be of no greater use or value than what could be produced on your average laptop, saving millions of taxpayer dollars.

    Idea for ORNL: use your Cray to run tests or models on how an MSR or LFTR would function. You’d deal with something of real value: using Thorium as atomic fuel; recycling “used” uranium fuel; providing vast, affordable energy to grids of varying development and capacity; and providing fresh water through desalination. The gradual replacement of coal-fired power plants will allow the environment to take care of itself.

  4. Having worked in this field, I’m always amazed at how little critical analysis goes into these claims. Starting with ASCI Red at Sandia, the national labs have rewarded themselves with round-robin purchases of ever more expensive “super” computers, yet the nation has little to show for these expenditures of taxpayer dollars other than what the researchers have themselves described as “pretty pictures.”

    The binge started with an attempt to replace testing with simulation for the country’s nuclear stockpile. Among the problems were the unrealistic speed claims made for the computational hardware, which often neglected real-world issues like the disk-transfer rates required to store results. The machines, some provided by Cray, never were up to the challenge.

    Unfortunately, as long as our representatives are willing to load bills with earmarks to fund the latest “super” computer project in their home state, the race to the most “flops” will continue to be a race to nowhere.

  5. I concur, it’s now officially the ManBearPig.


Comments have been disabled for this post