I See Your Petaflop and Raise You 19 More



Just a year after the world's fastest supercomputers broke the petaflop barrier by performing one thousand trillion calculations per second, nuclear physicists are planning a 20-petaflop machine in conjunction with IBM.

Nicknamed Sequoia, the Department of Energy computer will most likely be the most powerful in the world when it is released. If it were running today, it'd be more than 10 times faster than any machine in existence.

When it's installed at Lawrence Livermore National Laboratory in 2012, it could make new kinds of calculations possible, but initially, that power will be primarily used to simulate nuclear explosions, as many of its supercomputer forebears have done.

All that power will help scientists understand the uncertainty inherent in their models of the world.

"Every time you do predictive science, the next question is: How confident are you in that prediction? It turns out that's a very easy question to ask and a very profound question to try to answer," said computer scientist Mark Seager of Lawrence Livermore National Laboratory. "The way that we do that is by running a whole bunch of simulations. Instead of just one simulation, you do 4,000."

The progress they make in quantifying the range of error in their models could have far-reaching impacts across the branches of science that lean heavily on predictive models, such as climate modeling or simulations of the protein interactions inside cells.

Ultrafast computers are integral to simulating those kinds of complex systems. They keep improving thanks to the well-known, though often questioned, Moore's Law, which has let chipmakers pack roughly twice as many transistors into the same amount of space about every two years. More transistors have meant more flops (floating-point operations per second), a common measure of computing speed. In 1996, Sandia's ASCI Red became the first teraflop computer, and in December 2000, Wired called 100-teraflop performance "unheard of." Today, multiple computers have broken the petaflop (1,000-teraflop) barrier.
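That pace is easy to sanity-check with back-of-the-envelope arithmetic: the factor-of-1,000 jump from teraflop to petaflop takes about ten doublings, which at one doubling every two years would be roughly two decades of chip improvement alone. That the jump happened in about a decade reflects machines also growing outward, with far more chips and cores per system. The sketch below just does that arithmetic; the two-year cadence is the rough figure cited above, not a precise constant.

import math

tera, peta = 1e12, 1e15            # flops
years_per_doubling = 2.0           # rough Moore's-law cadence cited above

speedup = peta / tera              # 1,000x from teraflop to petaflop
doublings = math.log2(speedup)     # ~10 doublings
print(f"{speedup:.0f}x speed-up = {doublings:.1f} doublings "
      f"(~{doublings * years_per_doubling:.0f} years from chip scaling alone)")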

By almost any standard, the new computer will be staggering. It will have 1.6 million processing cores, 1.6 petabytes of memory, 96 racks and 98,304 computing nodes. Yet it will occupy a much smaller footprint, 3,400 square feet, than the current fastest computer's 5,200 square feet. And it will be far more energy efficient than its predecessors, drawing only 6 megawatts of power. That's about as much as 500 American homes use.
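Dividing those headline figures out gives a rough sense of the machine's shape: about 1,024 nodes per rack, roughly 16 cores per node, and on the order of 16 gigabytes of memory per node. These per-node numbers are simple division on the published totals, not an official configuration.

# Back-of-the-envelope breakdown of the published totals; per-node figures
# come from simple division, not from an official spec sheet.
cores = 1.6e6            # processing cores
memory_bytes = 1.6e15    # 1.6 petabytes
nodes = 98_304
racks = 96

print(f"nodes per rack:  {nodes / racks:.0f}")                    # ~1,024
print(f"cores per node:  {cores / nodes:.0f}")                    # ~16
print(f"memory per node: {memory_bytes / nodes / 1e9:.0f} GB")    # ~16 GB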

Since the Comprehensive Nuclear-Test-Ban Treaty was signed in 1996, U.S. scientists have sought ways to continue to test and develop weapons without actually detonating them. That's where the big-rig computers come into play: They can simulate the complex physics involved when, say, billions of hydrogen atoms fuse together into helium, releasing enormous amounts of energy.

Of course, those computers could be used to simulate other things, like the world's climate. Livermore and other Department of Energy labs have allowed their supercomputers to be used for a wide variety of sciences. As bigger, faster machines are built to simulate weapons, the previous machines are turned toward other interests.

Initially, Sequoia will be dedicated to National Nuclear Security Administration work. That means some of the exciting weather science will have to wait. With 20 petaflops of computing power, meteorologists could predict local weather down to the 100-meter range. For an event like a tornado, that could mean predicting the path a twister takes through a town, allowing for targeted evacuations that save lives.
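A rough scaling argument shows why that kind of resolution takes petaflops: halving the grid spacing of a 3-D atmospheric model multiplies the number of cells by eight, and the time step usually has to shrink in proportion, so the cost grows roughly with the fourth power of the refinement. The sketch below applies that rule of thumb to a jump from kilometer-scale grids, typical of regional forecasts, down to 100 meters; both the scaling exponent and the starting resolution are simplifying assumptions.

def relative_cost(coarse_dx_m, fine_dx_m):
    """Rough cost ratio for refining a 3-D weather grid: cell count grows
    as dx**-3 and the number of time steps as dx**-1, so cost ~ dx**-4."""
    return (coarse_dx_m / fine_dx_m) ** 4

# Refining from ~1 km grid spacing to 100 m:
print(f"~{relative_cost(1000.0, 100.0):,.0f}x more compute per forecast")  # ~10,000x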

It's performance like that, Seager argues, that's changing the way that science is done, making simulation another branch of the scientific method along with theory and experiment.

"Scientific simulation is the telescope of the mind," Seeger said. "We work with highly non-linear systems that have very complicated mathematics and models. It's just too difficult to hold all that in our brain and analyze it. So by simulating them, we're extending our brains' capabilities."
