Climate change research gets petascale supercomputer

ScienceRocks

Climate change research gets petascale supercomputer

1.5-petaflop IBM Yellowstone system runs 72,288 Intel Xeon cores

By Patrick Thibodeau

October 16, 2012 06:00 AM ET


Computerworld - Scientists studying Earth system processes, including climate change, are now working with one of the largest supercomputers on the planet.

The National Center for Atmospheric Research (NCAR) has begun using a 1.5 petaflop IBM system, called Yellowstone, that is among the top 20 supercomputers in the world, at least until the global rankings are updated next month.


For NCAR researchers it is an enormous leap in compute capability over the center's existing 77-teraflop supercomputer: Yellowstone is a 1,500-teraflop system, capable of 1.5 quadrillion calculations per second, roughly 20 times the peak performance of its predecessor.

The NCAR-Wyoming Supercomputing Center in Cheyenne, where this system is housed, says that with Yellowstone, it now has "the world's most powerful supercomputer dedicated to geosciences."


Along with climate change, the supercomputer will be used for a range of geoscience research, including severe weather, oceanography, air quality, geomagnetic storms, earthquakes and tsunamis, wildfires, subsurface water, and energy resources.

The supercomputer gives researchers new capabilities. They can run more experiments with increased complexity and at a higher resolution, according to interviews with researchers.

Scientists will be able to use the supercomputer to model the regional impacts of climate change. A model with a grid spacing of 100 km (62 miles) is considered coarse because each grid cell covers a large area. The new system may allow grid spacing as fine as 10 km (6.2 miles), giving scientists the ability to examine climate impacts in far greater detail.
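
To see why that jump in resolution is so demanding, here is a rough back-of-the-envelope sketch (assuming a simple uniform grid and the usual timestep argument, not NCAR's actual model setup):

```python
# Back-of-the-envelope only: why a 10 km grid costs far more than a 100 km grid.
# Assumes a uniform horizontal grid and a CFL-limited timestep; real climate
# models are more complicated, so treat these numbers as illustrative.

EARTH_SURFACE_KM2 = 510e6          # approximate surface area of Earth

def horizontal_cells(dx_km):
    """Approximate number of horizontal grid cells at spacing dx_km."""
    return EARTH_SURFACE_KM2 / dx_km ** 2

coarse_km, fine_km = 100.0, 10.0
cell_ratio = horizontal_cells(fine_km) / horizontal_cells(coarse_km)   # ~100x
cost_ratio = (coarse_km / fine_km) ** 3   # timestep also shrinks with dx, so ~1000x

print(f"cells at {coarse_km:.0f} km: {horizontal_cells(coarse_km):,.0f}")
print(f"cells at {fine_km:.0f} km:  {horizontal_cells(fine_km):,.0f}")
print(f"~{cell_ratio:.0f}x more cells, ~{cost_ratio:.0f}x more compute")
```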

People "want to know what [climate change] is going to do to precipitation in Spain or in Kansas," said Rich Loft, the director of technology development at the center.

Loft said they plan to give 11 research projects first crack at the machine "to try to do some breakthrough science straight away and try to shake the machine."

"We want to see what happens when users beat on it instead of just doing acceptance testing," said Loft.

Yellowstone is running in a new $70 million data center. The value of the supercomputer contract was put at $25 million to $35 million. The system spans 100 racks and has 72,288 compute cores on Intel Sandy Bridge Xeon processors.
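
For a rough sense of scale, here is some simple arithmetic on the figures quoted above (illustrative only; the split of peak performance across cores and racks is a back-of-the-envelope calculation, not an NCAR specification):

```python
# Simple arithmetic on the figures quoted above; illustrative only.
peak_flops = 1.5e15      # 1.5 petaflops peak
cores = 72_288
racks = 100

print(f"cores per rack:      {cores / racks:,.0f}")              # ~723
print(f"peak per core (GF):  {peak_flops / cores / 1e9:.1f}")    # ~20.7
```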

Scientists have been able to run some of their work on larger systems at other facilities, but there they compete for time with researchers from other fields.

Among the scientists who will be using the NCAR system is Marika Holland, whose research includes studying climate change in the polar regions. On earlier systems, the models run at "more of an approximation than we would like," she said.

As with regional precipitation, Holland said the higher resolutions enabled by the new system will allow her team to look explicitly at the influence of storms on Arctic sea ice, as well as ice reductions along the coast and coastal erosion.

Arctic sea ice extent set a new record minimum this year, 300,000 square miles below the previous satellite-era record of 1.61 million square miles set in September 2007, according to NASA figures released last month.

The loss of summer sea ice cover over the last several years "has been pretty extreme and more extreme than most of our climate models predict," said Holland.

The work accomplished by scientists through observation, theoretical studies and other scientific efforts builds knowledge that is incorporated into computer models, which then become better predictive tools, said Holland.

There is a lot of understanding and fundamental research that needs to go on, said Holland, "but we also need bigger computers."

Climate change research gets petascale supercomputer - Computerworld


Climate research just got a HUGE improvement in forecast power!!!!
 
Now, if only they could write a computer model worthy of that capability. I fear that is impossible however.
 
Granny says dem flyboys'll think up anything...
Turning PlayStations into power stations
January 28, 2010 - Air Force will link 2,000 PS3s to make a cheaper brand of supercomputer
Once thought to be just a part of home entertainment systems, Sony’s PlayStation 3 is proving itself to be more than just an online death-match machine. The console’s price-to-performance ratio inspired one Air Force research team to place an order for 1,700 of them to go with the 336 they already have. The brains behind the Air Force Research Laboratory in Rome, N.Y., are clustering the consoles, along with some off-the-shelf graphic processing units, to create a supercomputer nearly 100,000 times faster than high-end computer processors sold today. The research group was awarded a $2 million grant for the PlayStation 3 cluster.

Key to the whole idea is the console’s cell processor, which was designed to easily work in concert with other cell processors to combine processing power and has been critically acclaimed for its number crunching ability. This lets the researchers leverage power toward running such applications as Back Projection Synthetic Aperture Radar Imager formation, high definition video image processing, and Neuromorphic Computing, which mimics human nervous systems. “With Neuromorphic Computing, as an example, we will broadcast an image to all PS3s and ask if it matches an image it has in its hard drive,” said Dr. Richard Linderman, the senior scientist for Advanced Computing Architectures at the laboratory.
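
The broadcast-and-match pattern Linderman describes can be sketched in a few lines. Everything below, including the function names, the similarity measure, and the data layout, is hypothetical; it just illustrates the idea of sending one query image to every node and letting each node search its own store:

```python
# Hypothetical sketch of the broadcast-and-match idea described above: a head
# node sends one query image to every PS3 node, each node scans the images on
# its own disk for the best match, and the head node keeps the overall winner.
# Function names, the similarity measure, and the data layout are all made up
# for illustration; this is not the AFRL code.

import numpy as np

def similarity(a, b):
    """Toy similarity score: negative mean squared error (higher is better)."""
    return -float(np.mean((a.astype(float) - b.astype(float)) ** 2))

def node_best_match(query, local_images):
    """Work done on each node: compare the query against its local image store."""
    scored = [(name, similarity(query, img)) for name, img in local_images.items()]
    return max(scored, key=lambda pair: pair[1])

def cluster_match(query, nodes):
    """Head node: 'broadcast' the query to every node, keep the best reply."""
    results = [node_best_match(query, images) for images in nodes]
    return max(results, key=lambda pair: pair[1])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    query = rng.integers(0, 256, (64, 64))
    # Three simulated nodes, each holding five locally stored images.
    nodes = [{f"node{i}_img{j}": rng.integers(0, 256, (64, 64)) for j in range(5)}
             for i in range(3)]
    print(cluster_match(query, nodes))
```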

Mimicking humans will help the machine recognize images for target recognition, said Mark Barnell, the high performance computing director for the laboratory's information directorate. "Humans can routinely do these things, but a computer struggles to do it," Barnell said. "In a general sense, we are interested in making it autonomous." He added, however, that "this is not the Holy Grail of supercomputers." Because the way the consoles connect online or to each other is relatively slow compared with regular supercomputing interconnects, the group is limited in what types of programs can be run efficiently on the PS3 cluster, which they call the 500 TeraFLOPS Heterogeneous Cluster. Linderman said the entire system uses mostly off-the-shelf components and will be a relatively cheap, green machine.

Keeping with the off-the-shelf mentality, the Air Force is using metal shelves found at most department stores to house the PS3 cluster. They are also using Linux, a free, open source operating system. The system will draw 300 to 320 kilowatts at full bore, and about 10 percent to 30 percent of that in standby, whereas most supercomputers use around 5 megawatts, Linderman said. Much of the time, however, the cluster will run only the nodes it needs, and it will be turned off when not in use. "Supercomputers used to be unique with unique processors," Linderman said. "By taking advantage of a growing market, the gaming market, we are bringing the price performance to just $2 to $3 per gigaFLOPS." As a point of reference, 10 years ago the University of Kentucky claimed the record for the first machine to deliver 1 billion floating point operations per second, or a gigaFLOPS, for less than $1,000; its cost per gigaFLOPS was $640.
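
As a quick cross-check of those price-performance numbers (assuming, purely for illustration, that the whole $2 million grant bought the 500-teraFLOPS cluster; the actual hardware cost was presumably lower, consistent with the $2 to $3 figure quoted):

```python
# Rough cross-check of the price-performance figures quoted above.
# Assumes the entire $2 million grant went to the 500-teraFLOPS cluster,
# which overstates hardware cost, so this lands a bit above $2-$3/gigaFLOPS.
grant_dollars = 2_000_000
cluster_gigaflops = 500_000            # 500 teraFLOPS

dollars_per_gflops = grant_dollars / cluster_gigaflops
print(f"~${dollars_per_gflops:.0f} per gigaFLOPS")                   # ~$4

# Versus the cited University of Kentucky milestone of about $640/gigaFLOPS:
print(f"~{640 / dollars_per_gflops:.0f}x cheaper per gigaFLOPS")     # ~160x
```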

MORE
 
I'll settle for running Protein Folding on my PS3 at home. Distributed computing.

You can also use the spare CPU cycles on any computer at home to run a variety of projects -- SETI, prime number searches, climate simulations, take your pick.

Choosing BOINC projects
 
