
Sunday, November 11, 2007

Supercomputers Make Safer Nuclear Reactors




Rensselaer Polytechnic Institute is leading a $3 million research project that will pair two of the world's most powerful supercomputers to boost the safety and reliability of next-generation nuclear power reactors.



The three-year project, funded by the U.S. Department of Energy, will call upon a diverse team of researchers and institutions to create highly detailed computer models of a new proposed type of nuclear reactor. These models could play a key role for the future development of the new reactors, which meet stringent safety and nonproliferation criteria, can burn long-lived and highly radioactive materials, and can operate over a long time without using new fuel.




Running simulations of such a vast virtual model, where scientists can watch the reactor system perform as a whole or zoom in to focus on the interaction of individual molecules, requires unprecedented computing power. To undertake such a task, researchers will use both Rensselaer's Computational Center for Nanotechnology Innovations (CCNI), the world's seventh most powerful supercomputer, and Brookhaven National Laboratory's New York Blue, the world's fifth most powerful supercomputer.


The research program, titled "Deployment of a Suite of High Performance Computational Tools for Multiscale Multiphysics Simulation of Generation-IV Reactors," is unique in scale as well as its geographic concentration. Along with Rensselaer and Brookhaven, the partnership includes researchers from Columbia University and the State University of New York at Stony Brook, all New York state-based institutions. Another Empire State connection is computer giant IBM, headquartered in New York and the maker of Blue Gene supercomputers. The company developed, designed, and built both CCNI and New York Blue.


Rensselaer nuclear engineering and engineering physics professor Michael Podowski, a world-renowned nuclear engineering and multiphase science and technology expert who also heads Rensselaer's Interdisciplinary Center for Multiphase Research, is project director and principal investigator of the new study.


Podowski said nuclear power is likely to gain traction and become more widespread in the coming decades, as nations seek ways to meet their growing energy needs without increasing their greenhouse gas emissions. Nuclear reactors produce no carbon dioxide, Podowski said, which gives this energy source an advantage over coal and other fossil fuels for large-scale electricity production.


The main challenge of nuclear power plants, he said, is that they produce radioactive waste as a byproduct of energy production. But several governments around the world, including the United States, are working tirelessly with universities, research consortia, and the private sector to design and develop new, so-called "fourth generation" nuclear reactors that are safer and produce less waste. These reactors will be necessary in the coming decades as nuclear reactors currently in use reach the end of their life cycle and are gradually decommissioned.


The type of reactor that Podowski's team will be modeling, a sodium-cooled fast reactor, or SFR, is among the most promising of these next-generation designs. The primary advantage of the SFR is its ability to burn highly radioactive nuclear materials, which today's reactors cannot do, Podowski said.


Whereas current reactors run on uranium, SFRs can also run on fuel that is a mixture of uranium and plutonium. In particular, SFRs will be able to burn both weapons-grade plutonium and pre-existing nuclear waste, Podowski said. Thanks to their higher operating temperatures, SFRs will also produce electricity at higher efficiency than current nuclear reactors.
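As a rough illustration of why higher operating temperatures mean higher efficiency, the ideal (Carnot) limit on converting heat into electricity grows as the coolant outlet temperature rises relative to the heat-rejection temperature. The sketch below uses assumed, ballpark temperatures chosen only for illustration; they are not figures from the project.

# Rough sketch of the ideal (Carnot) efficiency limit, 1 - T_cold / T_hot,
# evaluated at assumed, illustrative coolant temperatures (not project data).

def carnot_limit(t_hot_c, t_cold_c):
    """Ideal heat-engine efficiency limit; temperatures given in Celsius."""
    return 1.0 - (t_cold_c + 273.15) / (t_hot_c + 273.15)

# Assumed outlet temperatures: roughly 550 C for a sodium-cooled fast reactor,
# roughly 300 C for a typical water-cooled reactor; 30 C heat rejection.
print(f"SFR ideal limit: {carnot_limit(550, 30):.0%}")   # about 63%
print(f"LWR ideal limit: {carnot_limit(300, 30):.0%}")   # about 47%

Real plants convert far less than the Carnot limit, but the gap between the two cases shows why a hotter coolant gives the efficiency advantage described above.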


So along with producing less toxic waste, SFRs should be able to actively help reduce the amount of existing radioactive materials by burning already-spent nuclear waste, he said. SFRs also offer a viable, productive way to start getting rid of the world's stockpile of weapons-grade nuclear fuel.


"The idea is to design reactors that can use this material and that are safe," Podowski said. "With this project, we are trying to improve the understanding of the physics of the system in order to provide the necessary advancements for the design of new, safer, and better reactors."


To expedite this understanding, Podowski's team will construct an incredibly detailed computer model of an SFR. The model will allow researchers to zoom in and watch as individual molecules of fission gas and fuel material interact with other molecules inside the reactor, or zoom out to simulate and test the behavior of the reactor as a whole. Creating such a model, not to mention running hundreds or thousands of simulations with slightly modified models and conditions, requires a tremendous amount of computing power and would not be possible without the help of supercomputers, Podowski said.


In order to construct the model and run these massive simulations, Podowski's team will develop and deploy a suite of powerful, high-performance software tools capable of performing such a task. Since no one computer code or technology is robust enough to model the wide variety of systems that comprise an SFR, the team will use different computer codes for different parts of the model and then develop new ways of linking those differently coded segments together into a single, cohesive, seamless package.
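To make the idea of linking separately developed codes concrete, here is a minimal, hypothetical sketch of one common coupling pattern: each single-physics code advances one time step and then passes its result to its neighbor through a thin interface. The class names, numbers, and physics stand-ins are invented for illustration and are not the project's actual software.

# Hypothetical coupling sketch: two single-physics "codes" advance in lockstep
# and exchange interface data once per time step. All names are invented.

class ThermalHydraulicsCode:
    def __init__(self):
        self.wall_temperature = 600.0            # degrees C, illustrative start

    def advance(self, dt, heat_flux):
        # Stand-in for a real coolant-flow solver.
        self.wall_temperature += 1.0e-6 * heat_flux * dt
        return self.wall_temperature

class FuelPerformanceCode:
    def advance(self, dt, wall_temperature):
        # Stand-in for a real fuel-behavior solver; returns a heat flux in W/m^2.
        return 1.0e5 + 50.0 * (wall_temperature - 600.0)

def run_coupled(steps=10, dt=0.01):
    th, fuel = ThermalHydraulicsCode(), FuelPerformanceCode()
    heat_flux = 1.0e5                            # initial guess at the interface
    for _ in range(steps):
        wall_t = th.advance(dt, heat_flux)       # code A consumes code B's output...
        heat_flux = fuel.advance(dt, wall_t)     # ...and vice versa, every step
    return wall_t, heat_flux

print(run_coupled())

In a production setting the exchange would run across thousands of processors and go through a dedicated coupling layer, but the per-time-step handshake is the same basic idea.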


The researchers will use simulations to study fuel performance, local core degradation, fuel particle transport, and several other aspects of the SFRs. By better understanding how design and operational issues will affect the reactor at different stages in its life cycle, Podowski said, the new study will help to dramatically improve the design and safety of SFRs long before the first physical prototype is ever built.


"Nuclear reactors are safe, but nothing is perfect," Podowski said. "So the issue is to anticipate what could happen, understand how it could happen, and then take actions to both prevent it from happening and, in the extremely unlikely instance of an accident, be able to mitigate the consequences."


Podowski will lead a team of more than 10 researchers on the three-year project. Rensselaer associate professor Kenneth Jansen, assistant professor Li Liu, and research assistant professor Steven Antal - all of the Department of Mechanical, Aerospace, and Nuclear Engineering - are listed as co-PIs and will contribute to the study. Podowski said he also expects to hire a postdoctoral researcher and at least three doctoral students to work on the project.


The rest of the team includes James Glimm from Stony Brook University; David Keyes from Columbia University; as well as Lap Cheng and Roman Samulyak from Brookhaven National Laboratory.


Supercomputer


With 4700 gigaflops and 950 processors, Horseshoe is one of the world's most powerful computers. Developed at the Department of Mathematics and Computer Science, the supercomputer enables researchers at the University to carry out complex calculations.



In just one week, the supercomputer can conduct calculations and simulations that would take a year on a normal computer. Few researchers in the world have access to this type of computer, giving Danish researchers a rare opportunity to develop world-class expertise.
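Taking the figures quoted here at face value, the week-versus-year comparison implies roughly a fifty-fold speedup, and 4,700 gigaflops spread across 950 processors works out to about 5 gigaflops per processor. A quick back-of-the-envelope check:

# Back-of-the-envelope check of the figures quoted above.
total_gflops = 4700
processors = 950
per_processor = total_gflops / processors             # about 4.9 gigaflops each

days_on_cluster = 7                                   # "one week"
days_on_desktop = 365                                 # "a year on a normal computer"
implied_speedup = days_on_desktop / days_on_cluster   # about 52x

print(f"{per_processor:.1f} gigaflops per processor, roughly {implied_speedup:.0f}x speedup")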



Using the supercomputer to create complex simulations, researchers at the University have studied the interaction of fat and protein in living organisms' cell membranes. This knowledge enables our researchers to contribute to a better understanding of several diseases.



Horseshoe is the result of advanced research in computer technology. The supercomputer is built from standard computer components available in any high-street shop. By linking the many components together, researchers created a cluster with significant capacity. The University's research team is one of few in the world to have mastered this technology. Computer science students gain valuable knowledge about cluster computing, an area in increasing demand.
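The cluster approach described above splits one large job across many ordinary machines cooperating over a network. As a minimal illustration of the pattern (not Horseshoe's actual software stack), the sketch below uses the widely available mpi4py library to divide a sum across all participating processes and combine the partial results:

# Minimal cluster-computing sketch using MPI via mpi4py; illustrative only,
# not the software used on Horseshoe. Run with, e.g.:
#   mpirun -n 4 python sum_cluster.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()        # this process's id within the cluster job
size = comm.Get_size()        # total number of cooperating processes

n = 10_000_000
local_sum = sum(range(rank, n, size))                # each process sums its own slice
total = comm.reduce(local_sum, op=MPI.SUM, root=0)   # partial sums combined on process 0

if rank == 0:
    print(f"sum of 0..{n - 1} computed by {size} processes: {total}")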



A supercomputer is a computer that is considered, or was considered at the time of its introduction, to be at the frontline in terms of processing capacity, particularly speed of calculation.



Supercomputer challenges and technologies


A supercomputer generates large amounts of heat and must be cooled. Cooling most supercomputers is a major HVAC problem.
Information cannot move faster than the speed of light between two parts of a supercomputer. For this reason, a supercomputer that is many meters across must have latencies between its components measured at least in the tens of nanoseconds. Seymour Cray's supercomputer designs attempted to keep cable runs as short as possible for this reason: hence the cylindrical shape of his Cray range of computers. In modern supercomputers built of many conventional CPUs running in parallel, latencies of 1-5 microseconds to send a message between CPUs are typical.
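That speed-of-light constraint is easy to quantify: light covers roughly 0.3 meters per nanosecond, so a signal crossing a machine tens of meters across needs tens of nanoseconds at a minimum, and real cables carry signals somewhat slower than light in vacuum. A quick worked check:

# One-way light-travel time across a machine-room-sized computer.
SPEED_OF_LIGHT = 3.0e8        # meters per second, in vacuum

for distance_m in (1, 10, 30):
    latency_ns = distance_m / SPEED_OF_LIGHT * 1e9
    print(f"{distance_m:>2} m  ->  at least {latency_ns:.1f} ns")
# 10 m -> about 33 ns, hence latencies measured at least in the tens of nanoseconds.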
Supercomputers consume and produce massive amounts of data in a very short period of time. According to Ken Batcher, "A supercomputer is a device for turning compute-bound problems into I/O-bound problems." Much work on external storage bandwidth is needed to ensure that this information can be transferred quickly and stored/retrieved correctly.

Technologies developed for supercomputers include:



- Vector processing
- Liquid cooling
- Non-Uniform Memory Access (NUMA)
- Striped disks (the first instance of what was later called RAID)
- Parallel filesystems



