Simulating nuclear catastrophe: Virtual practice for real disasters

[Image: RPI doctoral student Yiming Gao wears a motion capture suit as an array of 12 3D motion capture cameras with infrared LEDs captures his movement, which is displayed on a computer screen at the RPI Nuclear Engineering Lab on Thursday, March 8, 2012, in Troy, N.Y. Members of the Rensselaer Radiation Measurement & Dosimetry Group have been working on the virtual reality and motion capture systems for use with radiation dose analysis. (Paul Buckowski / Times Union)]

RPI research centers on how to keep repair crews safe in a nuclear catastrophe

TROY — When the Fukushima nuclear plant in Japan began to melt down last year after being struck by a tsunami, technicians entered damaged radioactive areas to attempt repairs without knowing exactly what they would face.

But what if those workers had a virtual reality system based on the plant, including areas that had become dangerously radioactive, and could have first practiced repairs in safety before exposing themselves to potentially lethal missteps?

Such a system is the goal of research at Rensselaer Polytechnic Institute. Progress is being watched with interest by an industry group representing nuclear plant owners, the Electric Power Research Institute.

“The nuclear industry is catching up with the use of virtual reality, which can be used to visualize an environment, both during a crisis or during normal operations,” said George Xu, head of RPI’s Radiation Measurement & Dosimetry Group.

As the Fukushima plant spiraled toward meltdown, some workers volunteered to enter damaged areas to make critical repairs without knowing how much radiation they might be exposed to.

“Some of these workers went in prepared to die. It was a very brave act,” said Xu.

As Xu watched the disaster unfold, he thought about using virtual reality to recreate the specific layout of the plant and its emerging leaks of radiation.

That simulation could then be coupled with avatars linked to actual human subjects, who would move through the virtual plant and determine which actions could be taken safely — and which could not.

“This technology is certainly of interest to us,” said Brian Schimmoller, a spokesman on nuclear issues with EPRI, based in Washington, D.C. “It would be a good potential application of advanced technology.”

Using virtual reality to practice for risky situations is much like using the simulators that train pilots or let surgeons rehearse complicated procedures beforehand.

Within a week of the Fukushima disaster, Xu and a graduate student, Justin Vazquez, had created a virtual model of the plant.

Since then, they have developed an avatar — called a “phantom human” — that is programmed to mimic the physical impact of radiation on human organs and tissues.
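The article does not publish the phantom's internals, but the general idea behind an organ-level dose model can be sketched: track an equivalent dose per organ, then combine them into a single effective dose with tissue weighting factors. The organ names and weights below are illustrative placeholders (not the RPI phantom's data or the full ICRP weighting table), assumed only for the example.

```python
# Toy sketch of an organ-level dose model like the "phantom human"
# described above. Weights are HYPOTHETICAL stand-ins chosen to sum
# to 1.0; a real model would use standardized tissue weighting factors.
TISSUE_WEIGHTS = {
    "lung": 0.30,
    "stomach": 0.30,
    "thyroid": 0.10,
    "skin": 0.05,
    "remainder": 0.25,
}

def effective_dose(organ_dose_msv):
    """Combine per-organ equivalent doses (mSv) into one weighted total."""
    return sum(TISSUE_WEIGHTS[organ] * dose
               for organ, dose in organ_dose_msv.items())

# A made-up exposure: per-organ doses in millisieverts.
exposure = {"lung": 4.0, "stomach": 2.0, "thyroid": 1.0,
            "skin": 0.5, "remainder": 1.0}
total = effective_dose(exposure)  # weighted sum over organs
```

The point of the weighting is that the same physical dose matters more to some tissues than others, which is why a whole-body avatar is more informative than a single dosimeter reading.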

In the lab at the RPI Nuclear Engineering Facility on Tibbits Avenue, Vazquez and another student, Yiming Gao, demonstrated a motion capture suit that is used to translate the movement of a human onto an avatar on a computer monitor.

Gao wore a black bodysuit, dotted with 34 spherical sensors at his head, torso, arms and legs that were tracked by a dozen three-dimensional cameras. As Gao moved, kicked and jumped, the avatar followed.
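One small step in that marker-to-avatar pipeline can be sketched in code: once the cameras have triangulated each marker's 3D position, a joint on the avatar can be estimated from the markers attached near it, here simply as their centroid. Marker names and coordinates are invented for illustration; a real system like the lab's 12-camera rig also performs camera calibration, triangulation, and full skeleton solving.

```python
# Minimal sketch of driving an avatar from tracked markers: each joint's
# position is estimated as the centroid of its nearby suit markers.
# All marker names and positions below are made up for illustration.

def centroid(markers):
    """Average a list of (x, y, z) marker positions."""
    return tuple(sum(axis) / len(markers) for axis in zip(*markers))

# One captured frame: hypothetical marker positions in meters.
frame = {
    "head": [(0.0, 1.70, 0.0), (0.1, 1.75, 0.0)],
    "left_hand": [(-0.40, 1.00, 0.1), (-0.45, 1.05, 0.1)],
}

# One estimated position per avatar joint for this frame.
avatar_pose = {joint: centroid(marks) for joint, marks in frame.items()}
```

Running this per frame is what makes the on-screen avatar kick and jump in step with the wearer.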

Knowing precisely how a worker moves in a radioactive environment is critical because workers can “self-shield” against overexposure by the way they move around, said Vazquez. Something as simple as stooping over a puddle of radioactive liquid can increase exposure.

“By using an avatar first, a worker can move through a potentially hazardous environment, and do a mock-up of the repair, before (he has) to go in and actually do it,” he said. “The position, distance from the source, and posture can be critically important.”
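Why distance and posture matter so much can be illustrated with the inverse-square relationship for a small ("point") radiation source: dose rate falls off with the square of the distance, so stooping closer to a source raises exposure rapidly. This is a simplified physics sketch with made-up numbers, not the group's actual dose-analysis code.

```python
# Inverse-square sketch: for a point source, dose rate scales as 1/r^2.
# Numbers are illustrative only.

def dose_rate(rate_at_1m, distance_m):
    """Estimate dose rate (same units as rate_at_1m) at a given distance.

    rate_at_1m -- measured dose rate at 1 meter from the source
    distance_m -- worker's distance from the source in meters
    """
    return rate_at_1m / distance_m ** 2

# A puddle reading 8 mSv/h at 1 m:
standing = dose_rate(8.0, 2.0)   # standing back at 2 m -> 2.0 mSv/h
stooping = dose_rate(8.0, 0.5)   # stooping to 0.5 m   -> 32.0 mSv/h
```

Halving the distance quadruples the dose rate, which is why rehearsing a repair's exact positions and postures in a virtual plant first could meaningfully cut a worker's exposure.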

Xu said a virtual world would also give emergency workers, like firefighters, a chance to explore a damaged part of the plant and plan strategy before having to enter the danger zone.

And any would-be Homer Simpson, the cartoon everyman who is possibly the world’s most widely known nuclear technician, would have something to consider should the virtual plant come to pass.

A virtual system could help plant owners train workers in proper routine techniques and create a permanent record of how each worker performed (or, in the case of Homer, known for his hilarious screw-ups, misperformed) in the virtual world.

But Homer is safe for now. Xu said he and Vazquez likely have several years of research to complete.

One Comment

Using simulators for scenarios in which real-world training would be costly (e.g., the fuel costs and aircraft wear involved in training pilots) is fairly obvious to me. However, I hadn’t really considered before how much simulations might help in training for high-cost, low-frequency scenarios, such as a nuclear meltdown, for which realistic training opportunities are rare. Those kinds of scenarios occur in all sorts of domains and present very serious problems. It seems to me that this sort of training is likely to be most beneficial for situations in which performance is heavily influenced by perceptual processing and motor skills, such as those mentioned in the article. Of course, that probably means that perceived presence is an important design consideration. It’s hard to think of other examples, though, so it’s really interesting to see an application like this.