One of the chief complaints as the crisis at the Fukushima Dai-ichi nuclear plant unfolds is the paltry amount of information that the Tokyo Electric Power Co. and the Japanese government have shared with the citizens of Japan and of the world.

In an attempt to understand the severity of the crisis, the nuclear community is piecing together a forensic analysis from thousands of miles away. It does not need to be this way.

Using cloud-computing technology to store data on faraway networks that are accessible to all stakeholders will help us all make decisions on the cleanup. Fukushima is not just Japan's problem: We're all going to be dealing with the fallout from this situation for years.

Yet the sheer volume of data required makes collection and management a challenge unless a proper information-management system is put in place. Samples of air, soil, groundwater and seawater, as well as of crops and fish, will be collected from potentially affected areas. That data will need to be evaluated for both short- and long-term impacts on humans and the environment.

BP never did this during the Gulf oil spill, and Soviet and Russian authorities never did it for Chernobyl, so the public still does not know the exact extent of those disasters' effects on human health and the environment.

Placing all data in a centralized management system in the cloud would allow us to know where samples were taken, who collected them, how the samples were analyzed, what the levels of radionuclides were and what the long-term effects of each isotope are likely to be. The public lacks the sophistication to deal with most of this data, but scientists and engineers could use their combined knowledge to advise on the next steps at Fukushima and improve safety at the rest of the planet's nuclear power plants.
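To make the idea concrete, here is a minimal sketch of what one record in such a centralized system might look like. All field names, values and the isotope data are hypothetical, chosen only to illustrate the kinds of information described above (location, collector, analysis method, measured radionuclide levels):

```python
from dataclasses import dataclass, field
from typing import Dict


@dataclass
class SampleRecord:
    """Hypothetical record for one environmental sample; field names
    are illustrative, not drawn from any real monitoring system."""
    sample_id: str
    medium: str                 # e.g. "soil", "seawater", "air"
    latitude: float
    longitude: float
    collected_by: str           # who collected the sample
    collected_on: str           # ISO 8601 date
    analysis_method: str        # e.g. "gamma spectroscopy"
    # Isotope name -> measured activity (becquerels per kilogram)
    activity_bq_per_kg: Dict[str, float] = field(default_factory=dict)

    def dominant_isotope(self) -> str:
        """Return the isotope with the highest measured activity."""
        return max(self.activity_bq_per_kg, key=self.activity_bq_per_kg.get)


# Example: a single soil sample with two (invented) activity readings.
sample = SampleRecord(
    sample_id="S-0001",
    medium="soil",
    latitude=37.42,
    longitude=141.03,
    collected_by="field-team-3",
    collected_on="2011-04-01",
    analysis_method="gamma spectroscopy",
    activity_bq_per_kg={"Cs-137": 15000.0, "I-131": 42000.0},
)
print(sample.dominant_isotope())  # prints "I-131"
```

With records of this shape stored centrally, scientists anywhere could query, aggregate and cross-check measurements instead of reconstructing them after the fact.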