Futurity, February 19th, 2019, by Nerissa Hannink, University of Melbourne

Many people believe that cockroaches could survive a nuclear bomb and the subsequent radiation exposure, but is that actually true? The creepy crawlies do have a reputation for resilience, which media reports have suggested may stem from rumors that insects thrived in the aftermath of the atomic bombings of Hiroshima and Nagasaki. But Tilman Ruff, a Nobel Laureate and professor in the School of Population and Global Health at the University of Melbourne who studies the health and environmental consequences of nuclear explosions, says he has yet to see any documented evidence that there were cockroaches scuttling through the rubble.

“I’ve certainly seen photographs of injured people in Hiroshima that have lots of flies around, and you do imagine some insects would have survived,” Ruff says. “But they still would have been affected, even if they appear more resistant than humans.”

ROACHES’ BAD RAP

The TV series Mythbusters tested the cockroach survival theory in 2012 when they exposed cockroaches to radioactive material. The roaches survived longer than humans would have, but they all died at extreme levels of radiation.

Mark Elgar, a professor at the School of Biosciences, says the Mythbusters tests were incomplete because they only looked at how many days the cockroaches lived after exposure. They didn’t look at the cockroaches’ ability to produce viable eggs, which would ensure the continued survival of the species.

“There is some evidence that they seem quite resilient to gamma rays, although they are not necessarily the most resistant across insects.”

“You could argue,” Elgar adds, “that some ants, particularly those that dig nests deep into the ground, would be more likely to survive an apocalypse than cockroaches.”

Previous tests of insects subjected to radiation found that cockroaches, though six to 15 times more resistant than humans, would still fare worse than the humble fruit fly.

Elgar says the feral American and German species of cockroach—the ones you might recognize from your kitchen nooks and crannies—have given the rest of the species a bad rap.

“I think our view of cockroaches is informed by our frequent interaction with the American and German cockroaches, which have spread throughout the world,” Elgar says. “Their habit of basically acting as an unpaid house cleaner horrifies people.”

There are more than 4,000 species of cockroaches, however, including native Australian cockroaches marked by iridescent colors and patterns…

“For a while they’ll be able to eat dead bodies and other decaying material but, if everything else has died, eventually there won’t be any food. And they’re not going to make much of a living,” Elgar says.

“The reality is that very little, if anything, will survive a major nuclear catastrophe, so in the longer term, it doesn’t matter really whether you’re a cockroach or not.”…

“The evidence from a disaster like Chernobyl is that every organism, from insects to soil bacteria and fungi to birds to mammals, would experience effects in proportion to the degree of contamination,” Ruff says. …

Ruff says that focusing on a single species misses the complexity of the biological environment and how organisms relate to one another, as well as the interactions between multiple stresses at the same time.

“There’s all sorts of factors we have to look at. There are environmental factors. There are chronic exposures, effects across generations, and food shortages, for example,” he says. “The magnitude of effects of a nuclear explosion is far greater than what you might see in carefully controlled experiments and laboratory conditions.”

Some history: Moffett says that “first use” ended World War II. That was hardly the principal cause of Japan’s surrender.

Most historians now attribute the end to the Soviet entry into the war on Aug. 8. That immoral and illegal first use was also unnecessary. I’ve made the case in this paper many times, but I’ll merely quote Gen. Dwight Eisenhower, the supreme allied commander: “Japan was ready to surrender, and there was no need to use that awful thing.” Virtually all the top military leaders agreed.

But apart from its illegal and immoral despicability “common to Dark Age barbarians” (as Adm. William Leahy put it), that first use alienated our Soviet ally and started a long and dangerous Cold War.

What Moffett doesn’t say is that the first-use option, while not necessitating first use, does require preparation for it and a willingness to carry it out. In a time of crisis, Nation X, knowing that Enemy Y has the first-use option and fearing imminent first use from Y, may pre-empt and strike first: better to use ’em than lose ’em. This is equally dangerous with nukes kept on “hair trigger” alert, as the first-use nuclear nations do (but not the no-first-use nations: India, China and North Korea). It’s a recipe for an accidental nuclear launch.

We’ve long held the first-use option, even during the 1980s when the Soviets (and China) espoused a no-first-use policy. It was a main driver of the dangerous and often nearly catastrophic superpower arms race. There were hundreds of nuclear accidents and near misses, some after the Cold War ended, as we now know from Eric Schlosser’s shocking 2014 book, Command and Control. By pure luck we survived decades of military inattention to nuclear safety and our (still ongoing) deference to the “we’re falling behind” cries of the dollar-seeking military-industrial complex. (We are the world’s No. 1 arms merchant, with many undemocratic customers.) For some frighteningly close calls see my review of Schlosser’s book: bit.ly/2SCQUO5.

The threat of first use has also been employed by every president since Harry Truman to force concessions, as Daniel Ellsberg (nuclear adviser to the Pentagon and several presidents in the 1960s and ’70s) has pointed out, with many examples in his recent Doomsday Machine.

Moffett also says Ronald Reagan showed “wisdom” by retaining the first-use option. Eventually Reagan wised up, but not until Mikhail Gorbachev (Nobel Peace Prize, 1990) came along in the mid-1980s. Earlier Reagan had little understanding of nukes. In fact he and his vice president, George H.W. Bush, were both insisting that a nuclear war was survivable and winnable.

In 1986, at their second summit, in Reykjavik, Reagan and Gorbachev nearly agreed to the abolition of all nukes. But Reagan’s “Star Wars” (a proposed anti-ballistic missile system then outlawed by treaty and thought to be “pie in the sky”) killed the deal. Fortunately, in 1987 we got the INF Treaty, destroying some 2,700 medium-range missiles – a treaty the United States is now threatening to leave.

Moffett said our local leftists should “leave defense policy to national security and military experts.”

Surely Moffett knows that many such experts are today advocating exactly what the “local leftists” are – urging our state Legislature to urge Congress and the president to adopt no first use and halt funds for new low-yield nukes. They include: Gen. Lee Butler (Air Force), commander of Strategic Air Command (1991-1992) and first commander of the Strategic Command (1992-1994); Gen. James Cartwright (USMC), commander of the Strategic Command (2004-07) and vice chair of the Joint Chiefs of Staff (2007-2011); Secretary of State George Shultz (under Reagan); and Secretary of Defense William Perry (under Bill Clinton).

There are moral problems with nukes and even with nuclear deterrence of any form. Even deterrence (with no first use) requires the preparation for possible use and a willingness to use nukes “if necessary.” As such, all nuclear deterrence runs the risk of nuclear war and the killing of millions of innocent human beings or worse, given the possibility of nuclear winter. As science knows, but apparently not the Pentagon, even a small nuclear exchange – for example, India versus Pakistan, each firing 50 low-yield weapons – could bring on a 10-year nuclear winter and global famine killing over a billion people (2014 study by Physicians for Social Responsibility). Such a risk is morally unacceptable – a concern central to creating the Nuclear Non-Proliferation Treaty in 1968 – now with 189 parties and as important as ever.

The Non-Proliferation Treaty (Art. 6) requires a swift end to the nuclear arms race and the conclusion of a treaty for “general and complete disarmament under strict and effective international control.” In 1996 the World Court rendered an opinion on the legality of nuclear weapons, saying: “The threat or use of nuclear weapons would generally be contrary to the rules of international law applicable in armed conflict.”

Meeting our treaty obligations will be a very long and difficult journey. But we must recover the progress that slowed soon after the end of the Cold War and recently threatens to stop – or worse.

In the meantime, the United States can show the non-proliferation treaty’s many non-nuclear parties that it is still serious about its treaty obligations. We N.H. folks – as folks in many other states are doing – can and should take the small but positive step of urging our state government to call on Congress and the president to adopt a no-first-use pledge, and to decline funding for any new costly and “more usable” low-yield nukes.

(Ray Perkins Jr. of Concord is professor of philosophy, emeritus, at Plymouth State University and vice chairman of the Bertrand Russell Society board of directors.)

The U.S. Department of Energy recently released new estimates for the cost of cleaning up the Hanford nuclear site in central Washington state. That number could now reach a staggering $677 billion, with active cleanup ending in the year 2079. Under this scenario the federal government would spend, on average, more than $11 billion every year for 60 years.

As leaders in the Tri-Cities — the community closest to and most impacted by the Hanford site — we believe that the United States simply must find a way to effectively address this problem at a price that taxpayers can afford. One clear step in the right direction is to begin managing the waste based on its actual contents and risks rather than an arbitrary definition developed decades ago.

To summarize, DOE is responsible for the cleanup of waste left over from decades of nuclear-weapons production, including approximately 53 million gallons in underground tanks at Hanford. Federal laws passed in 1954 and 1982 guide the agency’s management of this waste but do not clearly specify how the waste should be categorized. Rather than making a determination, the agency simply decided in the early 1980s to manage much of our nation’s defense nuclear waste as high-level, requiring the highest standards, regardless of the actual amount of radioactivity it contains or risk it poses.

DOE is now considering moving away from this well-intentioned but overly costly and inaccurate approach. Instead of arbitrarily making decisions based solely on the origin of the waste, agency officials are proposing to manage this waste based on its actual physical characteristics. This is the same method that countries like France and Germany use to guide their waste-management decisions, and it would bring the U.S. closer to international standards established by the International Atomic Energy Agency.

Why does this matter? A risk-based approach would allow DOE to manage, treat and dispose of defense waste in a manner that accurately reflects its contents and the potential risks it poses to human health and the environment. Doing so could reduce cleanup costs by tens of billions of dollars, and has the potential to significantly speed up remediation efforts at Hanford and elsewhere.

DOE has been accused of proposing this change in order to save money and shirk its responsibilities, but this new approach would not mean that the federal government can simply walk away from its cleanup obligations. The federal government has committed to many billions of dollars’ worth of remediation work at Hanford and elsewhere, and budget shortfalls mean that important cleanup projects often don’t get started soon enough, or take too long to complete.

Treating waste based on its actual contents would allow DOE to direct the resources it saves toward other important cleanup efforts that would otherwise languish, potentially for years to come. It could also open up pathways to get some waste out of Washington state more quickly. These waste streams would otherwise remain at Hanford for many more years, or even permanently.

In their letter to DOE opposing this proposed change, Gov. Jay Inslee and Attorney General Bob Ferguson stated, “our communities deserve to be heard on this dangerous idea.” We find it frustrating that in this case the governor and AG aren’t listening to the community that is most directly impacted by Hanford cleanup.

We do not feel that it is a dangerous idea and, to the contrary, believe that it will allow other important cleanup work at the Hanford site to happen faster.

Ultimately, there is high-level defense nuclear waste at Hanford and elsewhere that does need to be treated and disposed of in a deep geological repository. It is some of the most challenging and expensive material that our country has to address. We should not, however, delay cleanup progress and waste taxpayer funds by unnecessarily managing lower-level waste, which scientists agree can be safely disposed at permitted sites, in the same manner. After all, how can we expect to effectively address this problem if we aren’t even willing to accurately define it?

The Tri-City community wants the Hanford site remediated as quickly and effectively as possible, but we see no need to make an already difficult job even harder. Our hope is for DOE to meaningfully engage with the appropriate regulatory bodies, including the Washington State Department of Ecology, to determine, in a technically justified manner, that more waste can be managed as low-level.

Importantly, this will require the state government and our elected officials to keep an open mind and make a genuine effort to reach a reasonable consensus. If they are successful, it will open the door for faster, less costly remediation outside of Washington state while still allowing the work to be accomplished safely and responsibly.

We can then turn our attention and resources to other high-priority cleanup efforts at Hanford, and we will all be better off for it.

Robert Thompson is mayor of the City of Richland, the city closest to the Hanford site.

Carl Adrian is president of the Tri-City Development Council, which has advocated for the Tri-Cities on Hanford-related matters since 1963.

Late last year, the Energy Department (DOE) began work on a new flagship nuclear project, the Versatile Test Reactor (VTR), a sodium-cooled fast reactor. If completed, the project will dominate nuclear power research at DOE. The department’s objective is to lay the groundwork for building large numbers of fast power reactors. This was a dream of the old Atomic Energy Commission, DOE’s predecessor agency. The dream is back. But before this goes any further, Congress needs to ask: what is the question to which the VTR is the answer? It won’t be cheap, and it has serious drawbacks in cost and safety, but mainly in its effect on nonproliferation.

Congress has to ask hard questions: Is there an economic advantage to such reactors? Or one in safety? Or is it just what nuclear engineers, national laboratories, and subsidy-hungry firms would like to do?

The answer of DOE’s Idaho National Laboratory, which would operate the reactor, is cast in terms of engineering and patriotic goals, not economic ones: “US technological leadership in the area of fast reactor systems . . . is critical for our national security. These systems are likely to be deployed around the globe and U.S. leadership in associated safety and security policies is in our best national interest.” In other words, we need to build fast reactors because DOE thinks other people will be building them, and we need to stay ahead.

In the 1960s, when the Atomic Energy Commission concentrated on fast reactors (“fast” because they don’t use a moderator to slow down neutrons in the reactor core), it argued with a certain plausibility that uranium ore was too scarce to provide fuel for large numbers of conventional light-water reactors, which “burned” only a couple percent of their uranium fuel. Fast reactors offered the possibility, at least in principle, of using essentially all of the mined uranium as fuel, and thus vastly expanding the fuel supply. To do this you operate them as breeder reactors, making more fuel than they consume to produce energy (that is, using the excess neutrons available in fast reactors to convert fertile uranium into plutonium). The possibility of doing so is the principal advantage of fast reactors; a rough sense of the arithmetic behind that promise is sketched below.
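To make the scale of that promise concrete, here is a back-of-envelope sketch (ours, not the authors’); the ~2 percent once-through burn fraction, the 200 MeV released per fission, and the ideal 100 percent breeder burn-up are assumed round numbers, not figures from the article.

```python
# Back-of-envelope comparison of thermal energy per tonne of mined uranium,
# once-through light-water reactor vs. an idealized breeder.
# Assumed round numbers (not from the article): ~200 MeV per fission,
# ~2% of uranium fissioned once-through, ~100% in an ideal breeder.

AVOGADRO = 6.022e23
MEV_TO_JOULES = 1.602e-13
ENERGY_PER_FISSION_MEV = 200.0

atoms_per_tonne = 1e6 / 238.0 * AVOGADRO    # uranium atoms in one tonne
full_burn_joules = atoms_per_tonne * ENERGY_PER_FISSION_MEV * MEV_TO_JOULES

for scenario, burn_fraction in [("once-through LWR (~2% burned)", 0.02),
                                ("ideal breeder (~100% burned)", 1.00)]:
    terawatt_hours = full_burn_joules * burn_fraction / 3.6e15
    print(f"{scenario}: ~{terawatt_hours:,.1f} TWh(thermal) per tonne")
```

On these assumptions the breeder extracts roughly fifty times more energy from the same tonne of mined uranium. That ratio is the entire appeal of breeding, and, as the next paragraph explains, it only matters if uranium is actually scarce.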

But we then learned there are vast deposits of uranium worldwide, and at the same time many fewer nuclear reactors were installed than were originally projected, so there is no foreseeable fuel shortage. Not only that, the reprocessing of fuel, which is intrinsic to fast reactor operation, has turned out to be vastly more expensive than projected. Finally, by all accounts fast reactors would be more expensive to build than conventional ones, the cost of which is already out of sight. In short, there is no economic argument for building fast reactors.

When it comes to safety, sodium-cooled fast reactors operate under low pressure, which is an advantage. But fast reactors are worrisome because, whereas a change in the configuration of a conventional nuclear core—say, squeezing it tighter—makes it less reactive, the corresponding result in a fast reactor is to make it more reactive, potentially leading to an uncontrolled chain reaction.

With regard to nonproliferation, the issue that mainly concerns us is that the fast reactor fuel cycle depends on reprocessing and recycling of its plutonium fuel (or uranium 233 if using thorium instead of uranium). Both plutonium and uranium 233 are nuclear explosives. Widespread use of fast reactors for electricity generation implies large quantities of nuclear explosives moving through commercial channels. It will not be possible to restrict such use to a small number of countries. The consequent proliferation dangers are obvious. And while it is doubtful the U.S. fast reactor project will lead to commercial exploitation—few, if any, projects from DOE ever do—U.S. pursuit of this technology would encourage other countries interested in this technology, like Japan and South Korea, to do so.

One should add that one of the claims of enthusiasts for recycling spent fuel in fast reactors is that it permits simpler waste management. This is a complicated issue, but the short answer is that rather than simplifying, reprocessing and recycling complicate the waste disposal process.

With all these concerns, and the lack of a valid economic benefit, why does the Energy Department want to start an “aggressive” and expensive program of fast reactor development? It’s true that so far only exploratory contracts have been let, on the order of millions of dollars (to GE-Hitachi). But the Department is already leaning awfully far forward in pursuing the VTR. It estimates the total cost to be about $2 billion, but that’s in DOE-speak. We’ve learned that translates into several times that amount.

But beyond that, the nuclear engineering community, and the wider community of nuclear enthusiasts, have never given up the 1960s AEC dream of a fast breeder-driven, plutonium-fueled world. Such reactors were to have been deployed by 1980 and were to take over electricity generation by 2000. The program never got off the ground, in part because of AEC managerial incompetence, but mainly because it didn’t make sense.

After the 1974 Indian nuclear explosion and the realization that any country with a small reactor and a way to separate a few kilograms of plutonium could make a bomb, proliferation became a serious issue. In 1976 President Gerald Ford announced that we should not rely on plutonium until the world could reliably control its dangers as a bomb material. The plutonium devotees never accepted this change. Jimmy Carter froze construction of an ongoing fast-breeder prototype, the Clinch River Reactor, about three times the size of the proposed VTR. Ronald Reagan tried to revive it but, as its rationale thinned and its cost mounted, Congress shut it down in 1983. The plutonium enthusiasts thought they got their chance under George W. Bush with a fast reactor and a reprocessing and recycling program under the rubric of the Global Nuclear Energy Partnership. But it was so poorly thought out it didn’t go anywhere. More or less the same laboratory participants are now pushing the VTR.

The DOE advanced reactor program has many irons in the fire, mostly in the small reactor category. But do not be misled. They are mostly small potatoes without much future. Only the fast reactor project is the real thing, bureaucratically, that is. Although at this point DOE has only contracted for conceptual design, the follow-up will cost many millions and take many years. Nothing attracts national laboratories, industrial firms, and Washington bureaucracies as much as the possibility of locking into a large multiyear source of funding.

Congress needs to look hard at the rationale for a fast reactor program. This means getting into the details. At a Senate Appropriations hearing last month on advanced reactors, Sen. Dianne Feinstein said rather plaintively, “We cast the votes, and cross our fingers hoping nothing bad will happen.” That’s not good enough.

Victor Gilinsky is program advisor for the Nonproliferation Policy Education Center (NPEC) in Arlington, Virginia. He served on the Nuclear Regulatory Commission under Presidents Ford, Carter, and Reagan. Henry Sokolski is executive director of NPEC and the author of Underestimated: Our Not So Peaceful Nuclear Future (second edition 2019). He served as deputy for nonproliferation policy in the office of the U.S. secretary of defense in the Cheney Pentagon.

The rise and demise of the Clinch River Breeder Reactor, Bulletin of the Atomic Scientists, by Henry Sokolski, February 6, 2019

This year marks the 36th anniversary of the termination of the Clinch River Breeder Reactor Project, a federally funded commercial demonstration effort. In the very early 1980s, it was the largest public-works project in the United States. Japan, South Korea, China, France, Russia, and the United States are now all again considering building similar plants. For each, how and why Clinch River was launched and killed is a history that speaks to their nuclear future. This history involves more than cost-benefit analysis. For the public and political leadership, facts and arguments rarely close an initial sale of a large government-funded, high-tech commercialization program. Nor do they generally goad officials to abandon such projects. Such acts are fundamentally political: fears and hopes drive them. Certainly, to understand why the US government launched and subsequently killed Clinch River requires knowledge not just of what the public and its political leadership thought, but also of how they felt.

Unwarranted fears of uranium’s scarcity fueled interest in fast-breeder reactors. … In 1945, uranium 235, a fissile uranium isotope that can readily sustain a chain reaction, was believed to be so scarce that there would not be enough of it to produce nuclear electricity on a large scale. Scientists saw the answer in fast-breeder reactors…

The Atomic Energy Commission publicly promoted their commercialization with confident, cartoonish optimism. In one publication, the commission asked the upbeat question: “Johnny had three truckloads of plutonium. He used three of them to power New York for a year. How much plutonium did Johnny have left?” The answer: “Four truckloads.”

Unfortunately, this pitch glossed over two stubborn facts. First, because plutonium is so much more toxic and difficult to handle than uranium, it is many times more expensive to use as a reactor fuel than fresh uranium. Second, because plutonium fast-breeder reactors use liquid-metal coolants, such as liquid sodium, operating them safely is far more challenging and expensive than operating conventional reactors.

When private industry tried in the early 1960s to operate its own commercial-sized fast breeder, Fermi 1, the benefits proved negative. Barely three years after Fermi 1 came online, a partial fuel meltdown in 1966 brought it down. It eventually resumed operations before being officially shut down in 1972.

These facts, however, are rarely emphasized. Those backing breeders, whether in 1945, 1975, or today, focus not on reliability and economics but on the claim that we are about to run out of affordable uranium. For the moment, of course, we are not. Uranium is plentiful and cheap, as is enriching it. This helps explain why the United Kingdom, France, Germany, Japan, and the United States no longer operate any commercial-sized fast-breeder reactors and are in no immediate rush to build new ones…

When the Atomic Energy Commission argued the case for building a breeder reactor in the late 1960s and early 1970s, it projected that 1,000 reactors would be online in the United States by the year 2000 (the real number turned out to be 103) and that the United States would soon run out of affordable uranium. Also, by the mid-1960s, the commission needed a new, massive project to justify its continued existence. Its key mission, to enrich uranium for bombs and reactors, had been completed and was overbuilt. The commission was running out of construction and research projects commensurate with its large budget. A breeder-reactor-commercialization program, with all the reprocessing, fuel testing, and fuel fabrication plants that would go with it, seemed a worthy successor.

But the most powerful political supporter of Clinch River, then-President Richard Nixon, focused on a different point. Nixon saw the project less as a commercial proposition than as a way to demonstrate his power to secure more votes by providing government-funded jobs, while at the same time affirming his commitment to big science, engineering, and progress…

The Energy Department videotaped safety tests it had conducted of how molten sodium might react once it came in contact with the reactor’s concrete containment structure. Concrete contains chemically bound water. Molten sodium reacts explosively when it comes in contact with oxygen, including the oxygen in water. What the test demonstrated, and the video showed, was concrete exploding when it came in contact with liquid sodium.

This set off waves of worry at the department…

Just weeks before the final vote, the Congressional Budget Office released its financial assessment of the Energy Department’s last-ditch effort to use loan guarantees to fund the project. Even under the most conservative assumptions, the budget analysts determined that the loan guarantees would only increase the project’s final costs. This helped push the project over a political cliff. The final Senate vote: 56 against, 40 for. All 16 of the deciding votes came from former Clinch River supporters.

No commercial prospects? Militarize. Nixon backed numerous science commercialization projects like Clinch River, including the Space Shuttle Program and the supersonic transport plane… While the Space Shuttle Program won congressional support, the envisioned satellite contracts never materialized. The program became heavily dependent on military contracts. Finally, it was argued, our national security depended upon it.

Although Clinch River never was completed, as its costs spiraled it too attracted military attention. …

Essentially, it didn’t matter whether you asked in 1971 or in 1983: Clinch River was always another seven years and at least another $2.1 billion away from completion. …

With Clinch River, what we now know we may yet repeat. Fast-reactor commercialization projects and support efforts, such as Argonne National Laboratory’s Small Modular Fast Reactor, the US-South Korean pyroprocessing effort, the Energy Department’s Versatile Test Reactor, France’s Astrid fast reactor project, the PRISM reactor, the TerraPower Traveling Wave reactor, India’s thorium breeder, Russia’s BN-1200, and China’s Demonstration Fast-Breeder Reactor, continue to capture the attention and support of energy officials in Japan, China, Russia, South Korea, France, the US, and India. None of these countries has yet completely locked in its decisions. How sound their final choices turn out to be will ultimately speak to these governments’ credibility and legitimacy.

In the case of Clinch River, the decision to launch the program ultimately rested on a cynical set of political calculations alloyed to an ideological faith in fast reactors and the future of the “plutonium economy.” Supporters saw this future clearly. As a nuclear engineer explained to me in 1981 at Los Alamos National Laboratory, the United States technically could build enough breeder reactors to keep the country electrically powered for hundreds of years without using any more oil, coal, or uranium. When I asked him, though, who would pay for this, he simply snapped that only fools let economics get in the way of the future.

This argument suggests that the case for fast reactors is beyond calculation or debate, something mandatory and urgent. That, however, never was the case, nor is it now. Instead, the equitable distribution of goods, which is a key metric of both economic and governmental performance (and ultimately of any government’s legitimacy and viability), has always taken and always must take costs into account. In this regard, we can only hope that remembering how and why Clinch River was launched and killed will help get this accounting right for similar high-tech commercialization projects now and in the future. https://thebulletin.org/2019/02/the-rise-and-demise-of-the-clinch-river-breeder-reactor/

From the mining of the uranium-rich ore to nuclear abandonment, a dozen by-products more radiotoxic than the ore mined to fuel the reactor are discarded. These products are the raffinates, accounting for 85% of the total radioactivity, which goes directly into the tailings only to migrate throughout the environment.

Products like radon gas, polonium-210 with a 138-day half-life, radium-226 with a 1,600-year half-life, and thorium-230 with a 76,000-year half-life are released, and yet only 1 kg of uranium oxide is recovered from every 4,000 kilos mined.

Uranium-238 subjected to neutron bombardment in the reactor becomes uranium-239 with a 23-minute half-life; that becomes neptunium-239 with a 2.3-day half-life, which in turn becomes plutonium-239 with a 24,000-year half-life. The plutonium-239 in this spent fuel finally decays to uranium-235, with a half-life of 700 million years.

Moreover, multiply that by no less than 10 to get the life of the radioactive hazard, which equates to no less than 7 billion years (the rule of thumb is sketched below). And here we have only just crossed the nuclear industry’s threshold within the last 76 years, with many thousands of nuclear events and accidents recorded. Nor is this the only waste these machines produce: one Canadian CANDU reactor recorded a release of 100 trillion becquerels of tritium in just one year.
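As a unit check on that arithmetic, here is a minimal sketch (our illustration, not the commenter’s) of the rule of thumb that a radionuclide’s hazard life is roughly ten half-lives, the point at which less than 0.1 percent of it remains:

```python
# Sketch of the "hazard life = ~10 half-lives" rule of thumb.
# After n half-lives, the fraction remaining is (1/2)**n.

U235_HALF_LIFE_YEARS = 7.0e8  # uranium-235 half-life, as cited in the text

def fraction_remaining(n_half_lives: float) -> float:
    """Fraction of a radionuclide left after n half-lives."""
    return 0.5 ** n_half_lives

for n in (1, 2, 5, 10):
    years = n * U235_HALF_LIFE_YEARS
    print(f"after {n:>2} half-lives ({years:.1e} years): "
          f"{fraction_remaining(n):.4%} remains")

# 10 half-lives of uranium-235 is 7e9 years -- the "7 billion years" above.
```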

The nuclear-embracing coterie tell us they can safely manage these radioactive wastes, yet their containment vessels are only guaranteed for 25 years, not 7 billion years, and a director of Holtec has stated there is no way to remedy a breach of containment. Moreover, these nuclear wastes are a gamble, a risk that only grows with every generation.

An inadequate focus on researching and understanding the role of the environment in cancer prevention is a failure for public health. By Laura N. Vandenberg, 8 Feb 19

In his 2019 State of the Union address on Tuesday night, President Trump called for $500 million over the next 10 years to fund research on childhood cancers.

Such funding is crucial to continue tackling the devastating disease. However, missing from the State of the Union—and most other conversations about tackling cancer—is a focus on prevention, specifically the need to research, understand and communicate the role environmental exposures play in cancer risk.

The numbers on cancer incidence and deaths are complex. Although childhood cancer mortality rates have dropped considerably since the 1960s, data from the American Cancer Society shows that incidence rates have increased 0.6 percent per year since 1975.

In this way, childhood cancers are like several others. Between 2005 and 2014, yearly cancer incidence rates rose for several types: thyroid cancer by 4 percent; invasive breast cancer by 0.3 percent in black women; leukemia by 1.6 percent; liver cancer by 3 percent; oral and pharynx cancers by 1 percent in Caucasians; pancreatic cancer by 1 percent in Caucasians; colon cancer by 1.4 percent in individuals younger than 55 years of age; rectal cancer by 2.4 percent in individuals younger than 55; and melanoma by 3 percent in individuals aged 50 and older.

While these cancer rates have increased, overall rates of cancer deaths have started to fall. In fact, since the 1990s, improved detection and treatment, as well as decreased smoking rates, have contributed to significant reductions in cancer mortality.

Reduced deaths from cancer are a great public health victory. These statistics prove that public health interventions like educational programs designed to curb smoking can have dramatic effects.

They also suggest that investments in improved detection and diagnosis are money well spent. A focus on treatments has also improved quality of life for cancer patients and their likelihood of remission.

But where is the call for better cancer prevention? As rates of numerous cancers continue to rise, the failure to identify the causes of cancer remains a disappointment for public health officials and researchers alike.

We know that environmental factors can contribute to cancer risk. Some, like smoking, are avoidable. Others are lifestyle factors that people can change like drinking less alcohol, decreasing consumption of processed meats, using protection from the sun, and increasing exercise.

Yet, other environmental factors like exposures to chemicals in the environment, including endocrine disruptors, have received little attention. While some NIH-funded programs like the Breast Cancer and Environment Research Program have worked to identify chemicals in the environment that promote cancer, funding for cancer prevention initiatives has stagnated.

Despite the limited resources invested in studies of environmental risk factors for cancer, we know enough to take action on some chemicals of concern.

For example, communities contaminated with perfluorinated chemicals, several of which are known to cause cancer, have demanded attention from government officials in addition to asking for more research.

Individuals living in these communities have the right to know how they are being exposed, and what their risks might be – for cancer and other diseases.

It is great that cancer research was raised in the President’s State of the Union speech, and that the difficulties associated with caring for a family member with cancer were mentioned in Stacey Abrams’ rebuttal.

But a failure to focus on prevention, a failure to acknowledge the role of the environment in causing cancer, and a failure to allocate funds to prevention research, are all failures for public health.

Dr. Vandenberg is an Associate Professor of Environmental Health Sciences at the University of Massachusetts Amherst School of Public Health and Health Sciences. Her work on endocrine-disrupting chemicals has been funded by the National Institutes of Health, including the BCERP program, which focuses on the environmental causes of breast cancer.

By chief economics correspondent Emma Alberici

Maria Teresa Farci’s legs start to shake as she reads aloud from the diary she kept that describes, in heartbreaking detail, the last moments of her 25-year-old daughter’s tortured life.

Key points:

Eight former commanders of a bombing range are before Italian courts

Locals living near Quirra firing range describe multiple cases of deformities and cancer as “Quirra syndrome”

Italy’s army has dismissed a report linking exposure to depleted uranium to disease suffered by the military

“She died in my arms. My whole world collapsed. I knew she was sick, but I wasn’t ready.”

Her daughter, Maria Grazia, was born on the Italian island of Sardinia with part of her brain exposed and a spine so disfigured her mother has never allowed her photo to be published.

This was only one of many mysterious cases of deformity, cancer and environmental destruction that have come to be called the “Quirra syndrome”.

Eight Italian military officers — all former commanders of the bombing range at Quirra in Sardinia — have been hauled before the courts.

It’s unprecedented to see Italian military brass held to account for what many Sardinians say is a scandalous coverup of a major public health disaster with international consequences.

Bombs and birth defects — is there a link?

In the year baby Maria Grazia was born, one in four of the children born in the same town, on the edge of the Quirra firing range, also suffered disabilities.

Some mothers chose to abort rather than give birth to a deformed child.

In her first television interview, Maria Teresa told Foreign Correspondent of hearing bombs exploding at the Quirra firing range when she was pregnant.

Enormous clouds of red dust enveloped her village.

Later, health authorities were called in to study an alarming number of sheep and goats being born with deformities.

Shepherds in the area had routinely grazed their animals on the firing range.

“Lambs were born with eyes in the back of their heads,” said veterinary scientist Giorgio Mellis, one of the research team.

“I had never seen anything like it.”

One farmer told him of his horror: “I was too scared to enter the barn in the mornings … they were monstrosities you didn’t want to see.”

Researchers also found an alarming 65 per cent of the shepherds of Quirra had cancer.

The news hit Sardinia hard. It reinforced Sardinians’ worst fears while also challenging their proud international reputation as a place of unrivalled natural beauty.

The military hit back, with one former commander of the Quirra base saying on Swiss TV that birth defects in animals and children came from inbreeding.

“They marry between cousins, brothers, one another,” General Fabio Molteni claimed, without evidence.

“But you cannot say it or you will offend the Sardinians.”

General Molteni is one of the former commanders now on trial.

Years of investigation and legal inquiry led to the six generals and two colonels being charged with breaching their duty of care for the health and safety of soldiers and civilians.

After repeated attempts, Foreign Correspondent was refused interviews with senior Italian military officials and the Defence Minister.

Governments earning money by renting out ranges

Sardinia has hosted the war games of armed forces from the west and other countries since sizable areas of its territory were sectioned off after World War II.

Rome is reported to make around $64,000 an hour from renting out the ranges to NATO countries and others including Israel.

Getting precise information about what has been blown up, tested or fired at the military sites and by which countries is almost impossible, according to Gianpiero Scanu, the head of a parliamentary inquiry that reported last year.

Many, including current Defence Minister Elisabetta Trenta, have previously accused the Italian military of maintaining a “veil of silence”.

Speaking exclusively to the ABC, chief prosecutor for the region, Biagio Mazzeo, said he is “convinced” of a direct link between the cancer clusters at Quirra and the toxicity of the elements being blown up at the defence base.

But prosecuting the case against the military comes up against a major hurdle.

“Unfortunately, proving what we call a causality link — that is, a link between a specific incident and specific consequences — is extremely difficult,” Mr Mazzeo said.

What is being used on the bases?

A recent parliamentary inquiry revealed that 1,187 French-made MILAN missiles had been fired at Quirra.

This has focussed attention on radioactive thorium as a suspect in the health crisis.

It’s used in the anti-tank missiles’ guidance systems. Inhaling thorium dust is known to increase the risk of lung and pancreatic cancer.

Another suspect is depleted uranium. The Italian military has denied using this controversial material, which increases the armour-piercing capability of weapons.

But that’s a fudge, according to Osservatorio Militare, which campaigns for the wellbeing of Italian soldiers.

“The firing ranges of Sardinia are international,” said Domenico Leggiero, the research centre’s head and former air force pilot.

Whatever is blown up on the island’s firing ranges, it’s the fine particles a thousand times smaller than a red blood cell that are being blamed for making people sick.

These so-called “nanoparticles” are a new frontier in scientific research.

They’ve been shown to penetrate the lungs and enter the human body with ease.

Italian scientist Dr Antonietta Gatti has suggested a possible link between disease and industrial exposure to nanoparticles of certain heavy metals.

The World Health Organisation says a causal link is yet to be conclusively established and more scientific research needs to be done.

Dr Gatti said armaments had the potential to generate dangerous nanoparticles in fine dust because they are routinely exploded or fired at more than 3,000 degrees Celsius.

Inquiry confirms causal links

In what was labelled a “milestone”, a two-year parliamentary investigation into the health of the armed forces overseas and at the firing ranges made a breakthrough finding.

“We have confirmed the causal link between the unequivocal exposure to depleted uranium and diseases suffered by the military,” the inquiry’s head, then centre-left government MP Gianpiero Scanu, announced.

The Italian military brass dismissed the report but are now fighting for their international reputation in the court at Quirra, where the eight senior officers are on trial.

The ABC understands commanders responsible for another firing range in Sardinia’s south at Teulada could soon also face charges of negligence as police conclude a two-year investigation.

By Kitty commenting on Abe makes sales pitch for Fukushima sake at Davos:

Prime Minister Shinzo Abe and other Japanese officials toast with sake produced in Fukushima Prefecture during the Japan Night …

The real killers, the strong beta- and gamma-emitting high-level radionuclides like strontium-90, cesium-137, technetium-99, iodine-129, cobalt-60, and iridium, are present in the soil in concentrations hundreds of times higher than what officials are saying in Japan. That is easy to see from Geiger counter readings. Fukushima radionuclides can be found in very high concentrations across Japan, from Fukushima to Yokohama, based on Busby’s and Kaltofen’s sampling and analysis.

It is not simply cesium 137 that exists there.

An absorbed bolus of 80 billionths of a gram of any one of these beta-gamma radionuclides causes acute systemic poisoning and radiation poisoning. The results can be either acute death or prolonged agony and death. If a massive bolus is ingested, there will be death. These are the most poisonous and dangerous substances on earth.

If 1 ounce of any of these radionuclides (strontium-90, cesium-137, technetium-99, iodine-129, cobalt-60) were dumped on a group of people, it would be like the cesium-137 exposure in Brazil (the 1987 Goiânia accident), or worse.

If any one of these radionuclides (strontium-90, cesium-137, technetium-99, iodine-129, iridium, cobalt-60) were diluted in an inert powder, for example, that diffused the radionuclide onto 10,000 people gathered for a festival or event, three-quarters of them would die horrible deaths within two weeks, and the rest would have tumors and organ damage that would kill them in a few months.

Obviously the sailors on board the Ronald Reagan did not get such a dose but it came close for some of them.

Radiation intensity decreases with the square of the distance. Chronic ionizing radiation exposure is dangerous, but the high levels of those and other radionuclides present do not bode well for Japan in the concentrations that exist from Fukushima to Tokyo, as recorded by Busby and Kaltofen.
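As an aside, the inverse-square relationship invoked here can be sketched in a few lines; the 1-metre reference dose rate below is an arbitrary illustrative number, not a measurement:

```python
# Inverse-square law sketch: external dose rate from a compact source
# falls off with the square of the distance from it.

REFERENCE_RATE = 100.0  # arbitrary units at 1 metre (illustrative only)

def dose_rate(distance_m: float) -> float:
    """External dose rate at distance_m metres, for a point-like source."""
    return REFERENCE_RATE / distance_m ** 2

for d in (1, 2, 5, 10):
    print(f"at {d:>2} m: {dose_rate(d):6.1f} (arbitrary units)")
```

Doubling the distance quarters the external dose rate. Distance, of course, does nothing about inhaled or ingested radionuclides, which are this comment’s main concern.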

Nucleoapes like to keep eyes off the lethal radionuclides that are actually emitting the radiation.

There are also the highly potent alpha-emitting uranic and transuranic emitters like uranium-235, uranium-238, plutonium, and americium, and actinides like californium, that are destroying the human genome in Japan. The beta-gamma emitters do too, but they are not as effective and potent as mutagens and acute carcinogens, because of their solubility and other chemical properties.

The uranics, transuranics, and actinides are causing lung cancers, pancreatic cancers, and sharp increases in birth defects from mutagenesis and teratogenesis across Japan now.

A great deal of Japan’s water supply is probably heavily contaminated with tritium by now. Tritium is a strong teratogen that is known to substantially increase the incidence of leukemia. Tritium actually covalently bonds to DNA, protein, fat tissue, and muscle tissue. Unlike other radionuclides, tritium acts exactly like hydrogen does in the body, and the body is constantly doing chemical conversions of proteins using hydrogen (and tritium) ions in its metabolic, acid-base, and enzyme reactions.

The nucleoapes have gone out of their way to obscure the deadly, insidious effects of tritium on the human genome, chromosomes, and the human body.

We are bags of mostly saline water solutions, proteins, and fat, with some bone in us. When we ingest radionuclides they are sometimes diluted enough by our water and protoplasm not to cause recognizable or apparent damage and acute symptoms. It is so with the highly water-soluble salt analogs like cesium and strontium.

The uranics, transuranics, and actinides are not so soluble, because they are heavy metals. Particles of these radionuclides that lodge in the lungs and GI tract are particularly deadly. Many of these radionuclides can be biotransformed or chemically transformed into sulfates and organometallics that are easily absorbed into the body.

Then there are the evil monkeys who say that some radionuclides increase our resistance to radionuclide exposure and bioaccumulation. Don’t ya know radioactive tritium increases the incidence of leukemia, as has been shown in rigorous studies and case studies; but sure, it’s “hormetic”!

Question: What are the four most poisonous substances known to humans that are not radionuclides?

Answers:

1. Sarin gas is an organophosphate chemical weapon. 20 micrograms will kill you.

2. Botulin toxin: used cosmetically as a neuromuscular blocking agent to get rid of wrinkles, it is lethally toxic in a bolus of 150 micrograms.

Botulin toxin is used cosmetically to relax muscles and give the illusion that wrinkles are gone. It is used because of its extreme potency and duration of action.

Botulin toxin has to be highly diluted and administered by an expert for any purpose in the human body.

Botulin toxin is lethally toxic at millionths-of-a-gram concentrations. You can barely see a millionth of a gram even with a powerful microscope.

Drugs are dosed at thousandths of a gram, that is, milligrams. A milligram is a barely detectable speck on a piece of paper to the human eye.

3. 220 micrograms of ricin toxin, from castor beans, can kill a child.

4. 300 micrograms of fentanyl can kill an adult. Fentanyl analogs are even more potent.

The Moscow theater hostage crisis (also known as the 2002 Nord-Ost siege) was the seizure of a crowded Dubrovka Theater by 40 to 50 armed Chechens on 23 October 2002 that involved 850 hostages and ended with the death of at least 170 people.

It is known that the Russians used a fentanyl-like agent to try to sedate the Chechens who were holding the hostages in the theater. Unfortunately, fentanyl is very hard to dose and disperse as an aerosol. A highly toxic agent like fentanyl has to be prepared in a very special way so that only its sedative effects are manifested.

Many of the innocent hostages in the Nord-Ost siege died from poisoning by the compounded fentanyl gas used by the Russians to try to sedate the Chechens before they stormed the theater.

On the flip side of the coin, sarin, when aerosolized with a suspending agent that works and diffuses the poison in high enough concentrations, is a deadly nerve gas that will kill thousands across a few square miles with only a few weaponized canisters detonated.

The Tokyo subway sarin attack (Subway Sarin Incident) was an act of domestic terrorism perpetrated on 20 March 1995 in Tokyo, Japan, by members of the cult movement Aum Shinrikyo. In five coordinated attacks, the perpetrators released sarin on three lines of the Tokyo Metro (then part of the Tokyo subway) during rush hour, killing 12 people, severely injuring 50 (some of whom later died), and causing temporary vision problems for nearly 1,000 others. The attack was directed against trains passing through Kasumigaseki and Nagatachō, where the Diet (the Japanese parliament) is headquartered.

The Aum sarin attack in the Tokyo subways killed only 12 people. The attackers used relatively large amounts of sarin in closed, relatively small, sealed spaces.

They absolutely did not know what they were doing; otherwise they would have known that high doses of sarin have to be aerosolized in a suspending agent, like a gas that is liquid under pressure, to properly disperse enough of the agent for it to be widely spread and effectively lethal to a large group of people.

Many radionuclides, and especially the corrosive salt beta-gamma emitters and halogens like iodine-131 and iodine-129, are lethal in billionths of a gram. It even says so in toxicology profiles, because some of these radionuclides are used as radiopharmaceutical agents to treat cancer.

Billionths of a gram of any substance is not even visible with a high-powered microscope.

Radionuclides are ionizing radiation emitters, as well as being the most poisonous substances to living things on earth, indeed in the universe.

These elements are highly detectable at billionth-of-a-gram concentrations with scintillometers, gamma spectrometers, and decent pancake Geiger counters.

One of the main difficulties in proving how acutely lethal or chronically damaging radionuclides are after nuclear accidents, or with chronic exposure to nuclear waste, is the chaotic way the radionuclides disperse after catastrophes or in situ.

Think of the Russian poisoned with polonium in London. He was dosed with a nanogram amount of polonium that caused him to die a slow, painful death from systemic organ failure, for which there was no cure. He died weeks after the poisoning.

Boluses of cesium-137 or iodine-131 can kill quite quickly, or, at lower doses, can kill in prolonged agony, as the polonium killed the murdered Russian.

Who will be there to prove what caused people to die days, weeks, or a month after a large exposure? Who will speak up about causative agents after years of bioaccumulated exposure, when no one is even properly looking for the causative radionuclide or radionuclides?

How Direct Democracy Went Nuclear in Taiwan. A contentious vote on Taiwan’s nuclear future showed how the country’s public referendums went haywire. The Diplomat, by Nick Aspinwall, January 18, 2019

It only took one month for Huang Shih-hsiu, a 31-year-old nuclear energy advocate, to upend a core energy policy of Taiwan President Tsai Ing-wen. The policy, prior to its downfall, stated that Taiwan would decommission its three active nuclear power plants by 2025.

It makes for an entangled web of policy which, ideally, a direct democracy would sort out through a patient and measured process of public debate, consultation with experts, and consensus-building to avoid polarization and finger-pointing. Everyone does seem to agree on one thing, however: This did not happen in Taiwan.

Will the World Learn From Taiwan?

Matt Qvortrup, a professor of political science at Coventry University and a leading referendum expert, has watched referendums surge in popularity throughout Europe and, gradually, in corners of the world like Taiwan, whose large-scale plebiscites provided lessons for global democracies in what to do, and what not to do.

Qvortrup is a believer in referendums, but with conditions. “Democracy is discussion and deliberation,” he says, and that does not happen when voters are rushed to the polls. “To have meaningful democracy,” he says, “you need to have time to debate things.” Taiwan’s CEC-sanctioned TV debates were held in a cramped three-week window – five public forums each for 10 referendum questions.

He noted that debate on the high-interest issue of same-sex marriage dominated much of Taiwan’s already congested pre-referendum discourse, drowning out interest in the intricacies of energy policy. “That’s bad, because people will be voting on things they haven’t had the opportunity to talk about,” Qvortrup says.

Chao of RSPRC agrees, saying there was far from enough time for voters to have an informed debate. Shortly after the referendum, his center published a study showing that voters were not informed on nuclear power – most were unaware of the details of Tsai’s phaseout proposal, and 44 percent believed nuclear power provides most of the island’s energy. (It produces just over 8 percent, far behind coal-fired power.)

“For democracy to work, it has to be limited to relatively few issues,” says Qvortrup. “If you have too many issues on the ballot, people just get saturated. They turn off, they can’t be bothered. You need to save up your civic reserves.”

Taiwan’s nuclear power plebiscite was not even the only energy-related measure on the ballot: Two separate measures, both successful, called for Taiwan to reduce thermal power and stop expansion of coal-fired power plants. A measure to maintain Taiwan’s ban on food imports from the Fukushima disaster area also passed, angering Japan.

The team at Cofacts, a collaborative social media fact-checking platform that monitored online discussion leading up to the referendums, says it observed a combination of disinformation and voter apathy ahead of the energy plebiscites. “In comparison to other issues, nuclear power was one of the less popular topics,” writes Rosalind, a Cofacts editor, in an open response to questions from The Diplomat. “Even when people talked about it, they were actually talking about air pollution, reducing thermal power generation plants, new alternative energy, and polluted foods.” This did not allow voters to consider the nuances of the issues, such as whether Taiwan does in fact face a looming electricity shortage, says Rosalind.

“The people wanted to be on the ‘winning’ side of these yes/no questions, even though most of them did not know the referendum topics until the day of the election,” says Cofacts founder Johnson Liang. He notes that online discussion on nuclear power paled in comparison to talk of the same-sex marriage referendums. “There were way too many topics to vote [on] within a timespan that is too short, and they did not have time to follow the television debates.”

It takes a resonant message to cut through an overload of information and mangled discourse, and Huang Shih-hsiu had one: Nuclear Mythbusters ran with the slogan “Nuclear energy is green energy,” sizing it up against a future coal-fired dystopia and dismissing the present-day viability of affordable renewables, all while cutting through the opposing stance that nuclear power is an environmental crisis waiting to happen.

Nuclear Regulatory Commission ex-Chairman Gregory Jaczko is adamantly opposed to the idea of keeping existing nuclear reactors running as a way to offset climate change. Each reactor, he argues, is like a time bomb ready to explode if the cooling is cut off by a total station blackout, by equipment failure, by major pipe breaks, or by acts of warfare, sabotage, or terrorism. The societal dislocation caused by the spread of radioactive material over wide areas, affecting drinking water, food, and habitation for decades or centuries, is as bad as the ravages of climate change for the communities so affected.

As Chairman of the US Nuclear Regulatory Commission at the time of the Fukushima disaster, Jaczko has a unique insight into the factors that make nuclear power plants dangerous even after so-called “safe” shutdown. He knows, too, that the arguments levied against renewables are ultimately incorrect, as technology to store energy and to rechannel it is growing by leaps and bounds. Investing tens or hundreds of billions of dollars into maintaining old nuclear reactors, which are becoming increasingly dangerous as they age, is simply stealing money away from investments in the renewable revolution that is our best hope for a sustainable energy future.

Background, by Dr Gordon Edwards, January 11, 2019: http://www.ccnr.org/Jaczko_nixes_nukes_2019.pdf

Commercial nuclear power plants are water-cooled. They are fuelled by ceramic uranium fuel pellets stacked inside long narrow rods made of zirconium metal. A number of these rods are bound together into a fuel assembly — in Canada such an assembly is called a fuel bundle.

Heat is produced by splitting uranium atoms. That heat is transported by the liquid water coolant which flows past the zirconium tubes containing the fuel. The heat is used to produce steam that will turn the blades of a steam turbine to generate electricity.

As the uranium fuel undergoes nuclear fission (splitting uranium atoms), hundreds of varieties of intensely radioactive byproducts build up inside the fuel. These are (1) broken fragments of uranium atoms, called “fission products”; (2) heavier-than-uranium elements, including plutonium, called “transuranic actinides”. These byproducts are millions of times more radioactive than the original fuel.
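As a concrete illustration of where fission products come from (this particular split is just one of many possible channels, and is not taken from Edwards’ text), a uranium-235 nucleus struck by a neutron can divide like this:

$$ n \;+\; {}^{235}\mathrm{U} \;\longrightarrow\; {}^{137}\mathrm{Cs} \;+\; {}^{96}\mathrm{Rb} \;+\; 3n \;+\; \text{energy} $$

Cesium-137, one of the radioactive byproducts named later in this piece, appears directly as a fission fragment, while the freed neutrons go on to sustain the chain reaction.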

Loss of Cooling

During a severe nuclear accident, the cooling is lost. Even if the reactor has been safely shut down just beforehand, and the fission process has been totally arrested, the temperature of the fuel will still soar to destructive levels without adequate cooling.

The problem is that radioactivity cannot be shut off. The radioactive byproducts created during nuclear fission remain in the fuel, and they continue to generate heat. In the case of a 1000 megawatt reactor, immediately following shutdown, over 200 megawatts of heat continue to be generated by the ongoing atomic disintegrations of the radioactive waste byproducts. After one hour this drops to about 30 megawatts of heat, which is still a tremendous rate of thermal energy release.
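To see where figures like these come from, here is a minimal sketch in Python using the classic Wigner-Way approximation for decay heat. The 3,000 MW thermal rating is our assumption (reading “1000 megawatt” as 1000 MW of electrical output); the formula and its constants are textbook values, not from the article:

def decay_heat_fraction(t_seconds, t_operating=3.15e7):
    """Fraction of full thermal power still released t_seconds after
    shutdown (Wigner-Way approximation; t_operating = prior run time)."""
    return 0.0622 * (t_seconds ** -0.2 - (t_seconds + t_operating) ** -0.2)

P_THERMAL_MW = 3000.0  # assumed thermal rating of a "1000 megawatt" (electric) plant

for label, t in [("1 second", 1.0), ("1 hour", 3600.0), ("1 day", 86400.0)]:
    frac = decay_heat_fraction(t)
    print(f"{label} after shutdown: {100*frac:.1f}% of full power, ~{frac*P_THERMAL_MW:.0f} MW")

Run as written, this prints roughly 180 MW just after shutdown and about 30 MW after one hour, the same ballpark as the figures quoted above.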

If the coolant is no longer circulating — perhaps because of a station blackout, as at Fukushima, or due to a large pipe break followed by a failure of emergency cooling — that “residual heat” or “decay heat” will not be removed from the core of the reactor.

Make no mistake, even 30 megawatts is a lot of heat — unless it is rapidly removed, that heat is more than enough to melt the fuel and surrounding structural materials of a nuclear reactor at a temperature of 2800 degrees C (5000 degrees F). That’s more than twice the melting point of steel. It’s the beginning of a partial or total core meltdown.

Hydrogen Gas Buildup

At about 1800 degrees C (3300 degrees F), long before the fuel melts, the solid zirconium “cladding” surrounding the fuel starts to melt. Any failure of the zirconium cladding allows the escape, under high pressure, of dozens of radioactive waste byproducts that were previously trapped inside the fuel. The superheated steam that now fills the reactor vessel is suddenly infused with a multitude of radioactive gases, vapours, aerosols and ashes, all ready to be expelled into the atmosphere if there is any failure of containment.

At an even lower temperature, 700-800 degrees C, steam reacts chemically with the zirconium metal. Recall that water molecules are combinations of hydrogen and oxygen atoms (H2O). The blistering hot zirconium metal strips the oxygen out of the steam, forming zirconium oxide, while releasing all the left-over hydrogen. Hydrogen gas mixes with the steam-filled radioactively contaminated air to form an explosive mixture. Any spark will detonate the hydrogen in a devastating blast, more powerful than a natural gas explosion.
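Written out, this is the well-known zirconium-steam reaction, with the commonly quoted (and strongly exothermic) heat of reaction:

$$ \mathrm{Zr} \;+\; 2\,\mathrm{H_2O} \;\longrightarrow\; \mathrm{ZrO_2} \;+\; 2\,\mathrm{H_2}, \qquad \Delta H \approx -586\ \mathrm{kJ\ per\ mole\ of\ Zr} $$

Because the reaction releases heat of its own, it accelerates the very overheating that started it, all while pumping hydrogen into the containment.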

Such hydrogen gas explosions almost always accompany a nuclear meltdown. There were several such explosions during the partial meltdown of the NRX reactor at Chalk River, Ontario, in 1952; during the Three Mile Island partial meltdown in Pennsylvania in 1979; and during the triple meltdown at Fukushima Dai-ichi in Japan in 2011. Such explosions will often damage the containment envelope of the nuclear reactor, spewing highly radioactive materials into the outer atmosphere.

Radioactive Exposures

People, animals and plants are irradiated from above by “skyshine” from gamma-radiation-emitting gases passing overhead. Metallic radioactive vapours such as cesium-137, iodine-131 and strontium-90 will condense on vegetation, soil, buildings, skin, clothing, and surfaces of all kinds, leaving a lasting legacy of radioactive contamination, irradiating living things by “groundshine”. And these radioactive materials gradually work their way into the food chain, sometimes re-concentrating along the way, yielding contaminated crops, meat, fish, water, milk, mushrooms, berries, and much else besides. Ingesting or inhaling such materials will lead to the internal irradiation of people and animals by radioactive materials that lodge in the lungs, the bones, the blood, or the soft organs of the body.

For example, radioactive iodine condenses on pastureland, and the concentration of radioactive iodine in the grass becomes about 100 times greater than in the air above the pasture. The concentration of radioactive iodine in cow’s milk is about 100-1000 times greater than it is in the grass they eat. Then, when a young child drinks the cow’s milk, the concentration of radioactive iodine in the child’s thyroid gland is about 7-10 times greater than it is in the contaminated milk. So, a child’s thyroid can be exposed to radioactive iodine levels that are several orders of magnitude greater than that found in the contaminated air that they might breathe.
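Multiplying the concentration factors quoted above confirms that “several orders of magnitude” is the right description. A trivial Python sketch (the factors are the article’s; only the arithmetic is ours):

# Chained bio-concentration of radioactive iodine.
air_to_grass = 100               # air -> pasture grass, ~100x
grass_to_milk = (100, 1000)      # grass -> cow's milk, 100-1000x
milk_to_thyroid = (7, 10)        # milk -> child's thyroid, 7-10x

low = air_to_grass * grass_to_milk[0] * milk_to_thyroid[0]
high = air_to_grass * grass_to_milk[1] * milk_to_thyroid[1]
print(f"overall concentration factor: {low:,} to {high:,}")
# -> 70,000 to 1,000,000, i.e. five to six orders of magnitude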

Radioactive cesium accumulates in meat and fish, often making them unsuitable for human consumption. Even today, hunters in Germany and the Czech Republic are compensated by their respective governments if they kill a wild boar, because they cannot eat the meat due to radioactive cesium contamination from the Chernobyl accident 33 years ago. In Japan, wild boars in the Fukushima forested areas have levels of radioactive cesium in their bodies that are 10 to 150 times greater than the maximum permissible levels for human consumption. Boars love mushrooms, and fungi are especially adept at concentrating radioactivity.


…….Plutonium may remain in the lungs or move to the bones, liver, or other body organs. It generally stays in the body for decades and continues to expose the surrounding tissues to radiation.

Lung, liver, and bone cancer

You may develop cancer depending on how much plutonium is in your body and for how long it remains in your body. The types of cancers you would most likely develop are cancers of the lung, bones, and liver…….

The risks of mortality and morbidity from bone and liver cancers have also been studied in Mayak workers. Increasing estimated plutonium body burden was associated with increasing liver cancer mortality, with higher risk in females compared to males…….

Cardiovascular Effects. Epidemiological Studies in Humans. Possible associations between exposure to plutonium and cardiovascular disease have been examined in studies of workers at production and/or processing facilities in the United Kingdom (Sellafield)…….. Within a cohort of Sellafield workers, mortality rate ratios for plutonium workers were significantly elevated for deaths from circulatory disease and ischemic heart disease. ….


The academy’s report found, “While advanced reactor designs are sometimes held up as a potential solution to nuclear power’s challenges, our assessment of the advanced fission enterprise suggests that no US design will be commercialized before midcentury.” That’s a chilling indictment for all advanced LWRs. The crux of the Morgan report is an assessment that the economic hurdles for nuclear in the U.S. are insurmountable.………

Peter Bradford, a veteran electric utility regulator and nuclear skeptic who served on the U.S. Nuclear Regulatory Commission (NRC) from 1977 to 1982, agrees that nuclear power in the U.S. is priced out of the market. “Even if, for once, they could contain or level out the costs,” he told POWER, “new nuclear is so far outside the competitive range. They have to cut costs and they can’t cut costs without building a bunch [of reactors]. That really isn’t in the cards.”

Nor does Bradford see new nuclear as a way to combat global warming. “Even if it is scaled up much faster than anything now in prospect, it cannot provide more than 10% to 15% of the greenhouse gas displacement that is likely to be needed by mid-century. Not only can nuclear power not stop global warming, it is probably not even an essential part of the solution to global warming,” he wrote in 2006. Since then, he argues, the declining costs of renewables and energy efficiency swamp nuclear economics even further.

While advocates call for setting a price on carbon to reward carbon-free generation, Bradford said that is a weak reed. “At any given level” of carbon prices, he said, “it is going to wind up benefiting renewables and storage,” not nuclear. A reasonable carbon price, he argued, “might not be enough to keep existing plants running.”

SMRs to the Rescue?….

While smaller nuclear reactors are an appealing technological approach to keeping nuclear in the generating mix, they come with their own set of problems.

On closer inspection, said the NAS panel, “Our results reveal that while one light water SMR module would indeed cost much less than a large LWR, it is highly likely that the cost per unit of power will be higher. In other words, light water SMRs do make nuclear power more affordable but not necessarily more economically competitive for power generation.”
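The panel’s point, that one module can cost less in total yet more per kilowatt, follows from the standard scaling law for plant construction costs. The sketch below uses an illustrative reference cost and scaling exponent, which are our assumptions and not figures from the NAS report:

# Construction cost scaling: C = C_ref * (P / P_ref) ** n, with n < 1.
def overnight_cost(p_mw, c_ref=6.0e9, p_ref=1100.0, n=0.6):
    """Total construction cost (USD) scaled from a reference large LWR."""
    return c_ref * (p_mw / p_ref) ** n

for p_mw in (1100.0, 60.0):  # a large LWR vs a NuScale-sized module
    c = overnight_cost(p_mw)
    print(f"{p_mw:.0f} MW: total ${c/1e9:.1f}B, ${c/(p_mw*1000):,.0f}/kW")

With these assumptions the 60-MW module costs about one-sixth as much in total but roughly three times more per kilowatt, which is the “economic premium” the report goes on to describe.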

Given the “economic premium” of SMRs, along with “the considerable regulatory burden associated with any nuclear reactor, we do not see a clear path forward for the United States to deploy sufficient numbers of SMRs in the electric power sector to make a significant contribution to greenhouse gas mitigation by the middle of this century,” the report says. Economist Kee echoed that conclusion. When it comes to SMRs, he said there “is a lot of work to do and not much time to do it.”

SMRs also face a challenge of demonstrating their viability: Making an economic or climate impact requires many reactors. Neil Alexander, a Canadian nuclear consultant, wrote recently, “Everything about SMRs such as the cost of construction, availability of fuel, cost of shared services, availability of trained operators, and cost of research needed to resolve emerging challenges, only work economically when the unit is in a fleet. A FOAK [first-of-a-kind] cannot stand alone and the barrier to entry that the industry faces is more akin to the ‘First Dozen of a Kind.’ ”

Portland, Oregon-based NuScale appears to be the leader in developing SMR technology (Figure 4 on original). It is taking Alexander’s advice. NuScale has a customer for a 12-unit (720-MW) station: Utah Associated Municipal Power Systems (UAMPS), which has a site at the Department of Energy’s (DOE’s) Idaho National Laboratory (INL). UAMPS will own the project and Energy Northwest, a municipal joint action agency that operates the Columbia nuclear station near Richland, Washington, will run the plant. Columbia is a 1,100-MW boiling water reactor.

NuScale recently selected BWX Technologies (BWXT) of Lynchburg, Virginia, to begin engineering work leading up to the manufacture of the 60-MW NuScale reactors. BWXT, created after reactor builder Babcock & Wilcox (B&W) emerged from bankruptcy in 2006, has deep experience in the U.S. naval reactor program. NuScale has received a commitment of some $200 million from the DOE. Global engineering firm Fluor Corp. is the majority investor in NuScale.

Ironically, BWXT was the early leader in the SMR race, with its 195-MW mPower pressurized water reactor design. After spending some $400 million on the mPower venture (including $100 million from the DOE), B&W declared it officially dead in March 2017. Rod Adams, who worked on the project for B&W, had this epitaph for the mPower project, “There was simply too much work left to do, too much money left to invest, and an insufficient level of interest in the product to allow continued expenditures to clear corporate decision hurdles.”

NuScale still has a long way to go to demonstrate the validity of its SMR. The company expects the NRC to approve the NuScale reactor design in September 2020. UAMPS will also have to get NRC approval for a combined construction and operating license for the site at INL. Nonetheless, NuScale’s optimistic schedule projects commercial operation “by the mid-2020s.”

Past experience suggests that nuclear construction schedules are made to be broken. SMRs pose unique challenges to federal regulators, both in the reactor designs and in operational issues such as staffing levels and communications among 12 discrete units, particularly if they are used to follow load. Additionally, power prices in the Western U.S. are already low and natural gas is driving them lower.

Recognizing the challenges to deploying SMRs, the DOE in November issued a report suggesting state standards and incentives, modeled on those boosting renewables, be applied to SMR technology. But, as POWER reported, “To make a meaningful impact, nearly $10 billion in incentives would be needed to deploy 6 GW of SMR capacity by 2035.”
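For scale, a quick division (ours, not POWER’s) shows what that incentive level means per unit of capacity:

incentives_usd = 10e9   # ~$10 billion in incentives
capacity_kw = 6e6       # 6 GW of SMR capacity by 2035
print(f"${incentives_usd / capacity_kw:,.0f} of subsidy per kW")
# -> about $1,667 per kW of SMR capacity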

Beyond the LWR?

Several efforts are underway to replace conventional LWRs with other approaches to splitting atoms to generate power. Admittedly long shots, these technologies go back to the early days of civilian nuclear power and were previously abandoned in favor of the proven LWR designs.

The highest profile of the LWR apostates is TerraPower, based in Bellevue, Washington, and backed by Microsoft founder and multi-billionaire Bill Gates. [ Ed note: TerraPower has now abandoned this joint project with China] Founded in 2006, TerraPower is working on a liquid-sodium-cooled breeder-burner machine that can run on uranium waste, while it generates power and plutonium, with the plutonium used to generate more power, all in a continuous process.

Liquid sodium has advantages over pressurized water as a coolant, including better heat transfer. It also does not act as a moderator to slow neutrons, which allows for breeding plutonium. But sodium coolant has its own set of problems: it catches fire on exposure to oxygen, so coolant leaks can be devastating, as has happened in the past.
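The chemistry behind that fire hazard is elementary: hot sodium burns on contact with air, and with water it reacts violently, liberating hydrogen that can itself ignite:

$$ 2\,\mathrm{Na} \;+\; 2\,\mathrm{H_2O} \;\longrightarrow\; 2\,\mathrm{NaOH} \;+\; \mathrm{H_2} $$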

Nuclear power father Adm. Hyman Rickover, after a bad experience with the sodium-cooled reactor in the USS Seawolf (the second US nuclear submarine, after the light-water-powered USS Nautilus), commented that sodium-cooled systems were “expensive to build, complex to operate, susceptible to prolonged shutdown as a result of even minor malfunctions, and difficult and time-consuming to repair.” TerraPower hopes to have commercial machines operating in the late 2020s, but industry insiders have reported that the company’s prototype reactor being built in China has experienced major problems.

Another approach to bypassing LWRs is the molten salt reactor, long a favorite of nuclear pioneer Alvin Weinberg. A Canadian firm, Terrestrial Energy, is pushing a 190-MW SMR design using the technology Weinberg developed at Oak Ridge National Lab in the mid-1960s. Molten salt technology operates at close to atmospheric pressure and combines the fuel and the coolant. Terrestrial plans to use the technology to power an SMR, with a target date in the late 2020s. Molten salt poses new engineering challenges for nuclear reactors. One nuclear observer commented, “I prefer solid fuel” to the liquid fuel-coolant in the molten salt reactor.

Finally, developers are looking at abandoning uranium as the primary nuclear fuel. Instead, the idea is to use thorium, a slightly radioactive metal that is several times more abundant in the Earth’s crust than uranium. But thorium is not fissile—able to undergo nuclear fission—so it has to be irradiated with neutrons, in practice supplied by enriched uranium, in order to be transmuted into fissile U-233.
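The transmutation route, well established since the 1960s, runs through two short-lived intermediates (the half-lives shown are standard values, not from the article):

$$ {}^{232}\mathrm{Th} \;+\; n \;\longrightarrow\; {}^{233}\mathrm{Th} \;\xrightarrow{\ \beta^-,\ \sim 22\ \text{min}\ }\; {}^{233}\mathrm{Pa} \;\xrightarrow{\ \beta^-,\ \sim 27\ \text{days}\ }\; {}^{233}\mathrm{U} $$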

Thorium’s chief attribute is that the fuel is so plentiful. Terrestrial Energy has shown interest in using thorium in its molten salt reactors, along with low-enriched uranium that is used in the design it is pursuing in Canada. Skeptics suggest that thorium is an answer in search of a question, given the easy availability of uranium, particularly in seawater. Uranium shortages, forecast in the 1960s when advocates first suggested using thorium, have never materialized.

The Union of Concerned Scientists (UCS) is currently wrapping up a study of the new, non-LWR reactor designs. Physicist Ed Lyman, a veteran UCS staffer, told POWER, “Our overall conclusion is that vendors, DOE, and advocates are greatly exaggerating the benefits” of the technologies. “The whole landscape is not compelling. We question whether the best direction for nuclear power is to go off on these more exotic tangents,” rather than focus on making LWRs cheaper and safer. “That’s potentially a better near term” investment, he said.

The original generations of civilian nuclear power failed to live up to their promises. The U.S. nuclear industry stalled in the mid-1970s and has not recovered, despite repeated government and industry attempts at a restart.

Gen III reactors were aimed at overcoming the perceived safety and economic shortcomings of the original machines. As those new designs appear to be falling short, attention has shifted to SMRs or new approaches that abandon traditional light-water technology. Whether they will live up to their billing remains a serious, open question. ■

UNIVERSITY OF BRISTOL: MELTING ICE SHEETS RELEASE TONS OF METHANE INTO THE ATMOSPHERE, STUDY FINDS

The Greenland Ice Sheet emits tons of methane according to a new study, showing that subglacial biological activity impacts the atmosphere far more than previously thought.

An international team of researchers led by the University of Bristol camped for three months next to the Greenland Ice Sheet, sampling the meltwater that runs off a large catchment (>600 km²) of the Ice Sheet during the summer months.

As reported in Nature, using novel sensors to measure methane in meltwater runoff in real time, they observed that methane was continuously exported from beneath the ice.

They calculated that at least six tons of methane was transported to their measuring site from this portion of the Ice Sheet alone, roughly the equivalent of the methane released by up to 100 cows.

Professor Jemma Wadham, Director of Bristol’s Cabot Institute for the Environment, who led the investigation, said: “A key finding is that much of the methane produced beneath the ice likely escapes the Greenland Ice Sheet in large, fast flowing rivers before it can be oxidized to CO2, a typical fate for methane gas which normally reduces its greenhouse warming potency.”

Methane gas (CH4) is the third most important greenhouse gas in the atmosphere after water vapour and carbon dioxide (CO2). Although present in lower concentrations than CO2, methane is approximately 20-28 times more potent, so smaller quantities have the potential to cause disproportionate impacts on atmospheric temperatures. Most of the Earth’s methane is produced by microorganisms that convert organic matter to CH4 in the absence of oxygen, mostly in wetlands and on agricultural land, for instance in rice paddies and the stomachs of cows. The remainder comes from fossil fuels like natural gas.
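Putting the study’s six tonnes together with the potency range quoted here gives a rough CO2-equivalent. A back-of-envelope Python sketch (the 20-28x range is the article’s; the multiplier depends on the time horizon chosen):

ch4_tonnes = 6              # methane measured at the sampling site
for gwp in (20, 28):        # potency range quoted above
    print(f"at {gwp}x potency: ~{ch4_tonnes * gwp} tonnes CO2-equivalent")
# -> roughly 120-170 tonnes of CO2-equivalent from this one catchment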

While some methane had been detected previously in Greenland ice cores and in an Antarctic Subglacial Lake, this is the first time that meltwaters produced in spring and summer in large ice sheet catchments have been reported to continuously flush out methane from the ice sheet bed to the atmosphere.

Lead author, Guillaume Lamarche-Gagnon, from Bristol’s School of Geographical Sciences, said: “What is also striking is the fact that we’ve found unequivocal evidence of a widespread subglacial microbial system. Whilst we knew that methane-producing microbes likely were important in subglacial environments, how important and widespread they truly were was debatable. Now we clearly see that active microorganisms, living under kilometres of ice, are not only surviving, but likely impacting other parts of the Earth system. This subglacial methane is essentially a biomarker for life in these isolated habitats.”

Most studies on Arctic methane sources focus on permafrost, because these frozen soils tend to hold large reserves of organic carbon that could be converted to methane when they thaw due to climate warming. This latest study shows that ice sheet beds, which hold large reserves of carbon, liquid water, microorganisms and very little oxygen – the ideal conditions for creating methane gas – are also atmospheric methane sources.

Co-researcher Dr Elizabeth Bagshaw from Cardiff University added: “The new sensor technologies that we used give us a window into this previously unseen part of the glacial environment. Continuous measurement of meltwater enables us to improve our understanding of how these fascinating systems work and how they impact the rest of the planet.”

With Antarctica holding the largest ice mass on the planet, researchers say their findings make a case for turning the spotlight to the south. Mr Lamarche-Gagnon added: “Several orders of magnitude more methane has been hypothesized to be capped beneath the Antarctic Ice Sheet than beneath Arctic ice-masses. Like we did in Greenland, it’s time to put more robust numbers on the theory.”

This study was a collaboration between Bristol University, Charles University (Czechia), the National Oceanography Centre in Southampton, Newcastle University, the University of Toronto (Canada), the Université Libre de Bruxelles (Belgium), Cardiff University (UK), and Kongsberg Maritime Contros (Germany). It was funded by the Natural Environment Research Council (NERC), with additional funds from the Leverhulme Trust, the Czech Science Foundation, the Natural Sciences and Engineering Research Council of Canada, and the Fond de Recherche Nature et Technologies du Québec (Canada).

Highlights

• We respond to a recent article that is critical of the feasibility of 100% renewable-electricity systems.

• Based on a literature review we show that none of the issues raised in the article are critical for feasibility or viability.

• Each issue can be addressed at low economic cost, while not affecting the main conclusions of the reviewed studies.

• We highlight methodological problems with the choice and evaluation of the feasibility criteria.

• We provide further evidence for the feasibility and viability of renewables-based systems.

Abstract

A recent article ‘Burden of proof: A comprehensive review of the feasibility of 100% renewable-electricity systems’ claims that many studies of 100% renewable electricity systems do not demonstrate sufficient technical feasibility, according to the criteria of the article’s authors (henceforth ‘the authors’). Here we analyse the authors’ methodology and find it problematic. The feasibility criteria chosen by the authors are important, but are also easily addressed at low economic cost, while not affecting the main conclusions of the reviewed studies and certainly not affecting their technical feasibility. A more thorough review reveals that all of the issues have already been addressed in the engineering and modelling literature. Nuclear power, which the authors have evaluated positively elsewhere, faces other, genuine feasibility problems, such as the finiteness of uranium resources and a reliance on unproven technologies in the medium- to long-term. Energy systems based on renewables, on the other hand, are not only feasible, but already economically viable and decreasing in cost every year…………..

5. Conclusions

In ‘Burden of proof: A comprehensive review of the feasibility of 100% renewable-electricity systems’ [73] the authors called into question the feasibility of highly renewable scenarios. To assess a selection of relevant studies, they chose feasibility criteria that are important, but not critical for either the feasibility or viability of the studies. We have shown here that all the issues can be addressed at low economic cost. Worst-case, conservative technology choices (such as dispatchable capacity for the peak load, grid expansion and synchronous compensators for ancillary services) are not only technically feasible, but also have costs which are an order of magnitude smaller than the total system costs. More cost-effective solutions that use variable renewable generators intelligently are also available. The viability of these solutions justifies the focus of many studies on reducing the main costs of bulk energy generation.

As a result, we conclude that the 100% renewable energy scenarios proposed in the literature are not just feasible, but also viable. As we demonstrated in Section 4.4, 100% renewable systems that meet the energy needs of all citizens at all times are cost-competitive with fossil-fuel-based systems, even before externalities such as global warming, water usage and environmental pollution are taken into account.


Changing climate change: “2040” paints an optimistic picture of the future of the environment

The film focuses on technological and agricultural solutions that are already being implemented to help combat climate change, The Economist Feb 19th 2019

by C.G. | BERLIN ……….In “2040”, a documentary which premiered at the Berlinale, Mr Gameau seeks to wrest hope from the bleak reports of climate change. He was inspired by Project Drawdown, the first comprehensive plan to reverse global warming, and the film is intended as a “virtual letter to his four-year-old daughter to show her an alternative future”. Many films, Mr Gameau thinks, are too dystopian and “paint a future that is really hard to engage and to connect with”. “2040” acknowledges that the Earth has set off down a hazardous path, but focuses on the work that is being done now to steer the right course. What, the film asks, could make 2040 a time worth living in?…. (subscribers only) https://www.economist.com/prospero/2019/02/19/2040-paints-an-optimistic-picture-of-the-future-of-the-environment