Monthly Archives: April 2014

Enenews has a story today which includes a summary of a presentation by Ted Wyka, chairman of the Department of Energy’s Accident Investigation Board, at yesterday’s WIPP town hall meeting. A video of the meeting is available here.

Wyka was delivering a summary of a 300-page report which was released today in this large pdf.

Starting around the 1:04 mark in the video, Wyka says the bypass dampers of the ventilation system, which bypass the Station A filter, were the main source of the leak. This bypass system is left over from the initial excavation of the mine. It should have been removed when they started putting transuranic waste in the mine.

Instead, it was leaking americium and plutonium until March 6, when workers applied foam and blocked the bypass system.

Significant improvements in the safety strategy for WIPP are warranted to address design basis accidents that lead to radiation releases. For example, neither the filtered ventilation system nor the underground air monitor that triggered the ventilation system to switch to filtered mode is a credited safety system. In fact, for six days after the fire, no underground air monitors were operational. Had there been a failure on February 14 of the air monitor or filtered ventilation system, or if the release event had occurred three days earlier, the release of radioactive material from the aboveground mine exhaust would have been orders of magnitude larger. (emphasis mine)

For those who don’t know, an order of magnitude is a power of ten. So:

One order of magnitude larger = 10 times larger
Two orders of magnitude larger = 100 times larger
Three orders of magnitude larger = 1,000 times larger
Four orders of magnitude larger = 10,000 times larger

And so on. Since there was a failure of the ventilation system, according to Winokur the plutonium and americium release would have been at least 100 times larger than we have been told. Maybe even thousands of times larger.
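The rule in the list above can be sketched in a couple of lines of Python (my own illustration):

```python
# n orders of magnitude = a factor of 10**n
for n in range(1, 5):
    print(n, "orders of magnitude =", 10 ** n, "times larger")
```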

Now, according to the Wyka report, the exhaust air from the bypass dampers would have been routed through Station B, which contains filters that are used to test for the presence of radionuclides. (It doesn’t block or filter the exhaust, though.) So the plutonium and americium from the unblocked stream of contaminated air should have shown up at Station B.

But the Station A and B results show that the concentrations of americium and plutonium are consistent with the exhaust blowing through Station A, with its HEPA filter removing 99.97% of these particles, and this reduction was reflected at Station B, which measures what leaked out into the environment. Indeed, this was touted as evidence that the ventilation system was working properly.
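As a sanity check on the “orders of magnitude” claim (my own back-of-the-envelope arithmetic, not from the report): a filter that removes 99.97% of particles passes only 0.03% through, so an unfiltered release would be roughly 3,300 times larger, about three and a half orders of magnitude:

```python
import math

# A HEPA filter removing 99.97% of particles lets 0.03% through.
penetration = 1 - 0.9997  # fraction of particles passing the filter

# An unfiltered release would be larger by the reciprocal of that fraction.
unfiltered_factor = 1 / penetration
print(round(unfiltered_factor))                 # roughly 3333 times larger
print(round(math.log10(unfiltered_factor), 1))  # about 3.5 orders of magnitude
```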

Where is the unblocked plutonium???

It sounds to me like there is either another hole that the exhaust came up through, or the Station B results are fabricated.

This whole thing is a result of gross negligence and incompetence on the part of the Department of Energy, and its contractor Nuclear Waste Partnership. On top of it, they exhibit gross mendacity, lying about radiation, like Tepco does.

In reports of levels of radioactive substances in air, water, food, or sludge, you will sometimes find the letters “ND”. This means “not detected”, and it occurs when the measured level is less than a certain specified value. This value and its use are either (in the best case) misleading, based on lab conventions which have nothing to do with the risk to human health… or (worse) flat-out wrong, based on a mathematical error made in 1947 and still being used today in some places.

This table contains the latest measurements of I-131, Cs-134 and Cs-137 from Tokyo sludge. As you can see, under the column “Radioactive iodine 131” and row “Hachioji” there is a value of 18. This means that I-131 was detected in sludge at Hachioji, at 18 becquerels per kilogram. Next to “Eastern sludge plant” it says “Not detected (<29)”. That means that the measurement of I-131 was below 29, the cutoff for detection of radioactive iodine.

But it does NOT mean that there was no I-131 in sludge at the Eastern plant.

Lloyd A. Currie, of the National Institute of Standards and Technology, is a leading expert on the measurement of chemicals and radionuclides. His work on measurement was adopted by the International Union of Pure and Applied Chemistry (IUPAC) in 1995. A description of his work on detection and quantification limits can be found here, and another version here.

THE ZERO MYTH

In many cases the lay public believes, given sufficient effort or funding, that a concentration of zero may be detected and/or achieved. Not unlike the third law of thermodynamics, however, neither is possible, even in concept. A policy of reporting “zero” when L < LC, yielding the decision “not detected”, compounds this lack of understanding. These are issues of major national importance, especially in the context of legislation and regulation, where necessarily, and appropriately, many of the policy makers have critical sociopolitical expertise, but not necessarily scientific or technical expertise. The solution to this sociotechnical dilemma is, once again, very careful communication; and, beyond that, mutual understanding and education among the complementary disciplines.

Stating that a radionuclide is “not detected” does not mean that the isotope is not in the sample. A concentration of zero is both a practical and conceptual impossibility; there is no such thing as “zero plutonium” in a sample. The concept “not detected” is appropriate for lab work. It is not appropriate when human health is involved, except as a convenience.

1. There is no safe dose of radiation.
2. It is impossible to say there is no radiation in a sample of air, water, or food.

Point #2 is as important as point #1.

THE KAISER METHOD

Kaiser developed a method for determining these detection thresholds in 1947. It was in use for chemicals and radionuclides until 1995, when it was superseded by the Currie method. But the Kaiser method is still in use today in many areas. One slight problem: Kaiser ignored the problem of false negatives and based the entire method on false positives. In other words, he was totally dedicated to getting rid of results that showed there was radiation when there really wasn’t any. He completely ignored the situation where results showed there was no radiation when it was really there. This means that the probability of, say, plutonium in a sample is (a de facto value of) 50%, even when it is “not detected”!
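To see where that 50% figure comes from, here is a quick simulation (my own sketch with made-up numbers, assuming normally distributed measurement noise): if the decision threshold is set only to control false positives, a sample whose true concentration sits exactly at that threshold will read below it half the time, and so be reported “not detected” half the time.

```python
import random

random.seed(1)
sigma0 = 1.0          # assumed standard deviation of the measurement noise
L_C = 1.645 * sigma0  # threshold set for ~5% false positives on blank samples

true_level = L_C      # sample genuinely contains activity right at the threshold
trials = 100_000
missed = sum(random.gauss(true_level, sigma0) < L_C for _ in range(trials))
print(missed / trials)  # roughly 0.5: half of real positives come back "ND"
```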

I was gobsmacked when I read this. It is baffling that this could even be derived… I thought you would end up dividing by zero. It is an egregious error, one that a student in Statistics 101 would never make. And it is still being used today in many places.

This means that any study of samples of radionuclides or chemicals that depends on a “detected” or “not detected” decision made with the Kaiser method is WRONG. The error means that these detection thresholds are too high. All studies from 1947 to 1995 used this method. Environmental laws have been passed, policy decisions have been made, all based on invalid data due to a mathematical error.

It’s not just radiation. Whether dioxin is in cornflakes, whether arsenic is in chicken, everything like this has been affected.

What a coincidence that this error has served the interests of nuclear and chemical polluters for all those years.

I have seen output from reputable radiation labs that use the Currie method. Whether Tepco uses it, I doubt. They admitted to publishing false radiation data from Fukushima for two years.

Perhaps the most serious terminological trap has been the use of the expression “detection limit” (a) by some to indicate the critical value (LC) of the estimated amount or concentration above which the decision “detected” is made; but (b) by others, to indicate the inherent “true” detection capability (LD) of the measurement process in question. The first, “signal/noise” school, explicitly recognizes only the false positive (α, Type-1 error), which in effect makes the probability of the false negative (β, Type-2 error) equal to 50%. The second, “hypothesis testing” school employs independent values for α and β, commonly each equal to 0.05 or perhaps 0.01.

The “signal/noise” school is the Kaiser method. The “hypothesis testing” school actually considers whether radiation is really there when the decision “not detected” is made.
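Here is a sketch of the “hypothesis testing” calculation (my own illustration, assuming a normal measurement distribution with constant standard deviation sigma0; `currie_limits` is a name I made up, not an official API):

```python
from statistics import NormalDist

def currie_limits(sigma0, alpha=0.05, beta=0.05):
    """Return (L_C, L_D): the critical level and the detection limit."""
    z_alpha = NormalDist().inv_cdf(1 - alpha)  # controls false positives
    z_beta = NormalDist().inv_cdf(1 - beta)    # controls false negatives
    L_C = z_alpha * sigma0         # decide "detected" above this level
    L_D = L_C + z_beta * sigma0    # true level that is reliably detected
    return L_C, L_D

L_C, L_D = currie_limits(sigma0=1.0)
print(round(L_C, 2), round(L_D, 2))  # 1.64 3.29
```

With the default α = β = 0.05, the critical level works out to about 1.64 standard deviations and the detection limit to about 3.29.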

ADJUSTING THE PARAMETERS OF THE CURRIE METHOD TO REFLECT ETHICAL CONCERNS

Note that the values of α, β, and σQ given above are IUPAC recommended default values, which serve as a common basis for measurement process assessment. They may be adjusted appropriately in particular applications where detection or quantification needs are more or less stringent.

While the Currie method is correct, the default parameters are appropriate only for lab work. In the context of risk to human health, the values of α and β need to be adjusted. False positives are much less to be feared than false negatives. Consider a glass of milk which is to be given to your child. Which is worse: being told that there is plutonium in the milk when there isn’t (false positive)… or being told there is no plutonium in the milk when there really is (false negative)?

So α and β should not be equal. The precautionary principle states that the burden of proof is on polluters when environmental contamination is plausible (not necessarily proved). We are not looking for proof that plutonium is in the milk; we are looking for proof that plutonium ISN’T in the milk, during a radiation catastrophe like Fukushima (and WIPP).

A more appropriate value for β would be .001, which means there is a one-tenth of one percent probability that plutonium is in the milk when it is “not detected”. A value of maybe .25 is appropriate for α: a 25% chance that plutonium is not there when it is “detected”. An even higher value might be better, but at 50% it would become the Kaiser method the other way around.

These values can then be used to derive more appropriate minimum detection limits, by straightforward mathematical analysis, with the assumption of a normal distribution. Currie states that this is a special case. A distribution-free method like ODA would be even better.
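Plugging those precautionary values in (my own sketch, again assuming a normal distribution with a constant standard deviation, here set to 1 in arbitrary units):

```python
from statistics import NormalDist

sigma0 = 1.0  # assumed measurement standard deviation, arbitrary units

# Precautionary parameters: strict on false negatives, loose on false positives.
z_alpha = NormalDist().inv_cdf(1 - 0.25)   # ~0.674
z_beta = NormalDist().inv_cdf(1 - 0.001)   # ~3.090

L_C = z_alpha * sigma0             # decision threshold for "detected"
L_D = (z_alpha + z_beta) * sigma0  # detection limit
print(round(L_C, 2), round(L_D, 2))  # 0.67 3.76

# For comparison, the conventional alpha = 0.05 threshold:
print(round(NormalDist().inv_cdf(0.95) * sigma0, 2))  # 1.64
```

The effect is that the “ND” decision threshold drops (from about 1.64 to about 0.67 standard deviations), so far fewer genuinely contaminated samples get waved through as “not detected”.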

Currie and IUPAC recommend that all measured radiation values and uncertainties be published. This means that the table above, with the Tokyo sludge measurements, does not conform to current scientific protocols, since it omits the I-131 values below the “detection limit” and does not publish the uncertainties.

I’ve been sick with viruses twice since WIPP. I also became disabled for two days, unable to walk. The skin symptoms are worse than at any point except May-June 2012. I’m putting this period in 3rd place after Fukushima in terms of health problems. The post-WIPP period is now ranked worse than the period of the initial Fukushima plume in 2011.

This latest virus has knocked me for a loop. My son had it a week and a half and he is still coughing. I’ve had it for a week. I think my strength is coming back some… my ears are still ringing though.

Prior to this, I had another bad bout with the flu a month ago. The skin eruption on my hands, which had been developing since January, and which got worse in February, got even worse with this virus. I did a little research and found that pomace (the seeds, stems and skins from grapevines) reduces the level of interleukin-6. IL-6 is inflammatory (usually), and works together with the cytokine TGF-β to differentiate raw T-cells into inflammatory Th17 cells, which is associated with psoriasis (and many other inflammatory disorders). IL-6 is produced when the immune system detects antigens (such as viruses and plutonium particles). Well, I purchased some Italian grappa, which is pomace brandy, and to my delight it cleared up the skin eruption of my hands around 90%. You see, almost nothing works, and to find something that helped is very welcome.

Well, along came the new virus. My son got sick. A funny thing happened at the time, the skin of my fingertips started puckering like I had gone for a dip in a swimming pool. This turns out to be a result of vasoconstriction of blood vessels in the fingers. Apparently my immune system was fighting something off. Then I came down with the illness.

Now, not only is the skin eruption back on my hands, but it has spread to the rest of my body. I’ve got what looks like corns on every one of my 10 toes. And the grappa doesn’t work any more.

But what is most concerning is the red spots under the nail cuticles… and pitting and hardening of skin unaffected by the psoriasis. This is a new autoimmune disease process that wasn’t there before.

Systemic sclerosis or systemic scleroderma is an autoimmune or connective tissue disease. It is characterized by thickening of the skin caused by accumulation of collagen, and by injuries to the smallest arteries. There are two overlapping forms. Limited cutaneous scleroderma is limited to the skin on the face, hands and feet. Diffuse cutaneous scleroderma covers more of the skin, and is at risk of progressing to the visceral organs, including the kidneys, heart, lungs and gastrointestinal tract.

Survival is determined by the severity of visceral disease. Prognosis is difficult to predict until the disease differentiates into recognizable subsets. Patients with limited cutaneous scleroderma have a good prognosis, with 10-year survival of 75%, although <10% develop pulmonary arterial hypertension after 10 to 20 years. Patients with diffuse cutaneous scleroderma have a 10-year survival of 55%. Death is most often from pulmonary, heart and kidney involvement, although survival has greatly improved with effective treatment for kidney failure. (link)

I am not saying I have systemic sclerosis (SSc); not all the symptoms that would meet the diagnostic criteria are there. But it is trending in that direction. It may be another condition related to it. The important thing is that SSc is characterized by overproduction of the cytokine TGF-β. Earlier I mentioned that excessive IL-6 from foreign antigens (like plutonium) was causing TGF-β to differentiate too many inflammatory Th17 cells. Well, it appears that I now have excessive TGF-β too. It explains the massive skin breakout.

… it is possible that the psoriasis, or its treatment, may have triggered the development of SSc, presumably in a susceptible host. A variety of immunological changes have been well described in patients with psoriasis. Both psoriasis and SSc are predominantly characterized by skin involvement, and it could be hypothesized that the skin abnormalities in psoriasis may have altered the passage and processing of foreign antigen through the skin, thus triggering the development of scleroderma. (link)

The skin breakout changed the way that the antigens were passing through and out of my skin. In addition to the radioactive material, there are viruses coming out, and also apoptotic debris from all the carnage from the immune system. This has triggered a whole new disease process.

This whole thing started in January. I was getting thyroid symptoms then… and the December event at Fukushima had something to do with it. Then WIPP came along in February and it got much worse. The grappa helped but then influenza came along and settled the issue.

In a football analogy, the quarterback FUKUSHIMA starts running, pitches the ball to the running back WIPP, who gets stood up just before the goal line by the safety GRAPPA… but then the lineman INFLUENZA comes up from behind and pushes the pile over the goal line. Disease 7, me 0.

I wanted to add some new results of iodine-131 in Japanese sludge. I-131 in Chiba has shown an uptick, but is still below the levels of December through February. This is consistent with the recent upturn in visible activity from the Fukushima webcams.

Tokyo iodine levels are still elevated, but below the peak in late January.