Dear Scott Pruitt – Please establish modern scientific basis for radiation regulations
Scientists for Accurate Radiation Information (SARI) recently delivered a petition to Scott Pruitt, the new Administrator for the Environmental Protection Agency (EPA).
The letter, signed by 34 members or associate members of SARI, requests that Mr. Pruitt direct his Agency to revise the basis of risk-based radiation regulations. SARI members believe that regulations should be on a firm foundation of sound science that is aimed at the best possible protection of human health and the environment.
The letter includes specific action requests and justifications that are based on solid foundations and references.
Aside: Unfortunately, several of the references for SARI’s letter are behind some discouragingly high academic journal paywalls. Please contact us if you need access to any specific reference. End Aside.
Since the EPA’s inception, responsible decision makers have resolutely defended its initial choice of a simplistic, straight-line model that asserts the harmful effects of radiation or chemicals observed at extremely high doses persist, in proportion, no matter how low the dose becomes, all the way down to absolute zero.
The linear, no threshold assertion acknowledges that reducing doses reduces the magnitude of the effects, but doesn’t acknowledge any point at which the effects disappear entirely or become so small as to fall below any level of concern.
After choosing that simplistic model for radiation because it appeared to be an easy [cheap] way to compute numerical limits, the EPA later expanded the application of the “no safe dose” assertion to numerous manufactured chemical compounds and even to elements like lead, arsenic, and radon.
By firmly asserting that there is always some kind of negative effect of exposure to regulated materials or to radiation, the EPA created a situation where there has been continuing pressure to “increase safety” or “improve environmental cleanliness” by reducing limits.
The pressure to reduce limits extends to situations where there is no experimental evidence of harm when humans or animal stand-ins for humans have been exposed to materials at or near the regulated limit. It even extends to situations where solid experimental evidence indicates that subjects receiving the radiation dose or regulated chemical exposure have better outcomes than the “control” populations that do not receive the dose or exposure.
Limits that are continually ratcheted to lower and lower levels as sensing equipment capabilities improve can add enormous cost burdens without any improvement in overall outcome. Clean-up equipment effectiveness has not progressed at the same rate as sensing technology capability. Devotedly cleaning selected locations to meet overly ambitious standards exhausts available resources and leaves many more concerning areas untouched and waiting for their allocation of funding.
The other detrimental effect of government regulations based on assuming that any exposure at all carries a probability of harm is that it provides a seemingly rational basis for irrational fear, uncertainty and doubts about safety. People who believe that they “see” ghosts or assume that there are creatures in the dark just waiting to pounce cannot lead normal, healthy lives.
SARI believes that the Environmental Protection Agency must stop using a similar assumption that harmful effects always exist, even when radiation doses or chemical exposures approach absolute zero. Their petition letter concludes as follows.
LNT-based radiophobia fuels needless evacuations, results in extraordinary environmental cleanup costs, inspires avoidance of life-saving medical procedures, produces pressure to lower the diagnostic quality of radiation-related medical imaging, and promotes nuclear fear. Considerations of the basic sciences of biology, physics, chemistry, and other natural sciences should be either the source or the final arbiter of scientific hypotheses about ionizing radiation. Epidemiological studies that identify associations with disease do not prove causation. Many of the key studies often referenced in support of the LNT suffer scientific flaws(1): they ignore the manifold findings of those basic sciences and base their conclusions on the precautionary principle (rather than the precautionary approach) that radiation exposure must be proven safe for it to be considered safe. This is an impossible task and not consistent with sound scientific principles. Failure to take proven biological reality into account leads to counterproductive statistical exercises, sometimes fraught with numerous errors. It further leads to the appearance of erudition purely through mathematical complexity. These studies are not benign; they do not err on the safe side; and they have deadly consequences.
Thus, we ask that the EPA’s risk-based radiation regulations be revised as above, as soon as possible.
Disclosure: I am a member of SARI and a signatory to the letter.
I don’t think it is enough to cry “down with LNT”.
It’s a little like Obamacare, you need a replacement.
See http://thorconpower.com/docs/lognt.pdf for one possibility.
I agree that we need a replacement, but the SARI letter and its referenced material does more than simply cry “down with LNT”.
Your Logistic model has some intrigue for an engineer, but it is not much more representative of the biological response of living organisms than the LNT model.
So, what should Scott do?
First, Scott should not start a 10-year study of low dose radiation. Been there. Done that.
Second, SARI needs to reach agreement on the replacement language.
The Logistic NT model does not pretend to accurately model the complex biology associated with radiation damage and repair, but to imply that it is no better than LNT is crazy talk. Logistic NT is consistent with both the cancer risk we see at acute doses above 100 mSv and the lack of any discernible risk in high background populations. LNT is not. At low dose rates, the policy implications are totally different.
You need a regulatory framework that is simple enough for lawyers to implement. That is LNT’s only plus, but it’s a very big plus. If somebody can come up with a simpler model than Logistic NT which does a better job of modelling risk, then I will be the first to applaud.
Political feasibility is also paramount. To change regulations, you are going to need at least a portion of the radiation establishment to come on board. In that regard, I believe hormesis is DOA. Logistic NT gives the establishment the No Threshold property it insists on.
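The contrast between the two curves can be sketched numerically. This is only an illustration: the LNT slope and the logistic parameters below are hypothetical values chosen for the sketch, not the coefficients from the thorconpower.com paper linked above.

```python
import math

def risk_lnt(dose_sv, slope=0.05):
    # LNT: excess risk strictly proportional to dose at every dose.
    # A slope of ~5% per Sv is in the range of commonly cited nominal values.
    return slope * dose_sv

def risk_logistic_nt(dose_sv, r_max=0.1, d50=1.0, k=3.0):
    # Illustrative sigmoid-in-log-dose curve: risk -> 0 as dose -> 0
    # (so it is still "No Threshold" in the strict sense), but it falls
    # off far faster than a straight line below ~100 mSv.
    # All three parameters here are made up for this sketch.
    if dose_sv <= 0:
        return 0.0
    return r_max / (1.0 + math.exp(-k * (math.log(dose_sv) - math.log(d50))))

for d in (0.01, 0.1, 1.0):  # 10 mSv, 100 mSv, 1 Sv acute
    print(f"{d * 1000:6.0f} mSv   LNT: {risk_lnt(d):.5f}   Logistic NT: {risk_logistic_nt(d):.7f}")
```

At 10 mSv the straight line still predicts a small but nonzero excess risk, while the logistic curve’s prediction is orders of magnitude smaller, which is why the two models point toward very different low-dose policies.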
I apologize for stating that the Logistic model is no better than the LNT, but it does share a common flaw: it indicates that there is never a completely safe dose, because the curve only touches zero risk at zero dose. Acceptance of that model will always provide the opportunity for opponents to claim that the established “science” indicates there is always the possibility of harm, with no possibility of direct benefit from radiation exposure other than as a treatment for a far worse disease.
Experimental evidence shows that the true biological response is hormetic over a wide range of endpoints for both small and large population samples. (Obviously the large samples are not human – yet.) That biological response has a threshold at which there is no measurable harm and no measurable benefit.
All doses below that threshold are adequately safe. There is even the potential that some benefit can accrue and that potential can be effectively studied with human subjects because there is no reason to assume it’s harmful to purposely expose people to doses below the threshold.
Once that hormetic response is established as the real, scientifically understood model, regulations can be simple and revert to the 1934 tolerance dose model. That model formed the basis for adequate protection during the Manhattan Project and up until the early 1950s, when Muller’s sponsored point of view began to have influence.
At that point, the Rockefeller Foundation’s steady, patient investment in his career (dating back to the mid 1920s) began to pay off. With his 1946 Nobel Prize as an effective weapon and his large research budget at Indiana University, he successfully propagated the highly profitable [for all energy competitors] notion that radiation was some kind of special peril to the future of mankind with “no safe dose” for humans.
The above image was captured from the upper right-hand, above-the-fold front page of the June 13, 1956 edition of The New York Times, the self-proclaimed “paper of record.” You can just imagine the newsies’ sales pitch on the day that paper was printed. It has had a long-lasting impact because of all of the reinforcement that it has received over the years.
The 1934-vintage tolerance dose model was simple: keep doses to something less than 0.2 R/day, with occasional mission-driven days of up to 25 R followed by a few days of rest. As long as that threshold isn’t exceeded, there is no harm and no need to impose additional regulations on exposures.
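For rough scale, the arithmetic behind the tolerance dose can be checked in a few lines. This sketch uses the crude approximation that 1 R of gamma exposure is about 1 rem, and brings in the current 5 rem/year US occupational limit purely as a comparison point:

```python
# Rough arithmetic for the 1934 tolerance-dose scheme described above,
# using the approximation 1 R ~ 1 rem (~10 mSv).
daily_limit_r = 0.2                 # tolerance dose, R/day
annual_r = daily_limit_r * 365      # if received every single day

modern_occupational_rem = 5         # current US occupational limit, rem/year
ratio = annual_r / modern_occupational_rem

print(f"Annual total at the tolerance dose: {annual_r:.0f} R/year")
print(f"That is roughly {ratio:.0f}x today's occupational limit")
```

The point of the comparison is scale, not equivalence: workers under the tolerance-dose regime could accumulate an order of magnitude more annual exposure than today’s limit permits.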
There is some need for some rules to protect people from ingesting or inhaling certain isotopes, but the hazard of internal exposures outside of laboratories is highly exaggerated.
It should be illegal for any enterprise to create any kind of incentive for workers to exceed those daily limits.
Simple. No actions shall be taken to reduce or avoid levels of radiation exposure within the range of natural background.
Even beyond the scientific arguments against LNT, this concept should resonate with the public. All it really asks them to grasp is that there is no difference between man made and natural radiation, with respect to biological effects. If they accept this, how can one justify spending huge amounts to remediate nuclear sites, but not “remediate” Denver?
Establishing the “top” of the natural range could be somewhat difficult, as there are rare locations with very high levels (e.g., Ramsar). It should be more like the maximum for areas where significant numbers of people live. Personally, I’d choose 1 Rem/year; a nice round number, and a fairly good representation of what the “top” should be.
Thus, no evacuation, or remediation efforts of any kind, shall be applied to areas where the total (natural + man made) radiation level is less than 1 Rem/year. 100 mRem/year?? A small fraction of background?? What was ICRP smoking?
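Those numbers can be put in context with a few approximate figures (the ~310 mrem/year US average natural background is a commonly cited NCRP estimate; all values rounded):

```python
natural_background_mrem = 310   # approximate US average natural background
current_limit_mrem = 100        # the 100 mRem/year figure questioned above
proposed_ceiling_mrem = 1000    # the commenter's proposed 1 Rem/year total

print(f"100 mRem/year is {current_limit_mrem / natural_background_mrem:.0%} of average background")
print(f"1 Rem/year is {proposed_ceiling_mrem / natural_background_mrem:.1f}x average background")
```

So the current public limit sits at roughly a third of what the average American already receives from nature, while the proposed ceiling would sit a few times above it.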
I don’t suspect the ICRP of smoking something.
I’m pretty sure that the ICRP as a group thought that they should defer to the strongly expressed opinions of Hermann J. Muller.
After all, he was a Nobel Prize winner and a public martyr who had been very publicly left off of the speaker’s list at the first UN Conference on the peaceful uses of atomic energy, held in Geneva in the summer of 1955. He dared to “speak truth to power” in the form of Lewis Strauss, the Chairman of the Atomic Energy Commission.
It’s perhaps only coincidence that both Muller and Strauss owed a big debt of gratitude to the Rockefeller Foundation for giving them substantial career boosts over the years. I’m sure it’s not possible for public conflicts to be staged for ulterior purposes like discrediting a powerful future competitor for the world’s energy purchase dollars.
Maybe things like this will help to “make America great again” in nuclear power!
It will help, but low radiation dose limits are not the main reason why nuclear has gotten so expensive. Even if the limits were raised substantially, NRC would not respond by concluding that meltdowns are OK. And the mindset that no expense shall be spared to prevent meltdowns (or any significant release of radioactivity) is the main reason for high costs. The problem is much deeper, and may be an even heavier political lift.
Meltdowns are NOT OK. It’s a loss-of-the-company event, NRC or no NRC. If the US shifts to technology that cannot melt the core and send radioactivity over the site boundary fence, where’s the political lift? But the current NRC structure and policies are such that new ideas are about impossible to develop in the US. To the point that some developers have even said they won’t even try. That’s the ‘heavy lift’ issue that needs a fix.
The tragedy is that the notion that meltdowns (or any significant releases) are unacceptable is not entirely false, not due to actual impacts but to the indefensible public/political reaction (see Japan).
That said, why would it be a loss-of-the-company event, if the changes to dose standards being discussed here were made? One thing rational dose standards WOULD do is greatly reduce the cost of post-meltdown cleanup. Also, we have national industry-wide insurance to cover such things. Certainly in the case of SMRs, along with sane dose standards, the industry’s current ($20 billion) insurance would be enough.
From what I’ve seen, with the NuScale application, etc., even if you have an “inherently safe” reactor design, NRC would still require all the same impeccable (“nuclear grade”) fab QA requirements and intrusive regulations; basically the same standard of perfection for almost all facets of the entire operation, just because “it’s nuclear” and “that’s how we do things in the nuclear industry.” (For this discussion, an “inherently safe” reactor is one that can’t melt fuel or cause radiation levels above the range of natural background anywhere outside the plant boundary even if it did.) Meanwhile, I’m the only one asking impertinent questions like why, if the reactor really is inherently safe, most of the components and activities shouldn’t be classified as NITS (or be performed to typical industrial-grade standards).
In my view, such changes will be needed for SMRs to be truly economical. And getting NRC, and others within the industry, to change their basic mindsets on these issues will indeed be an incredible political lift.
I tend to disagree. The money invested in design and operations that prevent core damage (melting) is well spent. Machines that serve the purposes served by any large power station should be durable, reliable, and long lived.
In my experience, the excessive cost of building, operating and maintaining nuclear plants comes from the effort to avoid very small and inconsequential “leaks”, spills or exposures in order to seek (and never achieve) absolute zero in terms of contamination or doses.
Yucca Mountain, for example, ended up with a design that would cost hundreds of billions of dollars, instead of the few hundred million it would take to drill some deep holes, because the EPA-enforced standard was to absolutely minimize the possibility that the most exposed person over the next 10,000 years might receive 15 mrem in a year.
Vermont Yankee’s owners decided it was more financially risky to keep operating partly as a result of their experience in spending tens of millions of dollars and receiving incredibly negative publicity when a cracked pipe in an off-gas system allowed a few hundred milligrams of tritium, dissolved in about 140,000 liters of water, to enter the soil under the plant.
Even after all of the stresses placed on Fukushima by a natural disaster, core damage might have been avoidable if the operators had simply vented the containments when the pressure was low and the coolant did not contain many fission products. Instead, because of excessive fear of even tiny doses, they kept the containment sealed and allowed pressure to exceed design limits. That made it almost impossible to push water into the core with the low pressure sources that were available in a complete station blackout situation.
Those are operational examples, but the building examples are many and varied. They relate to the opposition and delays in the process that inevitably make costs pile up in a construction project. Those delays happen because there is a general acceptance of the notion that ANY defect that could possibly result in ANY contamination or exposure is important enough to address and debate.
Way off topic but, your recall on F-U1 is hazy. Operators were required to get Prime Minister’s Office permission to vent. While waiting, they had a portable generator hooked up to U1 and in process for U2, and fire hoses laid for injection. When U1 blew, it destroyed all that prep, and that explosion restricted access to all units, preventing actions to cool U2 & 3. US operators won’t ask; they’ll vent as EOPs require. Furthermore, the Fukushima plant manager even had to lie to the government about injecting fire water. The lesson? Yer screwed once the government sticks its nose in plant operations, because they ain’t trained.
I agree that utilities themselves would want durable, reliable equipment. But, as you say in your first paragraph, that is something that they would desire for any type of power station. All that I’m advocating is that the same standards and practices used for fossil plants be used for SMRs.
Also, I think that it should be the utilities’ decision (as it is for fossil plants), as opposed to NRC’s, given that NRC’s (supposed) charter is to protect public health and safety and these SMRs are essentially incapable of causing tangible harm. One thing a utility would NOT do is delay a major capital project by more than a year just because someone used the latest and greatest concrete standard as opposed to the one specifically referenced in the license application. And that sort of regulatory BS would not change one bit as a result of different dose standards (tied to the defeat of LNT).
I’m not sure I agree that much of nuclear’s costs are due to trying to prevent minor leaks. I tend to blame component fab QA requirements. As you’ve said yourself in one or more blog posts, the nuclear-grade version of anything costs several times as much as the industrial-grade functional equivalent. My understanding is that it was those unique requirements that companies like CB&I couldn’t deal with, which resulted in the huge losses (Toshiba, etc.) on the US plant jobs.
Be that as it may, in the examples you gave (e.g., VY), a lot of those costs and consequences were due to public reaction. Regulatory bodies actually didn’t have a problem with it. The question is whether or not significantly raising the dose limits, and repudiating LNT, will significantly reduce over-reactions by the public to minor releases and exposures. At present, my impression is that it will not help much. This is all thoroughly non-quantitative and non-scientific anyway. I suppose you wouldn’t have articles specifically saying that this or that was in excess of some ridiculous EPA limit, but they will keep reporting on these insignificant events, using scary language, etc..
I don’t doubt that more sane limits will significantly affect things like waste repositories and perhaps actions taken during accident events. Then again, the cost of even Yucca remains only a fraction of a cent/kW-hr. As for Fukushima, and the actions they chose, would different regulatory limits have mattered? I think that many of their actions (attempts to avoid any release) were to avoid media attention and public reaction. To “save face.” Right now, at Fukushima, they are refusing to release water that only has tritium in it, even though releasing it into the Pacific would have absolutely no impact (and would meet even current, ridiculous requirements). It is purely due to politics and public reaction (e.g., people refusing to eat fish for no reason).
Once again, my focus is on reducing the initial capital cost of SMRs. And I think that using more standard industrial QA requirements, regulations and general practices, at least at the plant site itself, will be essential to achieving the necessary costs. NuScale says their goal is $6,000/kW. I think it will need to be more like $2,500, despite the benefits of lower job risk, smaller capital increments, etc. At $6,000, it will remain a niche application (e.g., remote power). So how do we get there (to $2,500/kW)? I think the above changes will be necessary, and it would be pretty depressing if I can’t even convince the people here.
The last sentence of your entire post is spot on. The question is the degree to which repudiation of LNT and having more sane *regulatory* dose limits will, by themselves, help with that situation. I think attitudes and cultures, within the industry as well as within the public, will have to change as well.
BTW, perhaps a hopeful development?
Obligatory scientific reference: in “Is Radiation Necessary for Life?” (Forbes, 23 Sept 2015), James Conca describes in layman’s terms a deep sub-surface laboratory experiment in which two different species of bacteria, one very sensitive to radiation and one very resistant to it, were grown at ultra-low radiation doses 500-fold beneath background, at ordinary low doses in the realm of background, and at higher doses.
As measured by growth rate and expression of stress-marker genes, both species did measurably worse at levels lower than background than when grown in normal or somewhat elevated radiation environments. The results were repeatable and firmly establish, for at least two bacteria species, that there is indeed a lower limit — somewhere — to the adverse effects of ionizing radiation: response is not linear, and there is a threshold.
To some, such results fairly scream for analogous research into low-radiation effects in animals and higher organisms, which of course grow more slowly and exhibit other biological confounders. No one said it would be easy. As Rod related five weeks ago, it was the Department of Energy Office of Science that in 2013 determined the Low Dose Radiation Research Program should not happen at all.
Rod further relates that when contacted to find out if that correctly represents EPA‘s position, Enesta Jones, a spokesperson from the U.S. EPA Office of Media Relations, provided the following comment dated Jan 12, 2017:
Scott Pruitt was confirmed as EPA Administrator February 17. One does hope he gets the memo.
I think the SARI letter has not accounted for the fact that the EPA has, for several years, been in the process of increasing the exposure limits in two areas. I don’t remember both, but one is increasing the emergency response doses by several orders of magnitude: the doses at which you shelter inside, when you evacuate people, etc. Increasing these will help prevent repeats of the Fukushima Daiichi evacuation’s 1,600-and-counting deaths. It may also result in massive reductions in EPZ size for existing nuclear power stations.
In NOT acknowledging this tremendous improvement away from the LNT model, the SARI letter risks calling the EPA stupid and jeopardizing the forward-looking relationships needed to further the EPA’s progress.
The EPA did not increase the shelter in place or evacuation doses given in its recently issued Protective Action Guides compared to the last issuance of those PAGs in the early 1990s.
What you might be thinking of is the hubbub raised by devoted antinuclear activists when the EPA made some revisions to the drinking water action portion of the PAG for certain short-lived nuclides, after recognizing that its previous limits dramatically overestimated the doses that would be received by drinking that water.
Here is an example article from the opponents:
Here is a more balanced description of the actions taken from a reasonably well researched newspaper article.
Have you looked at the proposed changes that have not been implemented yet? I thought they were in the public comment stage.
I’m pretty sure that the PAGs that were in the public comment stage are the ones that were issued as final documents in January 2017. That is the document to which I linked.
If you are aware of different proposals, please provide links.
I will look. I distinctly remember the emergency response dose one. But, maybe they dropped moving forward with that one.
The other one, which I forgot, sounds like the water concentration one.
Maybe we should use lower case “l” when we refer to the lNT assumption. The biggest problem with the concept is not the simplistic assumption of linearity, but the absurd notion that there is no safe or insignificant dose or dose rate.
How about changing the acronym to “Lethal No Threshold” to emphasize that the regulations kill far more people than radiation does?