Muller influenced the BEAR to adopt the Linear No Threshold (LNT) assumption in 1956
Hermann Muller, the 1946 Nobel Prize winner in Physiology or Medicine, insisted that there was no threshold of risk from ionizing radiation. His opinion has had a long-lasting influence on standards for radiation dose. He was wrong.
History is complicated. Influential people often impose their will with long-lasting results. The stories can be difficult to unravel, especially when the sources of information are buried away in boxed archives.
Dr. Edward Calabrese and his colleagues have engaged in a serious document excavation effort to find out what Hermann Muller knew about the genetic effects of radiation and when he knew it. They undertook this effort to try to unearth why Muller worked hard to establish an assumption that they could not validate through their own research.
Muller was influential because he had been awarded the 1946 Nobel Prize in Physiology or Medicine “for the discovery of the production of mutations by means of X-ray irradiation.” Later, Muller used his reputation to influence the radiation standard-setting body to accept the linear no-threshold dose response assumption without any serious questions about its scientific validity.
During his acceptance speech, Muller made the following statement about the genetic effects of radiation:
In our more recent work with Raychaudhuri (1939, 1940) these principles have been extended to total doses as low as 400 r, and rates as low as 0.01 r per minute, with gamma rays. They leave, we believe, no escape from the conclusion that there is no threshold dose, and that the individual mutations result from individual “hits”, producing genetic effects in their immediate neighborhood.
(Emphasis added.)
Unfortunately, the phrases in the quote above with minor emphasis do not actually support the phrase with the strong emphasis. 400 R (roughly 4 Sv if the dose is from gamma radiation) is a big dose, not a low dose. The current International Atomic Energy Agency standard used to determine if people should be allowed to live in an area that has been contaminated by radiation is 1/200th of that dose – 20 mSv/year. A dose rate of 0.01 R/min is also not a low dose rate; radiation workers take serious action to avoid an extended stay in a radiation field that is 6 mSv/hr (600 mrem/hr or 0.01 R/min).
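To make the unit arithmetic above easy to check, here is a minimal sketch in Python. The only assumption is the same rough rule of thumb used in this paragraph: 1 R of gamma exposure corresponds to roughly 10 mSv of dose.

```python
# Rough conversions behind the comparisons above.
# Assumption: 1 R of gamma exposure ~ 10 mSv of dose (an order-of-magnitude rule of thumb).
MSV_PER_R = 10.0

total_dose_R = 400            # lowest total dose Muller cited
dose_rate_R_per_min = 0.01    # lowest dose rate Muller cited

total_dose_mSv = total_dose_R * MSV_PER_R                    # ~4000 mSv, i.e. ~4 Sv
dose_rate_mSv_per_hr = dose_rate_R_per_min * 60 * MSV_PER_R  # ~6 mSv/hr (600 mrem/hr)

iaea_resettlement_limit_mSv_per_yr = 20
print(total_dose_mSv / iaea_resettlement_limit_mSv_per_yr)   # ~200 times the 20 mSv/year figure
print(dose_rate_mSv_per_hr)
```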
If those doses and dose rates were the lowest that Muller investigated, there is still plenty of room for a threshold somewhere below those levels. One of Muller’s research associates – a man named Ernst Caspari – undertook a painstaking series of experiments using significantly lower doses and dose rates than those that Muller used in his groundbreaking work.
Caspari’s experiments used a population of 50,000 fruit flies. His results showed that below a certain dose and dose rate, irradiated flies had no more mutations than the control population. Below an even lower level, the irradiated flies had fewer mutations than the control group. Caspari sent Muller a paper documenting his results and explaining how they supported a threshold dose model about five weeks before Muller delivered his Nobel Prize lecture.
Caspari’s results did not get lost in the mail. Muller acknowledged the results within a week. His response indicates that he recognized their implications. However, since those results contradicted his assumptions, he demanded additional testing and verification.
Though his response and his demand for verification were understandable, that does not explain his decision to use the stage provided by his Nobel Prize award to make a definitive statement about the absence of a threshold for radiation effects. He had in his possession a credible study from one of his own associates that contradicted his assertion that there is “no escape from the conclusion that there is no threshold.” Caspari’s results provided an escape and a path to a different conclusion, namely that there is a threshold below which there is no damage.
There is a logically defensible explanation for a December 1946 decision by a politically active scientist to make a statement indicating that there is no safe dose of radiation. The most politically important discussion of the day within the scientific community was finding a way to influence government leaders to eliminate the threat of atomic weapons. The devastation at Hiroshima and Nagasaki was fresh in people’s minds. There was talk about how the US would base its future influence on maintaining a monopoly on the technology.
Dr. Muller spent a significant part of the next ten years working to eliminate nuclear weapons, with a special focus on halting widespread atmospheric testing. He pushed the genetics community to establish the linear dose response model as the unquestioned consensus view. By 1956, when the BEAR (Biological Effects of Atomic Radiation) Committee first met, it was a foregone conclusion that it would accept the linear dose hypothesis and change the basis for radiation regulation from the tolerance dose of 0.2 R/day (roughly 2 mSv/day for gamma radiation) established in 1934 to a model that assumes there is a finite risk from any excess radiation exposure, all the way down to a dose of zero.
That effort played a major role in generating public interest in fighting fallout. The doses from the testing were minuscule; other than a very few highly publicized incidents, there was no evidence of any public exposures that were even close to the existing dose limits. Muller and his colleagues knew that unless they could convince large numbers of the public that they were personally at risk, they would not be able to gain the support they needed to influence political leaders to stop the testing programs.
They desperately needed people to believe their assertion that “there is no safe dose of radiation” because only those people would be interested enough to organize protests and other political action campaigns.
I’ve known about this fascinating but complex history for several months, but I had no way to share the raw material that documents the story in excruciating detail.
There’s now a way to point to the whole collection, starting with an introductory letter from Dr. Jerry Cuttler. Go to Submissions Received during Public Consultation of Discussion Paper DIS-13-01, Proposals to Amend the Radiation Protection Regulations (Warning: the document is an 18.7 MB PDF). Scroll to page 2. Under the heading of Individuals, look for Dr. Jerry Cuttler. His name is a clickable link that will take you directly to his submission, which includes several attached papers.
Dr. Calabrese published a lengthy explanation of his research into this scientific history in Archives of Toxicology. That paper, published in August 2013, is titled How the US National Academy of Sciences misled the world community on cancer risk assessment: new findings challenge historical foundations of the linear dose response. (Starts on page 89 of the PDF document linked above.)
Perhaps unsurprisingly, the chair of the National Research Council of the US National Academy of Sciences did not like the title or the content of Dr. Calabrese’s paper and responded with a sharply worded letter. His letter was titled Letter from Ralph J Cicerone regarding Edward Calabrese’s paper published online first on August 4th: “how the US national academy of sciences misled the world community on cancer risk assessment: new findings challenge historical foundations of the linear dose response.” [DOI 10.1007/s00204-013-1105-6, Review Article] (Starts on page 108 of the PDF document linked above.)
Here is an illuminating quote from Dr. Cicerone’s letter.
It distresses us to see this article’s accusations, with no actual supporting evidence, in a serious scientific journal. Drs. Muller and Stern are deceased and cannot defend themselves against these accusations. Both scientists were elected to our academy by their peers (Muller in 1931 and Stern in 1948) in recognition of their considerable scientific achievements, and Muller was honored with the 1946 Nobel Prize in Physiology and Medicine for his lifesaving work on the physiological and genetic effects of X-rays. In the 1950s, he joined his fellow scientists in warning the American people about the dangers of atomic war and fallout. With Linus Pauling, he worked to bring about a worldwide nuclear test ban treaty.
(Page 109 of the PDF document linked above.)
This quote supports Dr. Calabrese’s interpretation of the history. It exposes Muller’s heroic status within a certain segment of the scientific establishment and demonstrates that part of the basis for that status was Muller’s moral, well-intentioned effort to halt nuclear weapons testing by spreading fear-inducing information about the effects of “fallout.” From a scientific accuracy point of view, it is ironically fitting that Cicerone chose to link Muller with Linus Pauling, who was also a Nobel Prize winner. Pauling is the man who famously told everyone who would listen that massive doses of vitamin C would cure the common cold.
Dr. Calabrese was offered the opportunity to respond to Dr. Cicerone’s letter. He did so with a letter titled Response to Letter of Ralph J Cicerone and Kevin Crowley regarding “How the US National Academy of Sciences misled the world community on cancer risk assessment: new findings challenge historical foundations of the linear dose response.” [DOI 10.1007/s00204-013-110506, Review Article]. Here is a quote from the final paragraphs of that response.
“My article revealed that something seriously wrong occurred with the actions of Stern and Muller, leaders of the radiation genetics community. The failure of BEAR I Committee Genetics Panel to achieve its scientific mission of an objective and detailed appraisal of the scientific foundations of the dose response for mutation was also seriously wrong particularly given its societal importance. Yet, national leaders such as President Cicerone would prefer to protect the image of the NAS and the reputations of Stern and Muller rather than assessing objectively the foundations of the risk assessment scheme they created.
While President Cicerone claims that I have unfairly judged Stern and Muller, he is incorrect. The critical judgement emerges from their actions and words, as documented in open publications, now declassified publications and in publicly available private correspondence. The BEAR I Committee Genetics Panel did not study in detail the key papers upon which the decision on LNT was based, but relied upon the judgements of Stern and Muller. The NAS administration failed to properly vet the actions of this committee. The title of my article is appropriate and its content properly substantiated. It is there to be read by all.”
(Page 113 of the PDF document linked above.)
This story, which was hidden from the view of most people who do not have access to university libraries and their collections of peer-reviewed journals, is now more openly available. The documents and their extensive references are available for anyone who cares enough about the topic. I’ve read the documents and a sufficient number of the references to be convinced that Calabrese and Cuttler are correct.
Before jumping in with contradictory opinions, please take the time to read the documents you want to challenge.
Did you mean for the title to include BEIR rather than BEAR?
Feel free to delete this comment if you make that change.
If LNT weren’t reasonable science, it would be gone by now. The BEIR panels constantly re-evaluate what they’re doing. These ad-hominem attacks do more damage to the nuclear cause than LNT ever could.
I believe BEAR, Biological Effects of Atomic Radiation, is correct; http://www.nasonline.org/about-nas/history/archives/collections/cbear-1954-1964.html
BEIR, Biological Effects of Ionizing Radiation, was later; http://www.nap.edu/openbook.php?isbn=030909156X
I’m sure they said the same about Lysenkoism.
So you believe there is that kind of conspiracy around LNT?
@Cheryl Rofer
Did you read the material?
I am not asserting the existence of a conspiracy, but I am stating that the LNT has been an unquestioned assumption for nearly 6 decades because few people were willing to bite the hand that fed them.
The Environmental Protection Agency’s Office of Air and Radiation/Office of Radiation and Indoor Air/Radiation Protection Division is responsible for the budget line that has funded most of the government’s studies on radiation risk for the past several decades. That office also plays a major role in selecting the members of the BEIR committees.
The office had a pair of senior bureaucrats who were the points of contact for radiation risk and also the people who controlled how the budget for the Life Span Study (LSS) of atomic bomb victims was spent. One retired about 6 years ago; the other continues to serve. A friend of mine served in that office.
He told me about cafeteria conversations during which those two bureaucrats asserted that the LNT would remain in place as long as they remained as government employees; it was their ticket to a comfortable career.
The BEIR continues to assert that the information extracted from the LSS is the “gold standard” of radiation risk science.
Call it what you will; I call it featherbedding.
My comment on conspiracy was in response to Engineer-Poet. There was quite a structure that kept Lysenkoism alive. I was wondering if he believed that was the case here.
Rod, I know that you like to take the ad-hominem approach to LNT. But I’m a scientist, so I do it differently. I’ve read a great deal of the stuff you’ve provided in the past on Calabrese and Muller’s dislike of nuclear weapons. This doesn’t look any different from the post, although Cicerone’s responses might be interesting in full. In any case, my argument does not depend on Calabrese, Muller, hormesis, blah blah blah. It depends on how science is done.
The terrible actions you are alleging of Muller took place 50-60 years ago. There have been several BEAR/BEIR panels since then. Each includes tens of scientists. Their charter is to reconsider everything, including LNT (which you can read about in BEIR VII – have you read that?).
It’s common scientific practice to extrapolate a curve (in this case a straight line) into areas of sparse data. You have to have a good reason to do otherwise.
The BEIR panels have their criteria, and the studies that Calabrese and others find so convincing do not meet those criteria. Until they do, LNT stands.
What would eliminating LNT do? The extremely small differences in numbers from other extrapolations (yes, even your beloved zero and hormesis) have little or no significance. It would remove the ability to say “no dose of radiation is safe.” While that would be a good thing (if scientifically true), that meme is so engrained in the media that it will take a long time for it to disappear. So even that benefit would be marginal.
In short, to end the use of LNT, you need to marshal a body of evidence equivalent to that in the BEIR reports in favor of your position and have a group of equally respected scientists vet it, as in the BEIR reports, under the aegis of a body equivalent to the National Academy of Sciences.
Until then, this ad-hominem argument damages the nuclear cause.
Wow, Cheryl is rolling with the logical fallacies today.
“If LNT weren’t reasonable science, it would be gone by now.”
Bandwagon Fallacy
“The BEIR panels constantly re-evaluate what they’re doing.”
Appeal to Authority
“These ad-hominem attacks do more damage to the nuclear cause than LNT ever could.”
Red Herring and not even a good one, since Cheryl apparently doesn’t know what an “ad-hominem” is.
Aside: If Calabrese had said that Muller’s influence on the BEAR Committee was wrong because he was a dirty commie who wanted to undermine the US’s defenses against the ruskies, then that would have been an Ad Hominem Fallacy. Instead, Calabrese claims that Muller lobbied the BEAR Committee to take a position that he knew was not supported by the scientific evidence that he was aware of.
Whether Muller did or did not depends on the evidence presented by Calabrese, and the onus is on Calabrese to make a convincing case. It does not depend at all, however, on Muller’s reputation, character, or anything other than his actions. If Muller’s reputation suffers as a result of what has been demonstrated, then that’s too bad, but that’s not enough to make this an Ad Hominem.
“So you believe there is that kind of conspiracy around LNT?”
Straw Man
And the winner is
“Rod, I know that you like to take the ad-hominem approach to LNT. But I’m a scientist, so I do it differently.”
A Straw Man, Poisoning the Well, and a (very pompous) Argumentum Ad Verecundiam all in one!
Bravo!!
Brian, you and I are arguing different things. So your entire post is straw man.
Rod’s argument is that Muller did Something Bad in interacting with BEAR, and therefore LNT is wrong. That’s a non sequitur.
I don’t care if Muller did Something Bad, although Cicerone’s letter is very worth reading on that subject. Bottom line: Calabrese doesn’t show that’s the case.
You don’t address any of the scientific points I’ve made.
IIRC, you’re a scientist or engineer, Brian. How long would it take in your discipline for something erroneous to be discovered?
@Cheryl
I am neither a scientist nor an engineer. I was a humanities major in college and then spent my career as a leader of people. I was fortunate enough to get a solid dose of technical training in a wide variety of topics. I even spent some time as an investigator and served on a couple of promotion boards.
Yes, I tend to point to the way humans make decisions to explain events or policies that make little or no sense without an understanding of human behavior, man’s inhumanity to man and plain old greed.
You asked Brian how long it would take for something erroneous to be discovered. I can tell you from personal experience that many errors can remain in place for decades as long as no one looks very hard or if certain people who have the authority to take action act like the “hear no evil” monkey.
In the case of low dose radiation research, it’s odd how often the funding gets severely curtailed just as soon as researchers get close to finding interesting results that challenge the status quo.
https://atomicinsights.com/low-dose-radiation-research-program-defunded-2011/
Have you ever wondered why, after ten years of careful research, the nuclear shipyard workers study was never published? John Cameron did.
http://www.aps.org/units/fps/newsletters/2001/october/a5oct01.html
@Cheryl
What would eliminating LNT do?
Among other things, perhaps it would substantially lower the cost and the uncertainty associated with long term used fuel disposal. My understanding is that a major portion of the cost associated with building a facility like Yucca Mountain is associated with meeting a standard for radiation exposure that makes absolutely no sense without being based on LNT. Current law requires proof that no one would ever receive more than 15 mrem (0.15 mSv) per year for the next 10,000 years.
There would be no need for titanium drip shields if the limit was a more logical 500 mrem (5 mSv) per year.
And of course, as is usually the case when commenting before reading an entire post, I now realize that I should have withheld comment.
A tu quoque fallacy?! Classic! OK … we have a new winner. 😉
I agree that it’s a non sequitur, but it’s your non sequitur, not his. This particular article is about Muller, but Rod’s criticisms of the LNT model span many articles. I suggest that you read (or at least skim) them before trying to tell everyone here what Rod is and is not trying to say.
Besides, if the “Something Bad” really was withholding scientific evidence against the LNT model from the committee, then it substantially undermines your hackneyed appeals to authority. Don’t you think?
I would love to, but first you actually have to make a scientific point. You have not. All that you have provided are appeals to authority, appeals to ignorance, and a host of other logical fallacies, some of which I have tried to catalog here.
For what it’s worth, although the vast majority of my education was in science (or mathematics), I usually bill myself these days as an engineer. My wife is the scientist. She studies cancer, particularly what causes cancer. Compared to the majority of things that she studies, radiation is a very weak carcinogen. We currently don’t understand very well the relationship between dose and cancer of these far more potent carcinogens, particularly at very low doses. (If we did, my wife would soon be out of a job.) What makes you think that we understand so well the dose response of a much weaker carcinogen at such low doses?
Certainly, the BEIR VII Committee did not, which is why they referred to the LNT model at low doses as a mere “assumption.” Nevertheless, it was an assumption that they were willing to continue to go with for the time being — with historical precedent being an influential factor in the decision, I don’t doubt.
As someone who has a solid education in the physical sciences, I know that it took 218 years for Classical Newtonian Mechanics to be replaced with a more accurate model of how the universe actually works in uncommon situations. And by the way, the old way of thinking was not overturned by a committee.
“It’s common scientific practice to extrapolate a curve (in this case a straight line) into areas of sparse data. You have to have a good reason to do otherwise.”
That’s actually a very bad scientific practice. Good science involves performing experiments/studies to increase data collection in areas of sparse data in order to gain a better understanding. The dictionary has a very good definition of the word science for your immediate reference.
As for arbitrarily assigning a “linear fit”, I can think of many reasons to do otherwise. For one, nature tends to act more exponentially than linearly. See for instance Benford’s law. For another, extrapolation gives one the false confidence that you understand something that you clearly don’t.
Another aspect of this whole over-arching story and the question of whether LNT is correct or not, beyond simply the Test Ban Treaty, is the Cold War doctrine/assumption of Mutually Assured Destruction (MAD).
A fairly significant fraction of the MAD assumption is due to the radiation effects, and not solely the instantaneous blast effects of a nuclear detonation.
If the effects of low enough doses of radiation had been known to be less devastating than asserted by Muller, could MAD have been successful at deterring nuclear war during the Cold War period?
Could the decrease in the proportion of deaths from warfare since the 1940s have occurred without the specter of mutually assured destruction deterring some conflicts, particularly between the U.S.A. and U.S.S.R.?
What Paul said right here.
Third reason not to extrapolate into regions of sparse data: it may be sparse for a reason.
One has to ask why the data is so sparse. Is it because nobody receives low doses of radiation? There are a million Rad Workers in this country (including myself) who disprove that theory. There have been numerous occasions where low- to mid-level radiation exposure occurred due to nuclear-related incidents in this country. That’s not to mention Fukushima, TMI, and Chernobyl. And yet, with this wealth of data at our disposal, we have a sparse collection of data showing low level radiation causing health effects.
I challenge Cheryl or any other nuclear hater to explain this paradox.
“What makes you think that we understand so well the dose response of a much weaker carcinogen at such low doses?”
Because radiation biologist Michael H. Fox said so in his 2014 book (Oxford University Press):
“Why we need Nuclear Power: The Environmental Case”
“Indeed, we probably know more about the carcinogenic effects of radiation than of any other physical or chemical agent. Based on this information, we can confidently predict the risk of getting cancer from a particular dose of radiation.”
——————————————–
Of course, he offers a lot of other wild notions in this book, including that breeder reactors work great (shut down by politics) and that plutonium is astonishingly safe (unless you just happen to inhale particles in the air contaminated with it). The book is amazingly slipshod for a scientist who is basically bought and paid for like any cheerleader in Oil and Gas.
This is the kind of argument that damages the nuclear cause. Paul obviously has not read BEIR VII and doesn’t understand what data is available. Or that when the rest of the data follows a straight line, the logical extrapolation is a straight line, not whatever Paul wants.
Unless all the studies showing thresholds and/or hormesis (both animal and epidemiological) have been faked, there absolutely has been and still is such a conspiracy.
Thanks, Rod. At least you’re providing some substance. I’m not sure if you’re right or not, but this is a reasonable argument when properly backed up. It helps to explain why you think LNT is important.
I think I’ll just let Brian dance around with his identifications of logical fallacies and name-calling. He gets a couple of things right:
Compared to the majority of things that she studies, radiation is a very weak carcinogen. We currently don’t understand very well the relationship between dose and cancer of these far more potent carcinogens, particularly at very low doses.
Certainly, the BEIR VII Committee did not, which is why they referred to the LNT model at low doses as a mere “assumption.” Nevertheless, it was an assumption that they were willing to continue to go with for the time being — with historical precedent being an influential factor in the decision, I don’t doubt.
Yes, LNT is an assumption, not a “theory”, as it is so often referred to by those who would eliminate it. And yes, it makes sense because it’s an extrapolation of the linear behavior at higher doses (what I said earlier).
The replacement of classical Newtonian mechanics by quantum mechanics is hardly analogous, until you have shown that LNT indeed needs to be replaced. It’s romantic to think that LNT will be overturned in one brilliant swipe, but it was originated by a committee, and it will be overturned (if it ever is) by a committee. UNSCEAR (another of those dreadful committees) has a slightly different approach.
And yeah, I did make several scientific points. Sorry if you missed them, Brian.
I come over here and participate in discussions every once in a while because I’m baffled by what drives this hatred of LNT and the certainty that it is wrong. I see a lot of misunderstandings of BEIR VII and how science operates. But I know I’m not going to change anyone’s mind here.
Thanks to Rod for his hospitality.
Ciao
You can add in a link to this article if people want to read it:
http://www.genetics.org/content/33/1/75.full.pdf
At the risk of getting into all kinds of terrible and tortuous back and forth and ad hominem and blind alleys (and charges of being a paid shill and all that), I’ll just say that it seems to me that both Calabrese and Cicerone have perhaps missed the mark. Their citation lists are surprisingly incomplete and don’t appear to reflect a comprehensive look at this literature (or the debates at the time).
The Caspari and Stern study did cause a stir, and it seems to have been generally well known and debated at the time.
http://www.genetics.org/content/36/3/281.full.pdf
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1209473/pdf/56.pdf
http://dx.doi.org/10.1007/BF02477341
But subsequent work didn’t confirm the result. Particularly a 1949 publication by Uphoff and Stern, same Stern I believe (which is only available to general readers as an extract).
You can find a more detailed account of the same here:
http://www.osti.gov/scitech/biblio/4419236
Including some of the factors that may have accounted for the “apparent inactivity of irradiation” in the Caspari and Stern (1st) study: “(1) low sensitivity to irradiation of aged sperm, (2) dependence of induced mutation frequency at low dosages on a time factor, and (3) errors of sampling which might have obscured a true difference between control and experimental rates.”
It seems rather strange to me that Calabrese doesn’t make reference to this research, which Muller had recommended and which was subsequently done, showing a contrasting result.
Calabrese seems to be suggesting that the Caspari and Stern (1st) study somehow settled the matter! From a fuller look at the literature of the time, this appears to have been very far from the case. The fact that Cicerone doesn’t seem to know this history either is, I am sure, also telling (if their exchange could be said to be a genuine scientific debate to begin with).
Just a note … I read through this all very quickly (perhaps I have overlooked something). If anyone has anything else to add, please, I welcome any comments (and a closer look at this scientific literature and debate and bit of intrigue from the archives).
The weakness of radiation as a carcinogen is supported by the fact that researchers working at maximum intensity for years, trying to prove otherwise, have produced statistics showing that the effect is weak. The news headlines will usually say otherwise, but you have to read the studies and understand the way the data are manipulated in order to see how much of a reach their conclusions are.
Common sense can be a guide for those who don’t want to actually read the studies. Take a powerful carcinogen…say tobacco. Think of how many people you know of that smoked or chewed tobacco their entire life. How many got cancer of the throat or lungs?
Now consider those exposed to low level radiation every day. For instance, there are many thousands of people working at nuclear power plants, national laboratories, and nuclear medicine, just to name a few. How many class action lawsuits are there for people dying of cancer due to these jobs? How many heartstring tugging news stories? Where are the people on their deathbeds decrying the nuclear industry and lamenting their career choice?
Rod Adams wrote:
You asked Brian how long it would take for something erroneous to be discovered. I can tell you from personal experience that many errors can remain in place for decades as long as no one looks very hard or if certain people who have the authority to take action act like the “hear no evil” monkey.
Errors can also persist when there are people whose livelihood depends on the continuation of said error.
A very interesting article:
http://www.nationaljournal.com/global-security-newswire/epa-abandons-major-radiation-cleanup-in-florida-despite-cancer-concerns-20140128
Perhaps there’s hope, at least at the state level.
Radiation cleanup standards, and Superfund standards in general (e.g., 100 mrem/yr, or a 10^-4 to 10^-6 allowable cancer risk for a hypothetical most exposed individual), represent what is perhaps the least effective use of resources to protect public health ever devised (as measured by dollars per life saved). And this statement is true whether you assume LNT or not…
If you question LNT, the answer is obvious. Any effort to reduce dose rates within the natural range (certainly those under 1 Rem/hr) is a complete waste of money, period. Zero benefit.
However, even if one assumes LNT, it is still an astonishing waste of money, given the small exposed population and relatively low exposure levels. If one assumes LNT, then the total public health impact (i.e., number of deaths or diseases) scales with COLLECTIVE exposure. Limits should be placed on collective exposure, not individual exposure. Upper limits on individual exposure, or health risk, are meaningless. Only the total number of deaths/diseases counts. You can’t invoke LNT, along with arbitrarily strict limits on individual risk (for certain things only), in order to justify absurdly low dose limits, w/o considering the number of people exposed. You can’t have one w/o the other.
The point is that there are many sources of collective public exposure that are literally millions of times as large as those present at a Superfund site, or in the contamination zone around a nuclear plant, post-meltdown. These include natural background radiation, notably radon (~100 million people exposed to hundreds of mrem per year), air travel, and medical exposures (many of them not necessary). The reason for the higher collective exposure is mainly the much larger number of people exposed. Superfund sites or nuclear evacuation zones involve far smaller numbers of people. However, these larger collective exposures are ignored because either the max individual exposure is under the (arbitrary) limit, or limits are simply arbitrarily applied (e.g., only apply for nuclear industry, or “man made” sources of exposure).
This needs to change. If we insist on continuing to use LNT, there should be limits on collective exposure, as opposed to individual exposure. (Any individual limit should be very high, certainly no lower than the top of the natural background range; after all, we let millions live with such exposure levels!) There should be no distinction between different sources of exposure (e.g., natural vs. man-made). Cost-benefit analysis should be applied, where a consistent dollars per life saved criterion is applied. Assuming LNT, that would correspond to a dollars per man-Rem avoided criterion (not dose to some most exposed individual).
For the $11 billion required to remediate the site discussed in the article, we could probably mitigate radon exposure in all ~100 million US homes, and the avoided collective dose would be thousands if not millions of times higher. Pompously proclaiming a maximum individual cancer risk of 10^-4 (or perhaps even 10^-6) in a world where 25% of the people die of cancer is obviously capricious and arbitrary, as well as blatantly hypocritical. Obviously there are vastly larger sources of risk out there that are not being addressed.
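As a rough way to put the collective-dose argument above into numbers, here is a minimal sketch. The risk coefficient, the populations, and the avoided doses are illustrative assumptions made up for the example; they are not figures taken from the comment above or from any study.

```python
# Illustrative comparison of collective dose and cost-effectiveness under LNT.
# All numbers are assumptions for the sake of the example.
RISK_PER_PERSON_REM = 5e-4   # assumed LNT coefficient: fatal cancers per person-rem

def deaths_averted(people, avoided_dose_rem):
    """Collective dose (person-rem) times the assumed LNT coefficient."""
    return people * avoided_dose_rem * RISK_PER_PERSON_REM

# Hypothetical Superfund-style cleanup: small population, modest avoided dose, large cost.
site_deaths = deaths_averted(people=1_000, avoided_dose_rem=0.1)         # 100 person-rem
site_cost_per_life = 11e9 / site_deaths                                  # using the $11 billion figure

# Hypothetical nationwide radon mitigation: comparable individual dose, ~100 million people.
radon_deaths = deaths_averted(people=100_000_000, avoided_dose_rem=0.1)  # 10 million person-rem

print(site_deaths, radon_deaths)   # the collective-dose gap is about five orders of magnitude
print(f"${site_cost_per_life:,.0f} per statistical life saved at the hypothetical site")
```

The point is not the particular numbers; it is that under LNT the public health benefit scales with person-rem avoided, so a consistent dollars-per-person-rem criterion exposes how lopsided the current spending is.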
I jumped the gun by commenting before reading beyond the headline.
Extremely well put, Mr. Hopf.
These aspects of risk management are the part of Bob Applebaum’s crusading that makes me most mad. Bob, of course, never takes any positions in regards to weighing different risks against each other, as doing so would be contrary to his crusading around the Internets. I am actually surprised he has yet to show up in this comment section.
Cheryl – Actually, I was referring to Special Relativity, but I’ll give you points for consistency. Once again, you’ve summoned the presumptuousness to tell someone else what you think that they have said, and once again, you’ve gotten it wrong.
Nevertheless, either example from physics serves as a good caveat against extrapolating to scales beyond the range in which you have good scientific data. It shouldn’t matter whether you are talking about speed, size, or dose.
But if you are going to be miffed whenever someone is polite enough to answer a question that you have posed directly to them, then perhaps in the future you should simply refrain from asking the question, or at the very least explain that the question was meant to be rhetorical.
It must be because, as you like to point out, you’re a scientist, so you “do it differently.” 😉 Could you please explain these missed scientific points to us troglodytes? (I mean the points other than the ones on which we agree, of course.) I’d hate to miss anything important, and I promise not to get miffed if you are polite enough to answer my question. Thanks.
Physics guy in my past life, turned public policy/economics.
For me, the most interesting thread in a year. The Fukushima accident really struck a chord with me when it happened. In the sciences, I only have a math/physics degree and knew very little about nuclear power and nothing about nuclear safety and radiation levels.
But unlike 1986, we had Google in 2011. It didn’t take me long (2 or 3 days) to wonder what in the world the US was thinking with a 50 mile evacuation zone, the insistence that there had been a complete loss of water at #4, the State Department recommending evacuation from Japan, etc.
It was almost 3 years ago that I’d walk around my small Japanese city, meeting the same shopkeepers I always met, and say under my breath, “I’m very sorry that my country is screwing you over.” How could we not know what we are doing when we have some of the best nuclear engineers/scientists in the world?
It took a while longer to realize how political this is.
It was also in mid March 2011 that I muttered, “That’s it, we are no longer a superpower if we react to an ally like Japan in such an incompetent manner.” A great power, yes. A superpower, no.
@Cheryl
In response to Paul you said:
“Or that when the rest of the data follows a straight line, the logical extrapolation is a straight line, not whatever Paul wants”
I guess I don’t begrudge your assumption that, in the untested low-dose regions, the model would follow a straight line, but I do beg to differ with the ADDITIONAL assumption that that line intersects the origin (as LNT would have it). There’s no way any amount of data well away from the origin could tell you that. The fact that most every preparation survives the background dose puts that response at the zero level or very close to it.
There’s every reason to believe that in the regions where the dose is testable, any straight line that fits the data is parallel to a lower oblique asymptote of the actual model and that this asymptote and model intersect well to the right of the dose-response origin. This then becomes what has been called the “Threshold Model”; with further analysis and testing this could evolve into a “Hormesis Model”. There is a relatively simple five-parameter model of a hormesis function based upon an oblique hyperbola. I can supply details if you’re interested.
Hormesis is simply the result of life adapting to stress.
The poison is in the dose.
I think Jim Hopf nails the key question here: why collective dose from NPPs is treated differently than dose from other sources (natural, and especially man-made medical sources, which are a choice). This post started with a premise about how we got to LNT. My reading is that the premise is probably correct, and I don’t personally care whether that premise was based on sound science or not. Ridding the world of nuclear weapons is a good goal. Effects of rad dose below natural background are hardly my strong point, but I can read and comprehend. What I see here is an argument pro LNT “because it’s in the mud zone of data, so let’s be conservative,” and also an argument anti LNT because “no, LNT is in the mud, not really the known data of life with background dose.”
So wrt the premise of the original post, what’s driving LNT today? That’s the question that needs to be clearly answered. Is it the same as the original premise, or is it just that somebody is making a lot of money from it?
Jim – did you mean to write 1 REM/yr? Even under a standard of as high as reasonably safe, 1 REM/hr is a dose rate that should be avoided.
@William Vaughn
I’m not aware that in all cases the line is parallel to “a lower oblique asymptote of the actual model.” In fact, from available research on the issue, several extrapolations are suggested: linear, downwardly curving, upwardly curving, threshold, and hormetic (here and here).
The lack of statistical significance of studies in the low dose range (below 100 mSv) does not mean there is “no effect.” And animal (human surrogate) studies are prospective and very challenging to apply. BEIR VII has already suggested DDREF values to adjust linear risk estimates in the low dose range, but it notes: “there is considerable statistical uncertainty in the DDREF selection.” They appear to be working with, and seeking to apply in a reliable manner, the findings from the research suggested in this article (and other animal and human surrogate models).
Given that we’re talking about extrapolations in a very low dose range, and that LNT does not exclusively pertain to risk estimates in the low dose range, I take it you are in general agreement with the assessment on intermediate and high doses, and the general finding (based on “more than a century” of research) that “Little question exists that intermediate and high doses of ionizing radiation, say >100 mSv, given acutely or during a prolonged period, produce deleterious consequences in humans, including, but not exclusively, cancer” (p. 13761).
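For readers unfamiliar with how a DDREF enters the arithmetic, here is a minimal sketch. The high-dose slope is an arbitrary assumption, and the DDREF of 1.5 is only meant to echo the BEIR VII nominal value for solid cancers; the sketch is not a reproduction of the BEIR VII risk model.

```python
# How a dose and dose-rate effectiveness factor (DDREF) modifies an LNT risk estimate.
# The slope below is an assumed placeholder, not a published coefficient.
HIGH_DOSE_SLOPE_PER_SV = 0.05   # assumed excess lifetime risk per Sv from acute, high-dose data
DDREF = 1.5                     # divides the slope when applied to low doses or low dose rates

def excess_risk(dose_Sv, low_dose_or_rate=True):
    slope = HIGH_DOSE_SLOPE_PER_SV / DDREF if low_dose_or_rate else HIGH_DOSE_SLOPE_PER_SV
    return slope * dose_Sv

print(excess_risk(0.02))                          # 20 mSv with the DDREF applied
print(excess_risk(0.02, low_dose_or_rate=False))  # same dose with no adjustment
```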
@EL
I don’t know about William, but I do not agree with the following statement.
I take it you are in general agreement with the assessment on intermediate and high doses, and the general finding (based on “more than a century” of research) that “Little question exists that intermediate and high doses of ionizing radiation, say >100 mSv, given acutely or during a prolonged period, produce deleterious consequences in humans, including, but not exclusively, cancer”
Your source is a ten-year-old paper whose authors had no knowledge of the DNA testing conducted by research groups funded by the low dose radiation research program. Example: the Berkeley Lab (Lawrence Berkeley National Laboratory)
http://newscenter.lbl.gov/news-releases/2011/12/20/low-dose-radiation/
I specifically reject the assumption that doses below a damage threshold are cumulative. What is your proposed mechanism for the accumulation? If a repair from a radiation dose has been made, why would a later dose add to damage that no longer exists? Measured results consistently show that a priming dose reduces the damage from a later dose.
Rather than conspiracy, there’s a very strong rule of precedent applied here.
The standard to which efforts to overturn the LNT have been held until now is that they must prove beyond any reasonable doubt that LNT is false.
Meanwhile LNT stays, not based on any proof it’s correct at low dose, just on the fact it’s the existing rule.
The reason this is possible is that at very low dose, the effect predicted by LNT is very small. There’s a very large number of studies that found basically no effect, and then concluded this was compatible with LNT, since the effect predicted by LNT was so small it would not be distinguishable.
There’s also the fact that many studies use corrective factors, and when you know the result everyone expects you to find is LNT, it’s very easy to be influenced into using the corrective factors that will confirm it. For example, in the Taiwan cobalt-60 contamination case, the raw cancer rate was 40% lower than the normal cancer rate, but it was assumed this was because the average age of the population of those newly built apartments was lower than the Taiwan average. Perhaps, but it seems to me that this might ignore the fact that the inhabitants would also be poorer than average (it seems those buildings also included a large number of low-cost and socialized housing units built quickly to house a fast-growing population). So when do you know that you have *the* right values for your corrective factors and can stop trying to refine them? Isn’t it very tempting to say they’re probably good once the result confirms LNT is true, or at least becomes compatible with it?
The basic problem is that the predicted LNT effect is very small and, in every study, buried below other carcinogenic risks, which makes it very hard to design a good study. Of course, for those who believe in hormetic effects, the situation is similar, just reversed.
I don’t think any real scientist would argue that extrapolation of any form is a bad practice.
As for linearly fitting data, of course all data over a small domain looks linear! Calculus 101 proves that. Only a very arrogant scientist believes they understand the ENTIRE problem based on a limited domain of testing.
“Or that when the rest of the data follows a straight line, the logical extrapolation is a straight line, not whatever Paul wants.”
With this set of assumptions, I should assume that the V-I curve for transistors is linear across its entire range, because there is a linear region.
(Hint: It’s not. In fact, there’s a threshold voltage down at the bottom of the range, which makes transistor curves a near perfect analogy.)
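A minimal numerical sketch of the same point: if the true dose response had a threshold (the threshold and slope below are arbitrary assumptions, not measured values), a straight line fitted only to high-dose data and forced through the origin, as LNT requires, predicts excess risk at doses where the assumed true model predicts none.

```python
import numpy as np

# Assumed "true" response for illustration only: no effect below a threshold, linear above it.
THRESHOLD = 100.0   # mSv, arbitrary
SLOPE = 0.01        # excess risk per mSv above the threshold, arbitrary

def true_response(dose):
    return np.maximum(dose - THRESHOLD, 0.0) * SLOPE

# Observations exist only at high doses, where the response looks convincingly linear.
high_doses = np.linspace(500, 4000, 20)
observed = true_response(high_doses)

# An unconstrained straight-line fit actually recovers the threshold (x-intercept ~100 mSv)...
slope, intercept = np.polyfit(high_doses, observed, 1)
print(-intercept / slope)   # ~100: the high-dose data themselves hint at the threshold

# ...but forcing the line through the origin (the no-threshold choice) predicts risk everywhere.
lnt_slope = np.sum(high_doses * observed) / np.sum(high_doses ** 2)  # least squares through origin
print(lnt_slope * 10)       # predicted excess risk at 10 mSv, where the assumed true value is 0
```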
From BEIR VII:
“At doses of 100 mSv or less, statistical limitations make it difficult to evaluate cancer risk in humans. A comprehensive review of available biological and biophysical data led the committee to conclude that the risk would continue in a linear fashion at lower doses without a threshold and that the smallest dose has the potential to cause a small increase in risk to humans. This assumption is termed the “linear-no-threshold” (LNT) model.”
Translation – There is not sufficient data at low levels of radiation to make a determination scientifically. Therefore, they adopt the assumption (now treated as an assertion, since the assumption is highly ingrained in non-scientific minds) that a linear fit is the best model.
What is very interesting is that when the matter of concern is waste from phosphate mining instead of nuclear energy, the limit today *is* that more logical 500 mrem.
See “EPA Abandons Major Radiation Cleanup in Florida, Despite Cancer Concerns” http://www.nti.org/gsn/article/epa-abandons-major-radiation-cleanup-florida-despite-cancer-concerns/?mgs1=a83cextvBg
I’m not going to forget to use that case, however, the next time someone tells me the nuclear waste problem is dire and unsolvable.
“Little question exists that intermediate and high doses of ionizing radiation, say >100 mSv, given acutely or during a prolonged period, produce deleterious consequences in humans, including, but not exclusively, cancer.”
I don’t agree. “Acutely” and “over a prolonged period” are two very different things. In either case, the statement isn’t scientific enough for an agreement to be made one way or the other. Acutely could mean instantaneously or it could mean over a few hours. Prolonged period could mean anything. Context in this discussion requires actual time intervals to be specified. Only then could the statement be compared to experimental evidence.
@Cheryl: You say that the LNT assumption makes sense because it’s an extrapolation of the linear behavior at higher doses.
Actually the LNT assumption results from the following two assumptions:
– DNA damage is linear in the received radiation dose, independent of the dose rate
– Cancer risk is linear in the amount of DNA damage
The first of those two assumptions is the easiest to test and it has been proven wrong by a succession of experiments.
One of those is the MIT study showing that an instantaneous exposure of 100 mSv does massively more DNA damage than the same dose spread over 5 weeks; in fact, over 5 weeks the damage is not detectable: http://www.ncbi.nlm.nih.gov/pubmed/22538203
Another study has shown that DNA repair centers do not behave linearly with dose: http://www.ncbi.nlm.nih.gov/pubmed/22184222
Etc, etc.
Therefore, for the LNT assumption to hold true, we would need a non-linear relation by which a small amount of DNA damage at low dose would be more likely to result in cancer than a large amount at high dose.
This is highly unlikely and unnatural. LNT might still be a convenient way to model risks for which we don’t know for sure what we should use instead.
But it’s certainly not the option that makes the most sense based on today’s knowledge about DNA and ionizing radiation.
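One way to make the two-assumption argument concrete is the toy sketch below. Every functional form and constant in it is an illustrative assumption; none of it comes from the MIT study or any other reference cited above.

```python
import math

# Toy decomposition of the LNT argument into its two sub-assumptions.

def dna_damage_lnt(dose_mSv):
    """Assumption 1, LNT version: unrepaired damage proportional to dose, dose rate ignored."""
    return 1.0 * dose_mSv

def dna_damage_with_repair(dose_mSv, weeks):
    """Alternative: the same total dose delivered slowly leaves less unrepaired damage.
    The exponential repair term is an arbitrary illustration, not a measured quantity."""
    return dose_mSv * math.exp(-0.5 * weeks)

def cancer_risk(damage):
    """Assumption 2: risk proportional to unrepaired damage (coefficient is arbitrary)."""
    return 1e-4 * damage

acute = cancer_risk(dna_damage_lnt(100))                  # 100 mSv delivered all at once
protracted = cancer_risk(dna_damage_with_repair(100, 5))  # same 100 mSv spread over 5 weeks

# If assumption 1 fails in this way, the only way to rescue LNT is to make assumption 2
# super-linear at small damage levels -- the step the comment above calls highly unlikely.
print(acute, protracted, protracted / acute)
```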
@Rod Adams
My statement on this research (and summary of the available scientific literature) differs in no respect from that of the Health Physics Society.
http://hps.org/physicians/documents/Radiation_Benefit_and_Risk_Assessment.ppt
I find their assessment of risks and benefits from radiation to be fully consistent with available scientific research (and have no problem with any of their statements to this effect). They even specifically cite the Brenner et al. (2003) review paper as a foundation for their assessment of this issue.
I also don’t see any contrast with the Berkeley Lab modeling of LNT extrapolations in low dose range (below 0.1 Gy as they report). There are already DDREF adjustments to LNT in the low dose range (accounting for biphasic dose response as reflected in prospective animal and in vivo cellular studies). Above this range, as your reference details, they find results are consistent with dose response model (derived from LSS cohort of atomic bomb survivors and others), and show “that cancer incidence increases with an increase in ionizing radiation …”
If you are going to disagree with the HPS, foremost radiation health professionals (Brenner et al.), BEIR VII, the health standards that form the basis of current radiation protection regulations, and the Lawrence Berkeley National Lab low dose radiation research program, it raises the question of what substantive and independently verifiable scientific basis you are relying on. Personal intuition?
@Paul
One way of answering this question is the following:
“Compared with higher doses, the risks of low doses of radiation are likely to be lower, and progressively larger epidemiological studies are required to quantify the risk to a useful degree of precision. For example, if the excess risk were proportional to the radiation dose, and if a sample size of 500 persons were needed to quantify the effect of a 1,000-mSv dose, then a sample size of 50,000 would be needed for a 100-mSv dose, and ≈5 million for a 10-mSv dose. In other words, to maintain statistical precision and power, the necessary sample size increases approximately as the inverse square of the dose. This relationship reflects a decline in the signal (radiation risk) to noise (natural background risk) ratio as dose decreases” (p. 13761).
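The inverse-square scaling described in that quotation is straightforward to reproduce; here is a minimal sketch that simply takes the quoted baseline of 500 persons at 1,000 mSv and scales it.

```python
# Sample size needed to quantify an LNT-sized excess risk scales roughly as 1/dose^2,
# per the quoted passage (baseline: ~500 persons at 1,000 mSv).
BASELINE_N = 500
BASELINE_DOSE_MSV = 1000.0

def required_sample_size(dose_mSv):
    return BASELINE_N * (BASELINE_DOSE_MSV / dose_mSv) ** 2

for dose in (1000, 100, 10, 1):
    print(dose, "mSv ->", int(required_sample_size(dose)), "persons")
# Output: 500; 50,000; 5,000,000; 500,000,000 persons respectively.
```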
The largest study I know of is the 15 country low dose cohort (here and here) of some 407,391 nuclear workers (with a mean cumulative dose of 19.4 mSv). As predicted, excluding the Canadian data which appears to be overestimated, excess relative risk was “no longer significantly different from zero” (which is not the same as no effect). Subsequent follow-up with UK segment of this cohort (a much smaller segment of 174,541 workers) has shown a statistically significant result (and for mean lifetime dose in range of 24.9 mSv).
As all of these cohorts age, subsequent follow-up is anticipated and warranted.
Getting a statistically significant result at a mean lifetime dose of 24.9 mSv from only 175,000 workers is at odds with LNT (as you noted above when citing the sample sizes required to confirm it).
It sounds like data mining, where the scientific foundation for statistical significance is broken by selecting, from a large data set, the subset that happens to appear statistically significant. I’m not saying that the same can’t be done in reverse to deny a radiation effect by selecting the studies that happen not to show one.
It’s so difficult statistically to get significant results from such studies that I think the truth can only come from a better technical understanding of how radiation directly affects DNA and of how cancer risk arises from DNA damage. At the moment, those lines of research already give strong indications of large non-linear factors, as well as of a high similarity between radiation-caused damage and damage from other causes (showing that the unique way radiation risk is treated relative to other risks does not have a factual justification).
Or from having the opportunity to precisely study what happens with a group of people for whom the individually predicted effect is high enough to clearly and unambiguously stand out from normal cancer risk. The Goiania incident is quite close to such a case. Four people died from Acute Radiation Syndrome. Twenty-eight suffered skin burns (the skin dose for such an effect is above 2 Gy), several people lost fingers, and 5 required skin grafts (a very high *skin* dose, clearly above 10 Gy). Around 250 were identified as being contaminated; see http://www.lastwordonnothing.com/2011/03/28/contamination-in-goiania/
Twenty-five years later, no cancer death clearly associated with the incident could be identified. There is one ARS survivor who died from cirrhosis and in whom esophageal and prostate cancers were identified at autopsy. While it’s natural to be tempted to think they must have come from the exposure, it’s important to remember that cirrhosis and esophageal cancer share the same main cause, and prostate cancer is very common in aged males. There could be a link, but the evidence is not decisive.
See http://www.iaea.org/OurWork/NE/NEFW/WTS-Networks/ENVIRONET/environetfiles/StakeholderWhp-Denmark2012/Goiania_Radiological_accident-Risk_mngt_Veiga2.pdf.
And http://www.radiationandreason.com/uploads/enc_GoianiaValverdeVienna2013.pdf
It might also help get rid of the rather absurd concept of collective dose, whereby the health effects from exposure of, say, 1 million people to 100 mrem each are judged to be the equivalent of exposing 200 people to 500 rem each. That concept is embedded in various regulatory constructs for assessing the impacts of radiation on populations.
@jmdesp.
Indeed. We seem to be getting that understanding, and non-linear outcomes remain to be demonstrated at population or ecological level.
If we can stick to available research (and not jump too far ahead), we should be able to come to agreement and understand the major questions that are being developed in this literature. Anybody can read these studies, and understand their implications and relevance to the kinds of questions you would like answered in a particular way (with data already indicating the contrary).
Regarding your reference on the Goiania incident: am I correct that exposures for the general public were kept to 5 mSv in the first year, and 1 mSv over 50 years? Among directly exposed populations, incidence risk and mortality from all causes look high to me (although there is “low statistical power due to the small size of the cohort”). Cohort studies are underway (as indicated), and an “increasing number of people [are] asking for illness compensations (cancer and non-cancer).”
It seems to me that there is evidence for LNT, as far as radon exposure goes. The WHO are quite adamant on this:
WHO Handbook On Indoor Radon
Radon in homes and risk of lung cancer: collaborative analysis of individual data from 13 European case-control studies
At the same time there are studies contradicting LNT:
Effects of Cobalt-60 Exposure on Health of Taiwan Residents Suggest New Approach Needed in Radiation Protection
Test of the linear-no threshold theory of radiation carcinogenesis for inhaled radon decay products
Much of the WHO evidence comes from coal miners exposed during mining, but some of it has been taken from people exposed to natural radon in the home [the BMJ link above]
Is it possible that LNT is sometimes valid?
Seems there WAS a BEAR, biological effects of ATOMIC radiation, but NOW there is a BEIR, the Biological Effects of Ionizing Radiation.
Confusing.
The good thing about LNT is that it is EASY to apply, and if you are safe by LNT, you are certainly safe; it’s that conservative. The bad thing is that it is THAT conservative, but people THINK it is unconservative.
Thanks Rod for a nice and well referenced overview of this important issue. It was just what I was looking for. It complements nicely the numerous articles on this website which have discussed aspects of radiation health effects, safety, rule-making and the impact of the subject on nuclear construction, nuclear advocacy and anti-nuclear propaganda.
The subject of radiation health effects is even clearer today than it was 60 years ago, and the rules laid down in those years should be revisited based on current knowledge. The fact that nuclear technology is arguably a mature technology should not mean that it should necessarily suffer from regulations which are at least partly obsolete, but which – only due to their old age – are treated as sacrosanct and as being beyond reappraisal or augmentation.
In any case, the public should know that the radiation dose limits which are currently preventing people from returning to large parts of the Fukushima exclusion zone are in fact very conservatively set. We can all agree to uphold such very conservative limits if we choose to, but we should not distress ourselves with any fear that such limits accurately reflect the difference between safe and unsafe conditions – as if the exclusion zone is some kind of “death zone” or what not. Not at all. The exclusion zone is merely a zone in which levels of radiation exceed more or less arbitrary regulatory limits which are at least 100 to 1000 times lower than any radiation dose level which has been shown to actually cause negative health effects.
In other words, it is not any kind of established radiation risk which defines the extents of the Fukushima exclusion zone. It is purely a regulatory choice with no relationship at all to any identifiable health effects. If actual measurable radiation risk was exclusively used to define the extents of that exclusion zone, the zone would be 99% smaller than it is today. No?
Rod,
Attacking Linus Pauling over Vitamin C is a cheap shot. I would not recommend you repeat it. Pauling is one of, if not the, greatest chemists America has produced. Just a few years after Schrödinger published his paper on the wave equation in quantum mechanics, Pauling published his work on valence-bond theory (in 1933 I think). It’s hard to overestimate what an astounding achievement that was. It was, and is to this day, the primary theory underlying our understanding of chemical bonds (MO theory is perhaps more precise and modern … but VB is still taught).
Pauling developed the alpha-helix model of protein structure and many people believe that if he had been allowed to travel to Europe (he was blocked by the State Department for his Leftist political beliefs) he would have developed the double-helix model of DNA before Watson and Crick.
He also won the Nobel Peace Prize.
Not a bad record.
He gave the commencement address at Rutgers University the year I graduated with an UG degree in Chemistry. His address focused on the subject of ‘Secular Humanism’ (basically why we should behave even if we don’t believe in God). I’ve never forgotten that day.
@SteveK9
Hermann Muller also did exceptional work and won the Nobel Prize. Steven Chu is another Nobel Prize winner with exceptional achievements.
My comparison was an effort to point out that expertise in one area should not confer hero status or insulate someone from being questioned when they express strong opinions on other topics.