Evidence suggesting LNT was fabricated as a purposeful effort to hamstring nuclear technology development
Arguments about whether or not nuclear energy should play a large role in our future energy mix often stall on the issues of cost and schedule. Even the most stubborn nuclear energy advocates cannot ignore the fact that nuclear plant construction projects have struggled with cost and schedule performance.
All too many of those poorly performing projects were never turned around and were eventually cancelled or converted to other fuel sources.
The effort expended to protect people from exposure to ionizing radiation is perhaps the single biggest driver of cost and schedule challenges associated with the design, licensing, construction, operation, maintenance and security posture of nuclear power plants. The high cost of the effort is driven by the demand for nearly perfect protection from all radiation.
That search for perfection mandates consideration of all imaginable events that could affect systems containing radioactive material or components that are relied upon to provide containment of radioactive material in the event of failures in those systems. If there is a way to add a layer of protection that reduces the probability of a release, it is often added without much consideration of the cost.
If there is any uncertainty about how much material can potentially be released by any particular sequence of events, the default assumption is that 100% of the maximum amount of material that could possibly be in the system will be released in the worst imagined situation.
There are times when there has been enough testing and quality assurance to convince regulators that releases will be limited to a smaller portion of the inventory, but the reviewers are professional skeptics who demand solid proof.
As instruments have become more sensitive, enabling the measurement of ever tinier quantities of radiation or radioactive materials, the drive to achieve the lowest possible doses has become even more expensive.
Isn’t radiation a natural part of our environment? Why do regulators demand near perfect protection?
Despite a voluminous and growing body of contrary evidence, a significant portion of the population still believes that all ionizing radiation, down to single gamma rays or high energy electrons, is capable of causing cancer. They believe that harm that is detectable at high doses delivered rapidly can be the basis for projecting harm from low doses delivered over a lifetime of exposure. They even believe that a dose that might cause harm if absorbed by a single recipient will cause the same amount of harm even if spread to millions of recipients.
This “no threshold” assumption of potential harm has been enshrined as the basis for the regulatory treatment of radiation. It is embodied in rules that demand “cost is no object” efforts to keep doses as low as reasonably achievable and standards for material purity from radioactive isotopes that sometimes attempt to keep human exposures over future millennia to tiny fractions of normal background radiation.
The “no safe dose” assumption is the basis for precautionary response plans that recommend forcible relocations for entire area populations to “protect” them. The estimated doses that trigger these actions are 1/10th of the lowest doses where scientists have found evidence of increased risk of cancer during a lifetime after exposure.
The assumption that all radiation is hazardous is the root cause of the passion that activists bring to battles opposing all uses of nuclear energy and even all uses of radiation to diagnose medical conditions, treat afflictions and sterilize food.
Anecdote illustrating blanket LNT acceptance by nuclear safety community.
Recently, I attended a presentation given by one of the world’s leading nuclear safety experts. One of his slides was a graph to be used in a safety analysis decision process. The horizontal axis of the graph was estimated radiation dose in units of rem. An audience member asked whether the numbers were for acute doses or doses spread out over time – an hour, a day, a month, a year or a lifetime.
The nuclear safety expert – who is nearing the end of a long and distinguished career in positions of direct responsibility for nuclear safety or advice to those with regulatory responsibility – could not answer the question. It seemed as if he really wanted to ask the questioner why it mattered.
With the “no safe dose” model, radiation safety professionals have officially told nuclear safety professionals that dose rate does not matter, that all doses need to be carefully tracked and recorded, and that all harm is cumulative.
In today’s world of scientific and engineering specialization, professionals in one field generally do not spend much time questioning experts in a different field. They trust that the experts have done their job and are providing accurate inputs.
The public may have some wariness about trusting experts, and activists sometimes claim unwarranted expertise, but real subject matter experts are often quite trusting of their fellow credentialed experts. In fact, some professions have ethics codes that encourage members to trust other experts in areas outside their field of expertise.
What was the basis for selecting the linear, no-threshold (LNT) model for radiation harm?
Though it’s convenient, mathematically simple and well established in regulatory assumptions, the LNT is a scientifically unsupportable model. It cannot be proven with empirical evidence and it can only be said to be “consistent” with epidemiological data that is so scattered at low doses that almost any line can be drawn through data points with an equal mathematical fit.
The LNT was initially bred and propagated by geneticists. Those geneticists left letters in historical archives indicating that they were more interested in stimulating grants than in protecting people from harm.
Their grant target was the Rockefeller Foundation, which was run by people whose interests were threatened by the spectre of competition from nuclear technology, in both energy production and other applications. Not surprisingly, those people did not explain their concerns by openly stating that they would benefit financially if they could find ways to slow the development of useful applications of nuclear technology.
The RF itself, even though ostensibly separate from the founding oil oligarch family, still derived a substantial portion of its income from its endowment, which was overwhelmingly – $405 M of $591 M (70%) – composed of stocks traceable to the Standard Oil Trust (p. 297-299).
The geneticists fabricated the LNT on the shaky ground of reported results from high dose, high dose rate experiments conducted on Drosophila (fruit flies), with the primary purpose of stimulating mutations. Those experiments had mostly been completed more than 30 years before the LNT was officially developed and applied to recommended radiation standards.
There was little confirmatory follow-up conducted at the time; other researchers had difficulty replicating the Drosophila findings, and Hermann Muller, the primary investigator, moved on to other aspects of genetic research with Drosophila.
Unfortunately, there doesn’t appear to have been a sustained effort to challenge the scientific basis for the LNT until 2009, when Edward Calabrese, a toxicologist at the University of Massachusetts Amherst who specializes in the effects of dose on biological responses, began trying to test the validity of no-threshold models for chemical exposure limits.
His paper titled “The road to linearity: why linearity at low doses became the basis for carcinogen risk assessment” was the beginning of a continuing effort to learn more about the science, the actions and the personalities involved in firmly implanting the LNT model into both regulations and public perceptions.
What evidence did radiation protection professionals use before adoption of the LNT model?
People began working with x-rays and radioactive isotopes like radium and its daughter products within months of their discovery in the late 1800s. The properties were fascinating and obviously useful.
There were early indications that radiation could be dangerous if people received too much exposure or ingested too large a quantity of radioactive material.
Radiologists learned quickly that they could get nasty burns if they spent too much time too close to their devices. Young ladies employed to use radium-laced phosphorescent paint learned not to lick their brushes and to use proper industrial hygiene practices.
People who found that patent medicines like Radithor made them feel good learned to limit their consumption to something less than the 1,400 bottles that caused Eben Byers to experience a painful, well-publicized demise.
In 1934, after more than three decades of practical experience and documented studies, the International Commission on Radiological Protection (ICRP) determined that humans working with x-rays and radioactive materials could tolerate a daily dose of 0.2 roentgens (R) (p. 87). They knew it would take ten times that dose to cause a slight reddening of the skin and even more before there were detectable changes in blood chemistry.
The tolerance dose was seen as a conservatively chosen threshold below which there would be no harm. Observant radiation professionals also reported evidence indicating that human physiology was able to recover from radiation injuries. If a worker exceeded the daily threshold, the standard – and effective – prescription was a few days without radiation exposure.
A worker who approached her daily tolerance dose for a full year would receive about 500 mSv, but the ICRP did not recommend any tracking systems or annual limits. The members assumed practitioners would follow their recovery recommendations to allow healing that would protect them from cumulative harm.
The 1934 ICRP tolerance dose is 10 times higher than today’s U.S. occupational radiation worker limits of 50 mSv/yr. It is 500 times higher than the 1 mSv/yr annual dose limit for a member of the general public.
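For readers who want to check the arithmetic behind these comparisons, here is a minimal sketch. The rough 1 R ≈ 10 mSv equivalence and the ~250 working days per year are my assumptions for the estimate, not figures from the ICRP documents.

```python
# Rough check of the tolerance dose comparisons above (a sketch; assumes
# 1 R is roughly 10 mSv for whole-body gamma and ~250 working days/year).
tolerance_r_per_day = 0.2   # 1934 ICRP daily tolerance dose
msv_per_r = 10              # rough conversion assumed for this estimate
working_days_per_year = 250

annual_dose_msv = tolerance_r_per_day * msv_per_r * working_days_per_year
print(annual_dose_msv)           # ~500 mSv/yr, as stated above
print(annual_dose_msv / 50)      # ~10x today's 50 mSv/yr U.S. worker limit
print(annual_dose_msv / 1)       # ~500x the 1 mSv/yr public limit
```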
The ICRP tolerance dose recommendation was the basis for radiation protection programs during the Manhattan Project. Historical records indicate that worker safety was a high priority and that project leaders were proud of the fact that there were only a small number of injuries caused by overexposure among the tens of thousands of project workers.
The project would have been impossible to complete under today’s radiation protection assumptions.
Rockefeller Foundation involvement in creating and institutionalizing the LNT
In 1954, soon after President Eisenhower gave his “Atoms for Peace” speech to the UN and promised the world that the U.S. was ready to make an effort to use nuclear energy for commercial power generation, the Board of Trustees of the Rockefeller Foundation decided that the American public wasn’t well informed about the effects of atomic radiation. (p. 506)
Quoting the RF Annual Report for 1955, under the heading of National Academy of Sciences Atomic Radiation (p. 190):
The effect of atomic radiation on living organisms has become a matter of urgent concern not only because of the development of nuclear weapons, but also because nuclear energy and radioactive materials will be increasingly used throughout the world for peaceful purposes, as for example in medical research and treatment, and in industrial power installations.
The RF Board determined that they would fund a survey of developed knowledge of biological effects of radiation. The Atomic Energy Commission – and the Manhattan Project before it – had been investing large sums of money into radiation effects research since 1940 and had consistently informed the public that the tolerance dose was providing effective protection.
Certain press outlets and activist groups had claimed that the AEC had an obvious conflict of interest in establishing radiation dose limits while also engaging in an aggressive nuclear weapons testing program. The involved journalists and activists avoided pointing out the obvious fact that numerous AEC scientists were personally accepting exposures at the high end of the allowable range without significant concerns.
The Board turned to Detlev Bronk, the President of the National Academy of Sciences, to ask if he could put a committee together. They were able to get an almost immediate answer, since Bronk was already sitting at the conference table; he was a member of the Board.
They asked Arthur Sulzberger if he would lead the communications effort to promote the results of the study once it was complete. Sulzberger, the publisher of the New York Times, was also a board member and readily agreed to spread the study results by providing prominent coverage.
The RF provided Bronk’s NAS with nearly $300,000 to cover the costs of organizing the BEAR I study and preparing the desired reports. It continued to be the sole source of support for the BEAR committee until the committee was disbanded in 1963. That year, the Atmospheric Test Ban Treaty was signed by the U.S. and the USSR.
For unknown reasons, the press seems to have ignored the potential conflict of interest caused by having a private, oil-baron-financed foundation request and fund a study of the health effects of a potentially formidable economic competitor to oil, coal and gas.
Stacking the committees
When he formed the six panels that made up the full Biological Effects of Atomic Radiation (BEAR) study group, Bronk selected Warren Weaver to lead the genetics panel. Weaver was a former mathematics professor who had been the man in charge of the RF’s research grants in biology, including genetics, since 1932. Under Weaver’s program in natural sciences, the RF provided a major portion of worldwide genetics research funding throughout the 1920s-1950s (p. 504).
According to a series of papers by Dr. Calabrese, the BEAR committee did not engage in much discussion about the appropriate model to use for determining radiation harm.
Hermann Muller had been given a Nobel Prize in 1946 for his 1926-27 work showing that high doses of radiation cause visible, inheritable changes that appeared to be mutations in Drosophila. He was a prickly little man who was known for aggressively defending the priority of his findings. By 1956, he had invested ten years striving to persuade medical professionals and health physicists to accept his certainty that there was no threshold for radiation harm to genetic material.
His efforts had not resulted in acceptance of his theory or any changes in radiation dose limits.
Once he was in a room full of fellow geneticists in a committee chaired by a long-time supporter of his – and their – work, he had little difficulty convincing the panel to adopt his model even though it was not the same as what he had published for his Nobel Prize-winning research.
As Calabrese has documented, the meeting minutes indicate that the NAS BEAR I genetics panel quickly settled on the LNT model. It then spent the remainder of its time discussing ways to determine the slope of the line to predict rates of genetic defects.
About a year later, Dr. Edward Lewis from Caltech published a paper titled “Leukemia and Ionizing Radiation” in the May 1957 issue of Science. His paper extended the predicted endpoint harm from genetic damage to cancer formation. Lewis and his employer had also received support from the RF via Weaver’s natural sciences program.
Was the LNT universally accepted?
Some of the people appointed to the first committee to officially apply the LNT model for radiation risk communication corresponded among themselves and freely admitted they were exaggerating both their level of understanding and the negative effects of small doses of radiation.
A couple of the participants even admitted in archived correspondence that they were attempting to appeal to established radiation genetics funding sources for increased support.
Muller, the most forceful proponent of the LNT model, received what may have been the most generous reward. Within months after the NAS BEAR genetics committee report was published, his program at Indiana was awarded a multi-year, $350,000 research grant from the Rockefeller Foundation (RF) (p. 28).
That year, the RF awarded grants totaling $991,000 for genetics research, roughly half of which went to support programs employing members of the NAS BEAR genetics committee.
Muller’s 1956 grant from the RF was large enough to support him, several graduate students and his young family for the rest of his life.
Why would a Nobel Laureate be tempted to abuse his science in order to exaggerate radiation risks?
In the fall of 1946, before Muller received his Nobel Prize, he was 56 years old, had a young second wife, a baby daughter, and no tenure, savings or retirement plan. He had spent most of the war years as a temporary instructor at Amherst College, filling in for professors who had joined the war effort. Despite his well-earned reputation as a skilled Drosophila laboratory researcher, he struggled as a classroom teacher. He was not rehired for the 1945-46 school year.
With the help of a short-term grant from the RF and the recommendation of a friend, he obtained a position at Indiana University. The grant was large enough to pay Muller’s salary and research costs, so the school was willing to take the risk of hiring a professor who had a sketchy background that included a known suicide attempt, a falling out with his original PhD mentor, association with socialist student groups, prewar research in Germany and the USSR, and a poor reputation for teaching skills.
After Muller was given his Prize, Indiana University was pleased that it had accepted the RF’s deal. The presence of a Nobelist on its faculty brought welcome credibility to the institution’s genetics program. Muller and his wife were understandably ecstatic when they received notice of his award; it provided the resources necessary to pay off debts along with providing a huge bump in professional standing.
Here is a thought-provoking quote from the lecture that Muller delivered on the occasion of his Nobel Prize.
Both earlier and later work by collaborators (Oliver, Hanson, etc.) showed definitely that the frequency of the gene mutations is directly and simply proportional to the dose of irradiation applied, and this despite the wave-length used, whether X- or gamma- or even beta-rays, and despite the timing of the irradiation. These facts have since been established with great exactitude and detail, more especially by Timoféeff and his co-workers. In our more recent work with Raychaudhuri (1939, 1940) these principles have been extended to total doses as low as 400 r, and rates as low as 0.01 r per minute, with gamma rays. They leave, we believe, no escape from the conclusion that there is no threshold dose, and that the individual mutations result from individual “hits”, producing genetic effects in their immediate neighborhood.
(Emphasis added.)
Aside: The bolded segments above show that Muller did not even pretend to have data from the radiation levels typically associated with nuclear plant operations. His lowest dose of 400 R is 80 times as high as the annual occupational worker limit in the U.S., and it was delivered to a creature that lives for only about 21 days. End Aside.
From that day on, Muller apparently never deviated in public from his assertions that there was no threshold for radiation harm. He was the first stubborn defender of the model, but he certainly wasn’t the last.
Future articles will discuss theories attempting to explain reasons why Muller’s model achieved a durable, difficult-to-overcome status.
Although I am preaching to the converted and have written too much, I would like to add the following:
Seen in its historical context, the LNT assumption WAS a logical development.
I have tried to analyze the background and the subsequent development at http://wp.me/p1RKWc-1lF
In the need to create fear, cancer was promoted as the result of radiation.
Here too, I have tried to look at the disturbing facts: http://wp.me/p1RKWc-1iq
This, and much more, led to the public misconception promoted mostly by Greenpeace.
At http://wp.me/p1RKWc-mu , I give a “bulletproof” assessment showing that Greenpeace’s credibility is a myth.
The result of this mess can be seen in Germany:
http://wp.me/p1RKWc-11F and http://wp.me/s1RKWc-90
But also in my native country, Denmark.
My conclusion is that we – the pro-nuclear community – have neglected to defend ourselves in the media.
Who will try to do something?
I have tried a little.
BUT
If you want, you may find some ammunition for your gun at http://wp.me/s1RKWc-41
Rod, Thanks for retelling this story, and for sequels.
Thorkil, Thanks for the added references.
Without the added requirements of LNT, could nuclear energy be an “Energy Cheaper than Gas”?
Without these added requirements, would there have been the recent cancellations of large nuclear projects, and could the extra savings have been enough to save some of the recent plants that have shut down?
Years ago I worked on a coal plant project with a few Stone and Webster guys who had worked at Fort St. Vrain. They told me about the creation of good designs. A project can have its deliverables 90 percent correct and things will probably go fine. They told me that getting that last 10 percent costs more than the first 90 percent. Is the “extra” protection afforded by LNT over that which is adequate for human health a similar increase in cost?
You ask, could ThorCon generate electricity “cheaper than gas”? That’s tough; I’d say competitive with gas. A natural gas combustion turbine is basically just a turbine and a generator. In a ThorCon plant we have a similarly priced (steam) turbine and generator, but in addition we must have the fission heat source comprising the fission reactor and heat exchanger(s) to make steam. So the higher capital cost of the additional equipment has to be balanced against the cost of fuel: natural gas versus (relatively cheap $/joule) uranium/thorium. In the case of LNG, the cost of liquefying CH4 and transporting it means ThorCon is cheaper than LNG.
A simple combustion turbine/generator (NGCT) ramps up/down quickly, so the renewables advocates like them, but they emit significant CO2, in excess of 454 g/kWh, which is why the EPA’s original CO2 limit plan was scrapped in favor of the complex clean power plan – to satisfy the wind/solar advocates’ requirements for fast backup (in my opinion). However, the more efficient NGCC (natural gas combined cycle) turbines, which emit less CO2, have the added capital cost of a heat exchanger to take the NG exhaust heat and make steam, and an additional steam turbine/generator. So ThorCon could be less expensive than NGCC. Only time will tell, so I say “competitive” with natural gas.
I am no fan of most foundations. However, didn’t Exxon (formerly Esso, one of the Standard Oil companies) enter the nuclear fuel business as did Gulf Oil?
Could the push to adopt LNT be attributable to the leftist political views of Muller, the RF et al.? I don’t know Bronk’s politics, but he apparently defended Communist Owen Lattimore from McCarthy (who was substantially correct). LNT would serve to hobble development of peaceful nuclear energy and create irrational fear of further nuclear weapons development, areas where the US had superiority over the Soviets in the immediate post-war era. Did the Soviets also adopt the LNT?
One book that addresses the subversive nature of charitable foundations is “The Foundations: Their Influence and Power” by Rene Wormser, who was a chief investigator of the Reece Committee, which was tasked by Congress in the 1950s to look into such matters. I have always wanted to read the book but have not yet done so.
https://www.amazon.com/Foundations-Their-Influence-Rene-Wormser/dp/1939438241/ref=sr_1_2?s=books&ie=UTF8&qid=1502230505&sr=1-2
@FermiAged
Tentative participation in a portion of the nuclear enterprise by an enormous oil and gas production company doesn’t prove anything. Neither does Nelson Rockefeller’s loudly professed support for making New York a key player in nuclear energy in the early days of the technology.
“In today’s world of scientific and engineering specialization, professionals in one field generally do not spend much time questioning experts in a different field. They trust that the experts have done their job and are providing accurate inputs.”
Hmmmmm…. Like, for instance, global warming?
I spend a fair amount of time quizzing the denialists of anthropogenic climate change (ACC) about their knowledge of the physics of infrared radiation transport of heat through the atmosphere.
They avoid any qualitative (let alone quantitative) discussion of this like the plague. Science has no place in their world view.
I fully understand the quantum mechanical explanations of CO2 molecules’ response to infrared radiation. I do not dispute the fact that CO2 and other gases can raise a planet’s atmospheric temperature above what it would be in their absence. In the global warming controversy, I challenge the way climate models are used to support claims that, without massive social restructuring and cutbacks in the use of fossil fuels, the climate will reach a cataclysmic state within a century.
9 ‘ice ages’ before industrial age. But we’re 100% sure we can control the climate. Let’s stop having kids, eating beef and send half our money to third world nations so they can prepare. We can hang out at Al Gore’s house on the weekends and enjoy some AC and his heated pool.
Noted accomplishments, low recognition, poor personal finances.
He sounds like the perfect target for the KGB’s foreign asset program. The Soviets loved preying on people like him. Now, I’m not saying he was an agent. However, I suspect his path might’ve been ‘nudged’ along a specific direction by outside forces.
Many if not most of the tax-exempt foundations supported causes diametrically opposed to the values of their founders. John D. MacArthur was very conservative, yet the MacArthur Foundation supports left-wing causes. Same with the Ford Foundation which Henry Ford II essentially disavowed. There are several other examples.
I had a post that appears lost where I pointed out that Exxon (Esso) and Gulf entered the nuclear business long after the LNT hypothesis was adopted. These ventures appeared to be efforts at making a profit or diversification. In some cases, nuclear energy was seen as a potential way to increase oil supply through efforts like Project Plowshare or using nuclear energy to supply heat for refineries.
I suspect the RF was not nearly as concerned about oil company competition as Rod’s hypothesis suggests. Their actions ARE consistent with other actions that support leftist objectives. I think the RF sponsoring left-wing researchers who may or may not have been conscious agents of the USSR makes more sense. I wonder if Gofman, Sternglass, Tamplin etc. received any tax-exempt foundation support.
Rod’s suggestion cannot be ruled out, however. The RF has been accused of nefarious activities in the field of medical school accreditation similar to Rod’s thesis about the LNT. Those interested should research the Flexner Report.
@FermiAged
I’m pretty sure that you are an engineering/science type and are thus naturally transparent, honest and consistent. Subtleties, disinformation, and strategic deception actions are probably foreign or incomprehensible to you.
Consider the following aphorisms and think about how they might be extended or applied to more fully understand why oil companies might decide to invest a small portion of their capital budget into “nuclear” industry.
1. If you can’t beat them, join them. (And then beat them from the inside.)
2. Keep your friends close and your enemies closer. (Pretending to be one of them is a great way to keep your enemies close.)
If giant successful companies try something and cannot make it work, small, less capitalized companies should stay away.
Here’s a story that is intended to stimulate critical thinking about oil and gas company historical interest in atomic energy.
https://atomicinsights.com/shell-oil-gas-companys-perspective-energy-future/
The elements that make Muller a good target for the Soviets’ intelligence program mean that he was a good target for any organization that wished to acquire him. In this case, it looks like he was the RF’s puppet, probably not the Soviets’.
Actually, the entry of petroleum companies in the nuclear field was more extensive than I thought. I found a memo of a congressional committee that discussed the matter:
http://www.geonius.com/family/dad/nuclear.html
Orwell said those who control the past control the future, so the past is your focus. Advancing good science brings more benefit. There is no convincing case for a generally applicable safe “threshold” for people, or you would present it front and center.
March 1996: “cancer incidence of male United States Air Force (USAF) aircrew (342 cancers, 532,980.97 man-years) with non-flying Air Force officers (827 cancers, 1,084,370.08 man-years) between 1975-89.. statistically significant excesses of aircrew cancers for all sites, testis, and urinary bladder. Previous studies ..may have been biased by the use of external comparison groups. ..we detected notable excess aircrew cancer risk for cancers of the testis, urinary bladder, and all sites combined.”
Cancer incidence in United States Air Force Aircrew, 1975–89.
( http://www.researchgate.net/publication/14370325_Cancer_incidence_in_United_States_Air_Force_Aircrew_1975-89 )
“We don’t know what causes most health problems that could be linked to radiation, including some forms of cancer and reproductive health issues like miscarriage and birth defects. If you are exposed to cosmic ionizing radiation and have these health problems, we can’t tell if it was caused by your work conditions or something else.
We don’t know what levels of cosmic radiation are safe for every person.”
( http://www.cdc.gov/niosh/topics/aircrew/cosmicionizingradiation.html )
November 2004: “epidemiologic data on cancer risks from eight cohorts of over 270 000 radiologists and technologists in various countries. The most consistent finding was increased mortality due to leukemia among early workers employed before 1950, when radiation exposures were high…”
Cancer Risks among Radiologists and Radiologic Technologists: Review of Epidemiologic Studies
( pubs.rsna.org/doi/abs/10.1148/radiol.2332031119?journalCode=radiology )
You left off this part:
“All other aviator cancer classifications were not significantly different from the comparison cohort; most notably, cancers of the colon and rectum, skin (both malignant melanoma and non-epithelial), brain and nervous system, Hodgkin’s Disease and leukemias. Previous studies of commercial pilots that demonstrated excesses of these cancers may have been biased by the use of external comparison groups.”
A difference of 100 cancers in a population of tens of thousands is statistical noise – about 1%.
Environmental factors easily explain the difference, e.g., different diets and sleep routines for pilots, etc.
Here’s one thing we do know, with scientific certainty, there is no difference in biological impacts between ionizing radiation from nuclear power sources and ionizing radiation from natural sources (or from medical, or air travel, etc..). Dose (rem) is dose.
Given this, the salient facts are that natural background levels vary by factors of several (or more) and no correlations between background levels and cancer rates have ever been established. And this is with enormous statistical samples: millions of people living in different areas with different natural background dose levels. Based on this alone, we *know* that radiation doses within the range of natural background have no measurable or remotely significant impact on cancer rates. Meanwhile, fossil fuels are killing millions annually and causing global warming….
Studies on much smaller groups of people, which are actually seeking to establish an effect from a specific source/agent (nuclear industry radiation), are nowhere near sufficient to refute the iron clad argument given above, especially given that even their predicted increases are tiny (e.g., ~1%). So many factors could explain a blip like that. Too small to measure, too small to matter, especially given the horrendous impacts of the (fossil) alternatives to nuclear.
@Pu239
I assume that you are referring to the initial article. If you read it again, you may recognize my position is that the 1934 recommendations from the ICRP worked well enough to protect people.
Your citing of epidemiology isn’t convincing considering the many uncertainties that get hidden in the statistical analysis and summarized conclusions. Statistical correlations are not evidence, especially when there is no real control of confounding factors.
Playing devil’s advocate here: LNT does have one advantage: it is easy to use and execute. Lots of data support hormesis, but it is tough to implement, especially from a regulatory viewpoint.
Perhaps a politically and scientifically reasonable compromise can be achieved.
Think Pareto. 80% of the problem with LNT is the exaggeration of small doses on large populations. The most extreme example: LNT predicts around 1 million deaths per year from natural background radiation!
This can be fixed with a threshold.
Based on the available data on humans and animals, a 2 mSv/day threshold appears reasonably defensible. So in the interest of being conservative, let’s set a strongly defensible 1 mSv/day threshold and treat the excess as LNT. So a linear-threshold model. It fixes the biggest problem, is easy to use, and is politically less finicky than hormesis. Seems like a good shot to me.
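To make the proposal concrete, here is a minimal sketch of such a linear-threshold model in Python. The 1 mSv/day threshold is the figure from the comment above; the slope (risk_per_msv) is a made-up placeholder just to show the shape of the function, not a recommended value.

```python
# A sketch of the proposed linear-threshold model: zero attributed risk
# below the threshold, LNT-style linearity applied only to the excess.
def excess_risk(dose_msv_per_day, threshold_msv_per_day=1.0, risk_per_msv=1e-5):
    """risk_per_msv is an illustrative placeholder slope, not a real value."""
    excess = max(0.0, dose_msv_per_day - threshold_msv_per_day)
    return excess * risk_per_msv

print(excess_risk(0.5))   # 0.0 -- below the threshold, no attributed risk
print(excess_risk(3.0))   # 2e-05 -- only the 2 mSv/day excess counts
```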
Cyril,
Hell, I’d be happy with a threshold of ~1 Rem/year (i.e., near the top of the range of natural background, for most places on earth).
That allows use of the ironclad argument I mentioned in another post (i.e., the absurdity of spending huge sums to reduce radiation levels within the range of background).
That by itself would have enormous impacts. Many applied dose limits are as low as a few mrem/yr (e.g., decommissioning standards for most exposed individual – not kidding).
For accidents, a high limit would need to be applied. The rationale being that limits for emergency situations should be higher than those applied for normal operations, etc.. (i.e., “deliberate” exposures). For accidents, perhaps 10 Rem/year. That would have made a huge difference at Fukushima.
Dose per year is difficult to defend; it doesn’t make much sense.
Dose should be set on a relevant biological repair timescale. This is done with medicine, e.g. daily or 6- or 8-hourly recommended maximums for aspirin. A limit of 1,000 aspirin pills a year would not make sense and could even be dangerous. Taking 1,000 pills in one day is rather different than 3 pills a day for a year, obviously. The former would kill you; the latter has nil effect on health despite being a bigger dose (3 × 365 = 1,095 pills).
We should be pushing daily limits, in my opinion.
I don’t think we need a different limit for accidents. But there is no need to evacuate an area even if doses above 1 mSv/day are expected. Evacuation is always worse in terms of human impact than a nuclear accident; even for the more serious ones like Fukushima this is the case. Rather, I would recommend a polluter-pays scheme where the nuclear plant (owner or operator, that is up for discussion) that causes greater than 1 mSv/day in an accident is forced to pay anyone receiving more than 1 mSv/day, for every additional mSv of dose.
When was the last time you were accidentally exposed to aspirin?
“When was the last time you were accidentally exposed to aspirin?”
Just this morning, Brian! And yesterday, and the day before, and just about every other day – when I had a glass of water from the tap.
It’s amazing how many pharmaceutical traces are in drinking water. Aspirin’s the least of my worries though. Such tiny amounts in the tap water.
When dose is so low that no harm is done, the accidental vs. voluntary dose argument is not relevant.
Cyril – This is not a discussion about homeopathy.
Cyril,
I’m not disagreeing with anything you say, from a scientific point of view. I’m just saying that, as justified as it may be, setting such high dose limits may be politically impossible. It seems to me that it would be much easier to get the public to go along with the notion that dose rates within the range of natural background are clearly not a problem. That is, if millions of people are happily living in certain areas of the planet that have always had annual doses of ~1 Rem/yr, without higher cancer rates, etc.., then people should be comfortable being exposed to doses lower than that. And such a limit is high enough to reduce most of the unnecessary costs, with the exception of accidents.
As for accidents, I suggested 10 Rem/yr because that’s the dose at which statistically significant increases in cancer rates *begin* to occur (according to what I’m hearing many people say). Given that, as Fukushima showed, many areas may be over 1 Rem/yr, the limit I suggested for normal operations is too low and would result in unnecessary over reactions.
Hmm, I don’t know, James; 1 mSv/day sounds lower to me than 100 mSv/year. 1 sounds less than 100, even though there are 365 days in a year.
What would you prefer: 99 mSv on the first minute of January 1 and then nothing for the rest of the year? Or 1 mSv per day for a year?
Having a per day limit rather than per year would be an extreme leap forward, in my opinion. Both from political viewpoint as well as scientifically.
@gmax137,
“You left off this part”
Because I was being limited by a comment length filter, which appears to give more space for selected individuals. Just one more way to control the “discussion” here.
@Pu239
If there is a way to adjust comment length on an individual user basis, I am not aware of it.
I make no secret of the fact that the discussion here is moderated, “controlled” if you will.
The 1st Amendment not only prevents Congress from passing laws that abridge the right of free speech, but it also gives everyone the freedom to peaceably assemble.
In my understanding, that means we can get together with like minded people if we want and we can disinvite others.
As structured, the best analogy is not invited guests gathered in your private living room in a quaint little town where you need not lock your doors. The better analogy is a rented booth along a public sidewalk in or near the International Nuclear Shopping Mall. As a “for-profit, tax-paying, publishing company” on the interwebs, you and your company are subject to many more international laws and regulations than just the 1st Amendment of the US Constitution and how it applies to small private groups.
>I am not aware
This excuse is also “adored by little statesmen and philosophers and divines,” along with “a foolish consistency.” But really, who can fully understand the workings of the site software…
>I make no secret of the fact that the discussion here is moderated
You also don’t “distribute accurate information” about this topic or related policies, to my knowledge.
@Pu239
Are you calling me a liar?
Why do you consider a web site to be analogous to a rented booth on a public sidewalk?
Please identify the laws that you think I’m not paying attention to. Would your identification be any different if Atomic Insights was organized as an “environmental charity,” which is how The Economist described Greenpeace in a recent article.
The policies, site rules on commenting, and details on occurrences of (and reasons for) deleted or rejected comments are not clearly posted, as they are on some sites. If you want to extrapolate that to lying by omission or something, that’s up to you. Ignorance of how your own site software works is not a great excuse, but is understandable.
Public sidewalks allow anyone to walk by; this site allows anyone to surf by. You presumably rent your site hosting. You engage with your visitors/customers much like someone at a promotional booth would, giving out opinions and having discussions, and the donation jar or hat is visible to all.
There are more anti-discrimination and publishing regulations in the world than I’ll ever know, and I’d bet a lawyer could make a case of some kind, in some jurisdiction, if motivated. Sorry, I didn’t have anything too specific in mind, other than very arbitrary suppression of views you don’t support.
Having studied the BEIR VII report, I find that there is no statistical evidence in it for LNT for solid cancers. The statistical inference method used was biased in favor of the standing hypothesis, LNT. For leukemia, even the method used favored a quadratic relationship for low doses.
A properly done Bayes factor analysis would show the same for solid cancers. In any case, BEIR VII did not consider actual low dose data as there was none known to the committee. More recent studies suggest a hormetic effect at low dose rates.
Rod, I think you meant 80 times………in your aside.
‘His lowest level of 400 R is 8 times as high as the annual occupational worker… ‘.
you’re welcome.
Thanks, as always, for the clear and interesting synopsis on the LNT. I’m sad to see nuclear slowly dying while most people seem to see it as a good thing or simply don’t care. I’m sure it’s the ideas behind LNT that drive public perception, even if the general public doesn’t even know what it is.
The demand for near perfect protection from all radiation (including radiation levels that are a fraction of natural background) is indeed the main cost driver for nuclear. However, while LNT should be rejected, that may not be enough by itself.
I’ve asked (before), if LNT were replaced by a threshold model, would NRC react by saying that meltdowns are OK? While there are some costs associated with low dose limits, the majority of nuclear’s costs are due to (extreme) meltdown prevention efforts. This is not tied to any dose limits, but instead a conviction that any significant release (or accident) is “unacceptable” and “must never happen”. Thus the standards of perfection.
And the real truth is that much of this conviction is based on what the public reaction will be to any such release. Look at what happened in Japan. The public is essentially demanding that coal (which is thousands of times more dangerous and harmful) be used instead, even though Fukushima caused few if any deaths. It is true that sane dose limits would have greatly reduced most of Fukushima’s “impacts” (long term evacuations, etc..), but still, it’s not like NRC would abandon efforts to prevent meltdowns at all costs. Getting that to change is an effort distinct from rejecting LNT.
This hits the nail on the head. The FAA does not say airlines must never crash, the NHTSA does not say there must be no car accidents, and the FDA does not require new drugs to have no side effects. I can imagine if the FCC regulated like the NRC it would require each new version of the iPhone to have an analysis showing how the technology advancements will not increase the likelihood of the AI “singularity”.
Exactly, Robert. In the article, Rod talks about how everything is required to be based on the assumption of the worst-conceivable release, or the worst conceivable set of events in general, dreamed up by people (regulators, etc..) with very active imaginations. Whether there is actually any significant probability of such things happening in the real world is not considered. Then you have to spare no expense ensuring that that set of events is impossible, even if the actual likelihood of it happening is negligible.
NRC may try to respond that if you rigorously show that the probability is negligible, they will accept that. But therein lies the rub. The exhaustive analyses they require (to prove such a negative) are so difficult and expensive that it’s easier to just swallow the absurd, bounding assumption and deal with it. I know this because I used to do nuclear analyses. I’ve made plenty of unrealistic (“bounding”) assumptions for precisely that reason.
continued….
My impression is that no other industries work this way. At the risk of being simplistic, they are simply allowed to proceed (w/ few if any regulations). Then, *when* something bad happens, specific regulations to address that accident or event (based on the evaluation of what went wrong) are put in place. In other words, it is all completely reactive. They let experience in the field determine what bad results actually have any real chance of happening. No money is spent on avoiding things that, while theoretically possible, experience shows never happen in the real world.
My impression is partly based on how I often hear newspapers referring to a technology (e.g., drones or self-driving cars) “getting ahead of regulations”. Think about what that implies. Those things (drones, etc..) were simply allowed to be deployed, w/o and before development of any significant regulations. (We later hear about some regulations being passed, reactively, in response to events.)
Here’s another example. My understanding is that the *majority* of man made chemicals/materials that are used in consumer products, etc.., have *never* had any testing or epidemiological studies to demonstrate that they are not harmful to human health. That is, wide scale use was allowed w/o demonstration of these materials’ safety. Again, it’s all reactive at best.
Also, as I’ve said before, the use of LNT isn’t as much of a problem as its selective application. Current policy does not demand near perfect protection from all radiation. It only demands that for radiation from the nuclear power or weapons industries.
According to LNT, negative impacts (deaths, etc..) scale directly with *collective exposure*. Well, the truth is that mankind’s collective exposure from nuclear power, including accidents, is negligible compared to other sources of exposure (natural background, medical, air travel, etc..). I think nuclear power’s share may be as low as one millionth of man’s collective exposure. Despite this, almost no efforts are being made to reduce those vastly larger collective exposures. Suffice it to say that the expense per man-Rem avoided imposed on the nuclear power industry is thousands to millions of times what is being spent on reducing other sources of exposure.
Pointing out this clear double standard should be a more unassailable approach than scientifically questioning LNT. And it should be more than sufficient to do the job. Stated as simply as possible, how can one justify going to extreme lengths to reduce nuclear industry public radiation exposures well within the range of natural background, while doing nothing at all to reduce any other sources of exposure?
I often bring this up: collective human population exposure to natural background radiation is at least 7 billion times 0.002 Sv/year = 14 million person-sieverts per year. LNT predicts 1 cancer per 10 person-sieverts, so 1.4 million cancers. That would put natural background radiation in the top 10 causes of death worldwide, at about 10% of all cancers; and background radiation varies by a large factor globally. Therefore, LNT expects a significant correlation between background radiation levels and cancer rates. This correlation clearly does not exist. By deduction, LNT is wrong. No further use of LNT is needed: the hypothesis is already rejected.
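A quick sketch of that arithmetic, using the round numbers quoted above (the 1-cancer-per-10-person-sievert slope is the LNT figure cited in the comment, not an endorsed value):

```python
# Checking the collective-dose arithmetic above with the quoted round numbers.
population = 7e9                  # world population
background_sv_per_year = 0.002    # ~2 mSv/yr average natural background
cancers_per_person_sv = 0.1       # LNT slope quoted above: 1 cancer per 10 Sv

collective_dose = population * background_sv_per_year   # person-Sv per year
predicted_cancers = collective_dose * cancers_per_person_sv

print(f"{collective_dose:.2g} person-Sv/yr")    # 1.4e+07 = 14 million
print(f"{predicted_cancers:.2g} cancers/yr")    # 1.4e+06 = 1.4 million
```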
Telling the public that cancer risk scales directly with radiation exposure, all the way down to zero, is bad enough, but the message chosen by nuclear opponents that there is “no safe dose” of radiation is far more pernicious. They try to claim that their carefully chosen phrase is merely an expression of LNT, but they know better.
They know that while LNT says:
High dose = high risk
Low dose = low risk
Negligible dose = negligible risk
the public interprets “no safe dose” to mean:
Even negligible dose = significant risk
To think of it graphically, imagine a plot of excess cancer risk vs. radiation dose, where the Y value of the line is not zero at X=0 (as LNT would suggest), but instead has a significant positive value at X=0. (That is, the Y-intercept is significantly positive.)
That is how the public interprets “no safe dose”. Something that is “not safe” does not equate to negligible risk. It means significant risk (even at ~zero dose). A complete lie, even if LNT were true.
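To make the graphical point concrete, here is a toy sketch contrasting the two readings; the slope and intercept values are purely illustrative numbers, not fitted risk coefficients.

```python
# Toy dose-response lines contrasting LNT with the public's "no safe dose"
# reading of it (slope and intercept are illustrative numbers only).
def lnt_risk(dose):
    return 0.05 * dose              # passes through the origin: risk -> 0

def perceived_risk(dose):
    return 0.05 * dose + 0.1        # positive Y-intercept: "risky" even at 0

print(lnt_risk(0.0), perceived_risk(0.0))   # 0.0 vs 0.1 at zero dose
```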
Can someone here do a review of the following paper:
http://www.nature.com/leu/journal/v27/n1/full/leu2012151a.html?foxtrotcallback=true
I don’t have access to it. It seems to claim a significant increase in leukemia in children from background gamma radiation. It appears they didn’t look at total cancers or total morbidity/mortality of the children, though; if so, that’s a serious flaw of the study.
BTW, I’m posting this concerning the proposed Fukushima release of tritiated water, in case any of you find it helpful.
I was told that there is a total of ~1.7 quadrillion Bq of tritium that is now distributed within 777,000 tons of water (in the tanks). I did some calcs based on the ingestion dose conversion factor for tritium, of 1.73E-11 Sv/Bq (from EPA Federal Guidance Report #11).
The calcs show that if a person drank 1 liter of water directly out of the tanks (i.e., no dilution, in the Pacific, etc..), they would get a dose of ~4 millirem. The average person drinks ~1 liter of water (or other fluids) per day. So, even if I got all my annual hydration from the water in the Fukushima tanks, my annual dose would be ~1.5 rem, which is not much higher than the range of natural background and far too small to have any health impact.
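Here is a minimal sketch of that unit conversion, using the figures quoted above (the inventory, tank volume and FGR #11 dose factor are the inputs from this comment; treating 1 tonne of water as 1,000 liters is my assumption):

```python
# Reproducing the tritiated-water dose estimate above from the quoted inputs.
total_activity_bq = 1.7e15     # ~1.7 quadrillion Bq of tritium in the tanks
water_tonnes = 777_000         # tonnes of water holding that tritium
dcf_sv_per_bq = 1.73e-11       # ingestion dose factor, EPA FGR No. 11

liters = water_tonnes * 1000                    # 1 tonne of water ~ 1,000 L
bq_per_liter = total_activity_bq / liters       # tank concentration

dose_mrem_per_liter = bq_per_liter * dcf_sv_per_bq * 1e5  # 1 Sv = 1e5 mrem
annual_dose_rem = dose_mrem_per_liter * 365 / 1000        # 1 L/day for a year

print(f"{dose_mrem_per_liter:.1f} mrem per liter")  # ~3.8 mrem (~4 as stated)
print(f"{annual_dose_rem:.1f} rem per year")        # ~1.4 rem (~1.5 as stated)
```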
Please spread the above facts far and wide. I have to ask, how come nobody has tried to put this whole “issue” to bed by drinking water directly out of the tanks, in front of TV cameras, etc…
This whole episode is an example of how often things depart from any scientific basis (or dose limits) at all, whether the basis is LNT or not.
Very good!
Interesting trivia to add: there are no harmful bacteria in the water either, as it has been gamma sterilized by the reactor and subsequently filtered extensively…
Another bit of trivia is the water is orders of magnitude safer to drink than seawater…
“Water, water everywhere and not a drop to drink.”
Water, water everywhere,
Nor any drop to drink
There is another explanation that deserves examination.
There was a widespread horror within the scientific community towards nuclear weapons. At that time nuclear weapons were being tested via above ground explosions and these explosions were injecting radioactive materials into the atmosphere.
Unfortunately for these activist scientists, the amount of radiation from nuclear testing was insufficient to cause alarm given the 1950’s radiation standards. It was only after the radiation standards were tightened that the air testing of nuclear weapons became an issue.
Some support for this theory is provided by the quote:
“The RF provided Bronk’s NAS with nearly $300,000 to cover the costs of organizing the BEAR I study and preparing the desired reports. It continued to be the sole source of support for the BEAR committee until the committee was disbanded in 1963. That year, the Atmospheric Test Ban Treaty was signed by the U.S. and the USSR.”
@Stephen Duval
There were many reasons why the no threshold dose model was created. The existence of one does not preclude the existence of others.
The historical record, however, suggests that well connected people with economic interests were laying the groundwork for excessive fear of radiation long before there was an atmospheric testing program. Stories of the Radium Girls and Eben Byers received sensationalistic coverage in major newspapers in the late 1920s and early 1930s. Hermann Muller was chosen to receive the Nobel Prize in physiology or medicine in 1946. He used the world stage provided by that award to make a firm declaration of the “no safe dose” assumption even while admitting that the lowest dose he had used in experiments was 400 R.
https://www.nobelprize.org/nobel_prizes/medicine/laureates/1946/muller-lecture.html
The RF began its generous support of Hermann Muller in the 1930s. Without the support of the RF, Muller would have been driving a truck or performing some other kind of menial labor not requiring any security clearance during WWII. Instead, he was placed as an instructor at Amherst College, which was given financial support to cover his salary. The reasons he couldn’t otherwise locate a university job or a war-related research job are well documented, so I will not repeat them here.