Arguments about whether nuclear energy should play a large role in our future energy mix often founder on the issues of cost and schedule. Even the most stubborn nuclear energy advocates cannot ignore the fact that nuclear plant construction projects have struggled with cost and schedule performance.
All too many of those poorly performing projects were never turned around and were eventually cancelled or converted to other fuel sources.
The effort expended to protect people from exposure to ionizing radiation is perhaps the single biggest driver of cost and schedule challenges associated with the design, licensing, construction, operation, maintenance and security posture of nuclear power plants. The high cost of the effort is driven by the demand for nearly perfect protection from all radiation.
That search for perfection mandates consideration of all imaginable events that could affect systems containing radioactive material or components that are relied upon to provide containment of radioactive material in the event of a failure in those systems. If there is a way to add a layer of protection that reduces the probability of a release, it is often added without much consideration of the cost.
If there is any uncertainty about how much material can potentially be released by any particular sequence of events, the default assumption is that 100% of the maximum amount of material that could possibly be in the system will be released in the worst imagined situation.
There are times when there has been enough testing and quality assurance to convince regulators that releases will be limited to a smaller portion of the inventory, but the reviewers are professional skeptics who demand solid proof.
As instruments have become more sensitive, enabling the measurement of ever tinier quantities of radiation or radioactive materials, the drive to achieve the lowest possible doses has been made even more expensive.
Isn’t radiation a natural part of our environment? Why do regulators demand near perfect protection?
Despite a voluminous and growing body of contrary evidence, a significant portion of the population still believes that all ionizing radiation, down to single gamma rays or high energy electrons, is capable of causing cancer. They believe that harm that is detectable at high doses delivered rapidly can be the basis for projecting harm from low doses delivered over a lifetime of exposure. They even believe that a dose that might cause harm if absorbed by a single recipient will cause the same amount of harm even if spread to millions of recipients.
This “no threshold” assumption of potential harm has been enshrined as the basis for the regulatory treatment of radiation. It is embodied in rules that demand “cost is no object” efforts to keep doses as low as reasonably achievable and standards for material purity from radioactive isotopes that sometimes attempt to keep human exposures over future millennia to tiny fractions of normal background radiation.
The “no safe dose” assumption is the basis for precautionary response plans that recommend forcible relocations for entire area populations to “protect” them. The estimated doses that trigger these actions are 1/10th of the lowest doses where scientists have found evidence of increased risk of cancer during a lifetime after exposure.
The assumption that all radiation is hazardous is the root cause of the passion that activists bring to battles opposing all uses of nuclear energy and even all uses of radiation to diagnose medical conditions, treat afflictions and sterilize food.
Anecdote illustrating blanket LNT acceptance by the nuclear safety community
Recently, I attended a presentation given by one of the world’s leading nuclear safety experts. One of his slides was a graph to be used in a safety analysis decision process. The horizontal axis of the graph was estimated radiation dose in units of rem. An audience member asked whether the numbers were for acute doses or doses spread out over time – hour, day, month, year or lifetime.
The nuclear safety expert – who is nearing the end of a long and distinguished career in positions of direct responsibility for nuclear safety or advice to those with regulatory responsibility – could not answer the question. It seemed as if he really wanted to ask the questioner why it mattered.
With the “no safe dose” model, radiation safety professionals have officially told nuclear safety professionals that dose rate does not matter, that all doses need to be carefully tracked and recorded, and that all harm is cumulative.
In today’s world of scientific and engineering specialization, professionals in one field generally do not spend much time questioning experts in a different field. They trust that the experts have done their job and are providing accurate inputs.
The public may have some wariness about trusting experts, and activists sometimes claim unwarranted expertise, but real subject matter experts are often quite trusting of their fellow credentialed experts. In fact, some professions have ethics codes that encourage members to trust other experts in areas outside their field of expertise.
What was the basis for selecting the linear, no-threshold (LNT) model for radiation harm?
Though it’s convenient, mathematically simple and well established in regulatory assumptions, the LNT is a scientifically unsupportable model. It cannot be proven with empirical evidence and it can only be said to be “consistent” with epidemiological data that is so scattered at low doses that almost any line can be drawn through data points with an equal mathematical fit.
The LNT was initially bred and propagated by geneticists. Those geneticists left some letters in historical archives that indicate they were more interested in stimulating grants than in protecting people from harm.
Their grant target was the Rockefeller Foundation, which was run by people whose interests were threatened by the spectre of competition from nuclear technology, both in energy production and in other applications. Not surprisingly, those people did not explain their concerns by openly stating that they would benefit financially if they could find ways to slow the development of useful applications of nuclear technology.
The RF itself, even though ostensibly separate from the founding oil oligarch family, still derived a substantial portion of its income from its endowment, which was overwhelmingly – $405 M of $591 M (roughly 70%) – composed of stocks traceable to the Standard Oil Trust (pp. 297-299).
The geneticists fabricated the LNT on the shaky ground of reported results from high dose, high dose rate experiments conducted on Drosophila (fruit flies), with the primary purpose of stimulating mutations. Those experiments had mostly been completed more than 30 years before the LNT was officially developed and applied to recommended radiation standards.
There was little confirmatory follow-up conducted at the time; other researchers had difficulty replicating the Drosophila findings, and Hermann Muller, the primary investigator, moved on to other aspects of genetic research with Drosophila.
Unfortunately, there doesn’t appear to have been a sustained effort to challenge the scientific basis for the LNT until 2009, when Edward Calabrese, a toxicologist at the University of Massachusetts Amherst who specializes in the effects of dose on biological responses, began trying to test the validity of no-threshold models for chemical exposure limits.
His paper titled “The road to linearity: why linearity at low doses became the basis for carcinogen risk assessment” was the beginning of a continuing effort to learn more about the science, the actions and the personalities involved in firmly implanting the LNT model into both regulations and public perceptions.
What evidence did radiation protection professionals use before adoption of the LNT model?
People began working with x-rays and radioactive isotopes like radium and its daughter products within months of their discovery in the late 1800s. The properties were fascinating and obviously useful.
There were early indications that radiation could be dangerous if people received too much exposure or ingested too large a quantity of radioactive material.
Radiologists learned quickly that they could get nasty burns if they spent too much time too close to their devices. Young ladies employed to use radium-laced phosphorescent paint learned not to lick their brushes and to use proper industrial hygiene practices.
People who found that patent medicines like Radithor made them feel good learned to limit their consumption to something less than the 1400 bottles that caused Eben Byers to experience a painful, well publicized demise.
In 1934, after more than three decades of practical experience and documented studies, the International Commission on Radiological Protection (ICRP) determined that humans working with x-rays and radioactive materials could tolerate a daily dose of 0.2 roentgens (R) (p. 87). They knew it would take ten times that dose to cause a slight reddening of the skin and even more before there were detectable changes in blood chemistry.
The tolerance dose was seen as a conservatively-chosen threshold below which there would be no harm. Observant radiation professionals also reported evidence indicating that human physiology was able to recover from radiation injuries. If a worker exceeded the daily threshold, the standard – and effective – prescription was a few days without radiation exposure.
A worker who approached her daily tolerance dose for a full year would receive about 500 mSv, but the ICRP did not recommend any tracking systems or annual limits. The members assumed practitioners would follow their recovery recommendations to allow healing that would protect them from cumulative harm.
The 1934 ICRP tolerance dose is 10 times higher than today’s U.S. occupational radiation worker limit of 50 mSv/yr. It is 500 times higher than the 1 mSv/yr annual dose limit for a member of the general public.
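The comparison above is simple unit arithmetic. Here is a minimal sketch of it, assuming the common approximation of 1 R ≈ 1 rem ≈ 10 mSv for gamma exposure and roughly 250 working days per year:

```python
# Rough dose arithmetic for the 1934 tolerance dose versus modern limits.
# Assumptions: 1 R ~ 1 rem ~ 10 mSv (gamma approximation), 250 working days/yr.
R_TO_MSV = 10.0

daily_tolerance_msv = 0.2 * R_TO_MSV        # 0.2 R/day -> 2 mSv/day
working_days = 250                          # assumed working days per year
annual_tolerance_msv = daily_tolerance_msv * working_days  # ~500 mSv/yr

occupational_limit_msv = 50.0               # current U.S. worker limit, mSv/yr
public_limit_msv = 1.0                      # general-public limit, mSv/yr

print(annual_tolerance_msv)                           # 500.0
print(annual_tolerance_msv / occupational_limit_msv)  # 10.0
print(annual_tolerance_msv / public_limit_msv)        # 500.0
```

The exact working-day count is an assumption for illustration; the article’s “about 500 mSv” figure falls out of the same arithmetic.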
The ICRP tolerance dose recommendation was the basis for radiation protection programs during the Manhattan Project. Historical records indicate that worker safety was a high priority and that project leaders were proud of the fact that there were only a small number of injuries caused by overexposure among the tens of thousands of project workers.
The project would have been impossible to complete under today’s radiation protection assumptions.
Rockefeller Foundation involvement in creating and institutionalizing the LNT
In 1954, soon after President Eisenhower gave his “Atoms for Peace” speech to the UN and promised the world that the U.S. was ready to make an effort to use nuclear energy for commercial power generation, the Board of Trustees of the Rockefeller Foundation decided that the American public wasn’t well informed about the effects of atomic radiation. (p. 506)
Quoting the RF Annual Report for 1955, under the heading National Academy of Sciences Atomic Radiation (p. 190):
The effect of atomic radiation on living organisms has become a matter of urgent concern not only because of the development of nuclear weapons, but also because nuclear energy and radioactive materials will be increasingly used throughout the world for peaceful purposes, as for example in medical research and treatment, and in industrial power installations.
The RF Board determined that they would fund a survey of the developed knowledge of biological effects of radiation. The Atomic Energy Commission – and the Manhattan Project before it – had been investing large sums of money in radiation effects research since 1940 and had consistently informed the public that the tolerance dose was providing effective protection.
Certain press outlets and activist groups had claimed that the AEC had an obvious conflict of interest in establishing radiation dose limits while also engaging in an aggressive nuclear weapons testing program. The involved journalists and activists avoided pointing out the obvious fact that numerous AEC scientists were personally accepting exposures at the high end of the allowable range without significant concerns.
The Board turned to Detlev Bronk, the President of the National Academy of Sciences, to ask if he could put a committee together. They were able to get an almost immediate answer, since Bronk was already sitting at the conference table. He was a member of the Board.
They asked Arthur Sulzberger if he would lead the communications effort to promote the results of the study once it was complete. Sulzberger, the publisher of the New York Times, was also a board member and readily agreed to spread the study results by providing prominent coverage.
The RF provided Bronk’s NAS with nearly $300,000 to cover the costs of organizing the BEAR I committee and preparing the desired reports. It continued to be the sole source of support for the BEAR committee until the committee was disbanded in 1963. That year, the Atmospheric Test Ban Treaty was signed by the U.S. and the USSR.
For unknown reasons, the press seems to have ignored the potential conflict of interest caused by having a private, oil-baron-financed foundation request and fund a study of the health effects of a potentially formidable economic competitor to oil, coal and gas.
Stacking the committees
When he formed the six panels that made up the full Biological Effects of Atomic Radiation (BEAR) study group, Bronk selected Warren Weaver to lead the genetics panel. Weaver was a former mathematics professor who had been the man in charge of the RF’s research grants in biology, including genetics, since 1932. Under Weaver’s program in natural sciences, the RF provided a major portion of worldwide genetics research funding throughout the 1920s-1950s (p. 504).
According to a series of papers by Dr. Calabrese, the BEAR committee did not engage in much discussion about the appropriate model to use for determining radiation harm.
Hermann Muller had been given a Nobel Prize in 1946 for his 1926-27 work showing that high doses of radiation cause visible, inheritable changes that appeared to be mutations in Drosophila. He was a prickly little man who was known for aggressively defending the priority of his findings. By 1956, he had invested ten years striving to persuade medical professionals and health physicists to accept his certainty that there was no threshold for radiation harm to genetic material.
His efforts had not resulted in acceptance of his theory or any changes in radiation dose limits.
Once he was in a room full of fellow geneticists in a committee chaired by a long-time supporter of his – and their – work, he had little difficulty convincing the panel to adopt his model – even though it was not the same as what he had published for his Nobel Prize-winning research.
As Calabrese has documented, the meeting minutes indicate that the NAS BEAR I genetics panel quickly settled on the LNT model. It then spent the remainder of its time discussing ways to determine the slope of the line to predict rates of genetic defects.
About a year later, Dr. Edward Lewis from CalTech published a paper titled “Leukemia and Ionizing Radiation” in the May 1957 issue of Science. His paper extended the predicted endpoint harm from genetic damage to cancer formation. Lewis and his employer had also received support from the RF via Weaver’s natural sciences program.
Was the LNT universally accepted?
Some of the people appointed to the first committee to officially apply the LNT model for radiation risk communication corresponded among themselves and freely admitted they were exaggerating both their level of understanding and the negative effects of small doses of radiation.
A couple of the participants even admitted in archived correspondence that they were attempting to appeal to established radiation genetics funding sources for increased support.
Muller, the most forceful proponent of the LNT model, received what may have been the most generous reward. Within months after the NAS BEAR genetics committee report was published, his program at Indiana was awarded a multi-year, $350,000 research grant from the Rockefeller Foundation (RF) (p. 28).
That year, the RF awarded grants totaling $991,000 for genetics research, roughly half of which went to support programs employing members of the NAS BEAR genetics committee.
Muller’s 1956 grant from the RF was large enough to support him, several graduate students and his young family for the rest of his life.
Why would a Nobel Laureate be tempted to abuse his science in order to exaggerate radiation risks?
In the fall of 1946, before Muller received his Nobel Prize, he was 56 years old, had a young second wife, a baby daughter, and no tenure, savings or retirement plan. He had spent most of the war years as a temporary instructor at Amherst College, filling in for professors who had joined the war effort. Despite his well-earned reputation as a skilled Drosophila laboratory researcher, he struggled as a classroom teacher. He was not rehired for the 1945-46 school year.
With the help of a short-term grant from the RF and the recommendation of a friend, he obtained a position at Indiana University. The grant was large enough to pay Muller’s salary and research costs, so the school was willing to take the risk of hiring a professor who had a sketchy background that included a known suicide attempt, a falling out with his original PhD mentor, association with socialist student groups, prewar research in Germany and the USSR, and a poor reputation for teaching skills.
After Muller was given his Prize, Indiana University was pleased that it had accepted the RF’s deal. The presence of a Nobelist on its faculty brought welcome credibility to the institution’s genetics program. Muller and his wife were understandably ecstatic when they received notice of his award; it provided the resources necessary to pay off debts along with providing a huge bump in professional standing.
Here is a thought-provoking quote from the lecture that Muller delivered on the occasion of his Nobel Prize.
Both earlier and later work by collaborators (Oliver, Hanson, etc.) showed definitely that the frequency of the gene mutations is directly and simply proportional to the dose of irradiation applied, and this despite the wave-length used, whether X- or gamma- or even beta-rays, and despite the timing of the irradiation. These facts have since been established with great exactitude and detail, more especially by Timoféeff and his co-workers. In our more recent work with Raychaudhuri (1939, 1940) these principles have been extended to total doses as low as 400 r, and rates as low as 0.01 r per minute, with gamma rays. They leave, we believe, no escape from the conclusion that there is no threshold dose, and that the individual mutations result from individual “hits”, producing genetic effects in their immediate neighborhood.
Aside: Muller’s own words show that he did not even pretend to have data from the radiation levels typically associated with nuclear plant operations. His lowest level of 400 R is 80 times as high as the annual occupational worker limit in the U.S. It was delivered to a creature that only lives for 21 days. End Aside.
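The 80-to-1 ratio in the aside can be checked with the same unit arithmetic, assuming the common approximation of 1 R ≈ 1 rem and the current U.S. occupational limit of 5 rem/yr (50 mSv/yr):

```python
# Muller's lowest experimental dose versus today's U.S. occupational limit.
# Assumption: 1 R ~ 1 rem for gamma exposure.
muller_lowest_dose_rem = 400.0   # "total doses as low as 400 r" from the Nobel lecture
annual_limit_rem = 5.0           # current U.S. occupational limit (50 mSv/yr)

print(muller_lowest_dose_rem / annual_limit_rem)  # 80.0
```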
From that day on, Muller apparently never deviated in public from his assertions that there was no threshold for radiation harm. He was the first stubborn defender of the model, but he certainly wasn’t the last.
Future articles will discuss theories attempting to explain reasons why Muller’s model achieved a durable, difficult-to-overcome status.