Recently an Atomic Insights reader shared a document that inspired a new line of thinking about the chronology of atomic energy development.
The inspirational document was a PDF copy of a chapter titled Little Red Schoolhouse from Freeman Dyson's memoir, Disturbing the Universe. It was a brief tale about a memorable burst of creativity in the company of 30-40 free-thinkers who spent the summer of 1956 in San Diego in a rented school building conceiving new ideas for marketable nuclear power sources and research tools.
In addition to Dyson, the group included Ted Taylor and Edward Teller. They had been called together by Frederic de Hoffmann, who had spent the fall of 1955 persuading the top management of the General Dynamics corporation that the time had arrived for the company to get involved in the commercial development of atomic energy products.
“Freddy” de Hoffmann had a solid basis for his recommendation. He had just returned from the first Atoms for Peace conference held in Geneva, Switzerland in August 1955. In addition to attending the conference, de Hoffmann had been one of the two American members of the group of 17 international experts chosen to curate the technical program for the conference by reviewing and selecting the papers to be presented. In that position he saw not only the ideas that made the cut to be presented, but others that were in earlier stages of maturity.
Dyson reported that he was invited to join the group based on a previous encounter with Edward Teller, even though he had never had anything to do with nuclear energy. He accepted the invitation because he wanted the chance to work with Teller and because he had carried a vision about the potential of almost unlimited quantities of atomic energy since 1937. His vision did not come from a fictional work like H. G. Wells’s classic The World Set Free, but from a book of lectures by one of the leading scientists in Great Britain that he had read as a boy growing up in Winchester.
Here is how Dyson described the impact of that book on his thinking.
Eddington the astronomer, in the book New Pathways in Science, which I read as a boy in Winchester, not only warned us against nuclear bombs but promised us nuclear power stations. Here is the happier side of his vision of the future:
We build a great generating station of, say, a hundred thousand kilowatts capacity, and surround it with wharves and sidings where load after load of fuel is brought to feed the monster. My vision is that some day these fuel arrangements will no longer be needed; instead of pampering the appetite of the engine with delicacies like coal and oil, we shall induce it to work on a plain diet of subatomic energy. If that day ever arrives, the barges, the trucks, the cranes will disappear, and the year’s supply of fuel for the power station will be carried in a tea-cup.
This vision had always remained vivid in my mind, together with the warning against the military use of subatomic energy which appears a few pages later in the book. Eddington used the word “subatomic” to describe what we now call nuclear or atomic energy. We all knew even in 1937 that the world would soon run out of coal and oil. The possible availability of nuclear energy to satisfy the peaceful needs of mankind was one of the few hopeful prospects in a dark period of history.
(Source: Dyson, Freeman, Disturbing the Universe, p. 94)
Like the vast majority of the world’s population, Dyson believed that developing an abundant, incredibly dense source of energy would be a boon to mankind. Like so many of the straight-talking people I talk or correspond with, however, he appears to have been unable to envision how the people who own or operate “the barges, the trucks, the cranes,” or who control access to the “delicacies like coal and oil” might react to protect themselves from being made obsolete.
I had heard about Eddington and his prediction about subatomic energy before reading that passage. Almost two years ago, I mentioned him in a post titled Smoking Gun Research Continuing in Earnest, describing how he had given an address at the 1930 meeting of the World Power Conference. His talk included a phrase so memorable in the collective mind of the energy industry leaders that it still appears prominently on the history page of the World Energy Council.
In his address, Eddington said that in the future “subatomic energy would provide the plain diet for engines previously pampered with delicacies like coal and oil.”
I’d searched in vain to find the full talk, but Dyson’s mention gave me the clue I needed. A few keystrokes, a click or two, and $10 later I was able to begin reading Eddington’s lectures to find out more about the discoveries that had given him the ability — well before 1930 — to confidently predict in print and at international gatherings of businessmen and politicians that subatomic energy existed, could be released for either vast good or harmful weapons, and would someday reduce the market’s appetite for coal and oil.
Despite my decades of interest in atomic energy and its history, I had not thought much about the fact that a whole segment of the scientific community was absolutely sure that “subatomic” energy could be released based on their studies of heat and light production from stars.
I have called it a vision; but to the astronomer it means much more than an extravagant flight of theory. We look up at the sky and our telescopes show a thousand million stars. Everyone of those is a celestial furnace which apparently defies the law that limits our terrestrial undertakings–that if you do not continually replenish your furnace it will die out. Geological, physical, biological evidence seems to make it certain that the sun has warmed the earth for more than a thousand million years; but the calculation first made by Kelvin still stands incontrovertible that the sun’s heat cannot have been maintained for more than twenty million years unless it is being fed from some secret store of energy of a kind unknown in his day. By all ordinary rules the sidereal universe which we see blazing with light should have long since been cold and dead. None of the sources of power utilised by our present civilization could have kept it alive for more than a small fraction of the time it is known to have existed. It seems then quite plain that the “cup of water” method of maintenance is actually in operation in the stars, or that there is some partial adaptation of it. To the engineer the prolific liberation of subatomic energy is a Utopian dream; to the physicist it is a pleasant speculation; but to the astronomer it is just a common well-recognized phenomenon which it is his business to investigate.
(Source: Eddington, Sir Arthur, M.A., D.Sc., LL.D., F.R.S., New Pathways in Science, Messenger Lectures delivered to Cornell University, 1934. Printed in Great Britain. p. 137)
A more cynical man — like me — might have included a mention that “to a coal or oil businessman, the prolific liberation of subatomic energy is an existential nightmare which it is his business to block as long as possible.”
For astronomers, subatomic energy was a demonstrated fact; they had no doubt that it could be made to work on Earth someday.
By 1956, when Dyson and his visionary friends were devising new ways to beneficially use atomic energy, other scientists and engineers had discovered how to release and control atomic energy. There were submarines plying the world’s oceans using nuclear energy and several reactors had already produced useful electricity for grid distribution.
It was during that same hopeful summer that the Rockefeller Foundation-sponsored National Academy of Sciences committee on the Biological Effects of Atomic Radiation (BEAR) was putting the finishing touches on a report with “findings” and recommendations that continue to discourage people from replacing coal and oil with atomic energy. As the BEAR Genetics subcommittee’s heavily promoted report concluded:
We ought to keep all of our expenditures of radiation as low as possible. Of the upper limit of ten roentgens suggested in Recommendation C, we are at present spending about one-third for medical X-rays. We are at present spending less–probably under one roentgen–for weapons testing. We may find it desirable or even almost obligatory that we spend a certain amount on atomic power plants. But we must watch and guard all our expenditures. From the point of view of genetics, they are all bad.
A similar committee working in Great Britain, another nation whose wealth and political power was tied to petroleum production and financing, produced a similar report at the same time.
People who regularly read Atomic Insights have heard this story before, but it’s important to review the highlights.
The Genetics Committee, one of six subcommittees of the BEAR committee, was chaired by Warren Weaver, the director of natural science programs for the Rockefeller Foundation. His committee included 12 carefully selected geneticists, at least 7 of whom had received significant financial support from his program at the RF.
Hermann J. Muller, a Nobel Prize winning scientist who owed his career to Rockefeller Foundation grants and academic endorsements, came to the first meeting of the committee prepared with an assertion that there was no threshold for damage from radiation. He had been on the stump with that assertion for at least a decade. Unbeknownst to almost everyone, that assertion had been disproven by experiments that he oversaw.
In order to defend his position regarding the lack of a threshold for radiation damage, he had cooperated with several others, including Curt Stern, to obscure the results of those experiments. (Calabrese, E.J., How the US National Academy of Sciences misled the world community on cancer risk assessment: new findings challenge historical foundations of the linear dose response, Arch Toxicol (2013) 87 p 2064)
The BEAR Genetics committee members deferred to Muller’s authority and spent the rest of their meetings devising methods to estimate the slope of the line. Their estimates of radiation risk varied by almost three orders of magnitude. Some of the committee members refused to provide estimates because there was so little experimental data available and what little was available provided massively divergent results. The disagreements among the committee members were obscured in the final report by discarding the low effects estimates and by failing to note the refusal of some of the members to provide an estimate. (Calabrese, E.J., On the origins of the linear no-threshold (LNT) dogma by means of untruths, artful dodges and blind faith, Environmental Research, 142 (2015) p. 437)
Though there is far more of the story to tell, a few more facts are worth noting in this brief summary. The effort by the NAS to study and produce a publicly promoted report on the biological effects of atomic radiation was initiated and funded by the Rockefeller Foundation. As the sole funding source for the studies, which lasted from 1954 to 1962, the RF provided the study charges to the National Academy of Sciences.
They stacked the deck by placing the man in charge of their grant program in the role of Chairman of a key subcommittee and providing several of their steadily supported scientists to serve on the committee. They achieved a desirable–for them–result of producing a clearly worded, extremely credible report that instilled enough fear of radiation to permanently hamper the development of the technology.
The fear produced a predictable call for multiple layers of protection and approvals that were never applied to technology with greater proven harm, all because a committee of their creation had asserted that all doses of radiation cause harm even if that predicted harm couldn’t ever be detected. There is nothing like an unseen, undetectable, incurable agent that might lead to damage to distant generations to cause long-term psychosis.
The RF had all of the means, motive and opportunity needed to create the fear, uncertainty and doubt campaign. There is documentary evidence that the participants in the creation of the key 1956 report were at least partly motivated by their desire for increased funding. There is also indication that they were promised that those increases would follow their agreement to go along with the report. (Calabrese, E.J., On the origins of the linear no-threshold (LNT) dogma by means of untruths, artful dodges and blind faith, Environmental Research, 142 (2015) p. 438)
Muller, the leading participant in the effort to create the “no safe dose” assertion, was immediately rewarded when the Rockefeller Foundation gave his employer, Indiana University, a grant of $350,000 in 1956. That grant brought the total funding from the RF to Muller’s genetics research group at IU to almost $700,000. That was serious money in 1956. This is about as close as one can get to quid pro quo evidence for an event that happened close to 70 years ago. (Ref: Rockefeller Foundation Annual Report, 1956 p. 117)
It’s time to discard the no threshold model and to recognize that there is no evidence that supports the hypothesis that all radiation, no matter how low the dose, is bad. In fact, there is a growing body of evidence to support the contrary result. At certain doses and dose rates, ionizing radiation stimulates adaptive protection systems. When received at a dose rate on the order of 0.1-0.2 cGy/day, it is an antioxidant that makes us healthier.
Until the publication of the carefully crafted, economically motivated report of the NAS BEAR I genetics committee, the low dose stimulatory effect of radiation had been recognized and accepted by nearly all radiation scientists. That acceptance was based on observational evidence of human responses, not on falsified reports of high dose experiments conducted on fruit flies.
Rod, watching your mind at work is a pleasure. Keep it up!
You state “There is nothing like an unseen, undetectable, incurable agent that might lead to damage to distant generations to cause long-term psychosis.”
This seems at odds with most studies on people’s ability to assess and address long-term existential threats. I know you consistently point out efforts to sustain the fear, uncertainty, and doubt (FUD) related to the linear no-threshold (LNT).
I would really like to see you work with some of your contacts to produce either a report or a book piecing this all together in clear terms. Preferably something I could hand to people when they tell me how wonderful their solar panels are for the environment.
Thank you very much for your efforts,
That’s a great link. It shows how the economically motivated propaganda techniques in use today haven’t changed much from the campaign I described to deny the science of subatomic energy and create doubt in the public mind about the benefits of replacing the delicacies of coal and oil with a far less resource intensive source of almost unlimited power.
I’ve never thought of referring to H.J. Muller as a “gadfly” who created controversy out of thin air, but the description actually fits pretty well.
Superb article on the historical context and the dirty propaganda war waged by competing interests.
When received at a dose rate on the order of 0.1-0.2 Gy/day, it is an antioxidant that makes us healthier.
This needs further explanation. While the dose rate is smaller than that for a CT scan (10 mGy in minutes), getting this for a day, or day after day after day, would not be healthy.
While I do share your belief that radiation dangers have been massively overstated for the benefit of the commercial interests you so well describe, I do not think it is helpful to go to the other extreme and declare large radiation doses harmless.
Remember that we need to reach out to many who are skeptical of or hostile towards nuclear energy. Changing your own belief takes a lot of effort, and people will use any mental excuse to avoid this. So the 0.1-0.2 Gy/day remark at the end will come as a relief to any anti who has read this far. The response it triggers is: “0.1-0.2 Gy/day, that can’t be right, so the whole article must be wrong.”
Thank you for pointing out my very important typo – that should have been 0.1-0.2 cGy (0.1-0.2 R)
A factor of 100 is a big deal. I’m sorry for the error.
This is still very high, especially as a chronic level it would add up to 350 to 700 mSv a year. I think calls to relax radiation limits or safety standards are highly counterproductive. No airline would argue that aviation safety regulations should be reduced. Obviously, no airline has to fight with a regulator littered with people who work diligently on ploys to shut down the whole industry.
So I do understand your frustration with regulation that is not introduced to increase safety, but solely as a punitive measure to increase cost.
Such measures are not only about radiation and safety, but also on bogus environmental concerns, like cooling tower requirements for seawater cooled plants. I think this can only be tackled by insisting that the competition is measured by equally high standards, not by lowering your own standards.
There aren’t any safety features on modern nuclear plants that I would call superfluous. The QA requirement for secondary parts might be something that should be removed. Concrete and rebar that is good enough for dams, bridges and skyscrapers should be good enough for reactor components. A diesel generator good enough for a hospital should be good enough for an NPP.
Might be worth mentioning that the teacup analogy is informed by Einstein’s E=m*c^2.
A 100 MW power station (300 MW thermal) will in a year consume a mass of m = E/c^2 = 3e8 J/s * 3e7 s / (3e8 m/s)^2 = 0.1 kg = 1 teacup.
Complete transformation of mass into energy would require a stash of antimatter, which seems to be hard to obtain. Complete fissioning of uranium in a fast breeder would yield approximately a thousandth of this, so the 100 MW power station would need around 100 kg of fuel in a year.
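For anyone who wants to check the arithmetic, here is a minimal sketch in Python using the same round numbers as above (3e8 m/s for c, 3e7 s for a year, 300 MW thermal; the one-thousandth fission fraction is the same order-of-magnitude assumption as in the comment):

```python
# Mass-energy check for the teacup figure, E = m * c^2,
# using the same round numbers as the comment above.
c = 3e8                  # speed of light, m/s
power_thermal = 3e8      # 300 MW thermal output, in watts (J/s)
seconds_per_year = 3e7   # roughly one year in seconds

energy = power_thermal * seconds_per_year   # J released in a year: 9e15 J
mass = energy / c**2                        # kg of mass converted: 0.1 kg

# Fission converts only about a thousandth of the fuel's mass to energy,
# so complete fissioning needs roughly 1000x that mass of uranium.
fission_fraction = 1e-3                     # assumed, order of magnitude
fuel_mass = mass / fission_fraction         # about 100 kg of fuel per year

print(f"{energy:.1e} J -> {mass:.1f} kg mass, ~{fuel_mass:.0f} kg fuel")
```

The 0.1 kg figure is what Eddington's "tea-cup" corresponds to; the factor-of-1000 step is why a real fission plant still needs on the order of 100 kg of fuel rather than a teacup.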
As an example for recent carefully targeted fossil fuel funded philantropy, you may find this interesting:
Qatar Friendship Fund (QFF) opens first mega-project, a 24.3 million USD multifunctional fishery processing facility in Onagawa, Japan
Note that a restart of the Onagawa nuclear plant, which rode out earthquake and tsunami without problems, would displace $600 million worth of LNG annually. Given that Qatar is the world LNG leader, we can expect vocal opposition from the local fishermen, most likely complaining that routine tritium releases would endanger their business of hauling the last fish from the rapidly emptying ocean.
I suspect that the jobs at the nuclear plant are much better-paying and more desirable than the fishing jobs.
There are a lot of fishermen in the area; it’s the main traditional work, and fear of radiation made it a lot harder to sell their fish. RRMeyer totally hits the spot here.
The jobs might be higher paying, but the economy in Japan is very much dependent upon the fishing industry. Overall, far more jobs are created by the fishing industry than NE. Actually, my assertion, presented as fact, is somewhat presumptive. I should say that I would assume that far more people are employed by the fishing industry than are by the NE industry.
But putting the blame squarely on the fishing industry, in respect to the implication they would create an “excuse” for not starting this NPP is somewhat shallow. Sentiment globally has a distrust, even if ill advised, of fish products suspected of being exposed to radiation. The fishermen prefer to sell their product, and would be more inclined to disregard the exposure, or be open to discrediting the alleged “dangers” of eating fish that have been exposed.
In truth, the startup DOES affect the ability of these fishermen to sell their product. They aren’t creating the FUD. They are reacting to their consumer’s belief in the FUD. Blaming the fishermen is somewhat ill advised. Blame those who create the alleged FUD, that damages the marketability of the fishermen’s product.
One of Rod’s podcasts had a guest from a Californian energy agency who spoke about the history of anti-nuclear activism in California. Does anybody remember the details? I recall the guest saying that a fishing town became very angry when they heard a nuclear power plant was going to be built nearby and didn’t want their locale ruined.
‘ ..the economy in Japan is very much dependent upon the fishing industry. Overall, far more jobs are created by the fishing industry than NE.’ In fact, Japan ran balance of trade surpluses for thirty years, till 2012, when they closed a third of their electricity production and started importing huge volumes of coal and gas to replace the power from the reactors.
‘Between 1980 and 2010 Japan recorded a trade surplus every year. But since the Fukushima nuclear disaster in March 2011, imports have surged due to the weakening of the Japanese yen and increased purchases of fossil fuels and gas. As a result in 2014 trade deficit was the worst on record.’
But as Rod often points out, higher costs for Japan mean higher profits for the guys selling the fuel.
As for the real or perceived contamination of fish, the only one above Japan’s limit for radiocesium in the last few years was caught within the Fukushima Daiichi breakwater, which has been cut off from the ocean. Blue water fish have higher levels of radiation from cold-war era bomb testing, and much higher levels from naturally occurring polonium, radon and potassium. In contrast, mercury from burning coal has been steadily raising levels of mercury in fish, especially in apex predators like tuna and dolphin, but the Japanese seem quite blasé about that – the government only sets limits for inshore fish.
‘A Health Ministry survey in 2005, for instance, found an average of 0.7 parts per million of mercury in bluefin tuna, and the highest concentration found was a startling 6.1 parts per million – more than 15 times the limit for other types of seafood. The Japanese government has issued advisories warning pregnant women and young children to limit their consumption, but mercury does not seem to be a high priority for officials.’
Poa: If the fishermen in Fukushima wanted to fight the FUD, they would point out that
1) Fish are carefully monitored and all seawater fish (even those caught next to the plant) have been below the extremely strict 100 Bq Cs/kg standard in the last couple of months.
2) In the original accident 3.5PBq (=3.5e15Bq) Cs-137 was released directly to the ocean. Any additional releases that might be happening now are by a factor of a million to a billion lower and have no chance to make any change to the radiological situation.
Instead, they have for over a year prevented Tepco from releasing 4000 tons of water that was pumped from the ground on the hillside of the plant and carefully decontaminated to meet international drinking water standards. Yes, that is right, Tepco is not allowed to release water that is safe to drink into the ocean.
They are also preventing any release of contaminated water that has been stripped of all radioisotopes except tritium. Tritium does not bioaccumulate and would instantly dilute to completely harmless (and even undetectable) levels if released. La Hague releases 10 times as much tritium every year as is stored at Fukushima, without any environmental impact.
By claiming that these releases might be dangerous, the fishermen are adding to the FUD. I guess they are mainly motivated by getting as much compensation as possible out of Tepco. But I bet that fossil fuel interests buying influence, like in Onagawa, would also have an impact.
I had some problems with the link. If others have trouble, here is a link to the article:
It’s been my experience that the warm water released from nuke plants attracts fish. The warm water released from the condenser has a lot of fish food within. Fishermen, at least sport fishermen (OK, fisher-persons), flock to outlets at power plants. Enhancing the productivity of the water may be considered a good thing. I doubt whether vastly watered down tritium releases would have any discernible effect on the fish population.
I noticed the ‘typo’ in Gy/day when I converted it to 20 rem per day. So I knew it was not right.
I picked up 1 rem in half an hour in the S3G prototype S/G, eddy current testing in 1970. No problem, of course. The 0.002 Gy/day is also 73 rem per year, so it’s up there a bit, but still would not bother me.
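The unit conversions in this exchange are easy to trip over, so here is a small sketch of the arithmetic (assuming the standard relations: 1 Gy = 100 rad, rad ≈ rem for gamma radiation, and 1 rem = 10 mSv):

```python
# Dose-rate unit conversions used in this thread.
# For gamma radiation, 1 rad is roughly 1 rem; 1 Gy = 100 rad; 1 rem = 10 mSv.

def gy_per_day_to_rem_per_year(gy_per_day):
    """Convert a daily dose in gray to an annual dose in rem (gamma)."""
    rad_per_day = gy_per_day * 100   # 1 Gy = 100 rad
    return rad_per_day * 365         # rad/yr, roughly rem/yr for gamma

# The typo as originally printed: 0.2 Gy/day is about 20 rem per day.
rem_per_day_typo = 0.2 * 100 / 100  # 0.2 Gy/day -> 20 rad/day -> ~20 rem/day
print(0.2 * 100, "rem/day for the uncorrected 0.2 Gy/day figure")

# The corrected figure, 0.2 cGy/day = 0.002 Gy/day, works out to ~73 rem/yr,
# matching the comment above.
print(gy_per_day_to_rem_per_year(0.002), "rem/yr")

# The corrected range 0.1-0.2 cGy/day in mSv/yr (about 365-730 mSv/yr):
low = gy_per_day_to_rem_per_year(0.001) * 10
high = gy_per_day_to_rem_per_year(0.002) * 10
print(low, "-", high, "mSv/yr")
```

This also shows why the factor-of-100 typo mattered: 0.2 Gy/day would be roughly 20 rem per day, far beyond any plausibly beneficial chronic dose rate.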
Below are two links to newly released investigative reporting. The report is not nuclear related, and introducing it here runs a risk of “polluting” your narrative with the topic of climate change. However, if corroborated, I think you will find the information provided to be directly and strongly supportive of your general hypothesis.
Thanks for the great links. Though some may see it as off topic, I don’t. It supports the premise that hydrocarbon suppliers take action to protect their business, even if it means slanting science.
I thought you might appreciate them. I had heard about this previously. But this reporting appears to be reasonably comprehensive and properly documented.
If this reporting is accurate (and I tend to think it is), it does far more than indicate “that hydrocarbon suppliers take action to protect their business, even if it means slanting science.”
I suggest that this material could be the basis for an important chapter in your book.
I agree with Rod, great links.
Commenting on the original BEAR report, Rod Adams wrote:
A similar committee working in Great Britain, another nation whose wealth and political power was tied to petroleum production and financing, produced a similar report at the same time.
The sordid story of the BEAR report and its linear, no-threshold theory has been well covered in this blog. It might be interesting to learn the story of the British committee and its similar conclusions. It would not surprise me if shady deals of the same kind were involved.
Well – here’s another thought that is slightly off topic. It fits in with the conspiracy thinking. It seems rather curious that the oil companies have become awash in natural gas and oil in a rather sudden manner. Horizontal drilling and fracking do not seem so revolutionary that they wouldn’t have been around for a while. A lot of work was going on to find fuels that will serve as alternatives to gasoline. I’m sure the oil companies were keeping a close eye on this situation. What would the oil companies do to keep such an alternative from coming to market? Flooding the world with cheap oil would eliminate such competition for a generation. Cheap natural gas isn’t helping the promulgation of nuclear power, for example.
But in our world of competitive pricing driven by pure market forces, this scenario can’t happen can it?
I believe that the correct term you are searching for is “price war.” Here is a sampling of articles that might interest you.
For some reason, the only price war that most of the media has recognized recently is the one that the Saudis are waging on the frackers and other non-conventional oil producers.
Rod’s right about the price war. But there’s another element to this too: the “associated gas” coming up along with the tight oil in the Bakken and elsewhere had to go somewhere, and in many places it was illegal to flare it. Even if the price gotten for it was minimal, the oil paid for the well so the gas went for whatever it would fetch.
This situation is not going to last long. As the tight-oil companies go bankrupt with sub-$50 oil, they’ll stop completing new wells and likely shut in some old ones. The surfeit of gas may quickly turn into a scarcity, and the price could go back toward the 2008 peak again. I expect this to happen before the Vogtle and Summer plants go on-line, and the CEOs who commissioned them are going to look like prophets.
You are right about the drilling. Sometimes I read T. Boone Pickens statements. He has reported the same. Less supply will mean higher prices.
In my area, the local coal generation station is being closed down and will be replaced with a gas cogeneration facility. This is happening across the country. More demand will mean a higher price. I do not believe the shift to natural gas over other fuels is confined to the power industry.
Rod has reported previously about the major natural gas exporting facilities being constructed. Less supply means a higher price.
ENRON was originally a gas pipeline company. I guess they bankrupted PG&E with their shenanigans. Those links seem to be pointing to something similar, or at least that the glut is temporary. Good bantering with Kenneth llindsey. If it seems too good to be true……
I suspect the issue of “price war” or “market share” is much broader.
Why did the KSA choose to act at this time? It seems likely the production from US shale and tight oil contributed to their decision. However, if market share was their true concern, it is interesting to note that there are multiple other “new sources” beginning to impact the market. These include:
Non-corn based biofuels and biochemicals are beginning to have commercial import. In particular, the biochemicals are higher margin products replacing oil based incumbents. While still relatively low production volume, the weakness of these products is production cost. A good time to hit them with price competition.
Gas to liquids. Multiple smaller GTL systems are coming on the market. These have the potential to convert substantial amounts of currently flared gas to useful, cost effective products.
Large gas projects. While US gas production may decline somewhat (I am not sure I believe that), several very large gas projects are about to come on stream, for example Australia.
Gas to ethane. This has the potential to have an important impact on oil use. Presently most ethane is co-produced or cracked from oil. Siluria appears to have an alternative that could be much lower cost. Interesting that Saudi Aramco is a major investor in this start-up.
Technology that could lower the cost of certain oil sands production and enable production in previously unlikely locations. Most notable, if proven, is technology from MCW.
My point is that the oil prices that had been in place for roughly the last decade were sufficiently high, and had been in place for sufficient duration, that they were supportive of multiple alternate competitive alternatives to fossil oil.
OPEC (notably the KSA) needed to act at this time.