Some lessons were learned from TMI. Others were not.

Three Mile Island from the air

On March 28, 1979, a little more than thirty-five years ago, a nuclear reactor located on an island in the Susquehanna River near Harrisburg, Pennsylvania, suffered a partial core melt.

On some levels, the accident that became known as TMI (Three Mile Island) was a wake-up call and an expensive learning opportunity for both the nuclear industry and the society it was attempting to serve. Some people woke up, some considered the event a nightmare that they would do anything to avoid repeating, and some hard lessons were properly identified and absorbed. Unfortunately, some people learned the wrong lessons and some of the available lessons were never properly interpreted or assimilated.

The melted fuel remained inside the TMI unit 2 pressure vessel, nearly all the volatile and water-soluble fission products remained inside the reactor containment, and there were no public health impacts. The plant was a total loss after just three months of commercial operation, the plant buildings required a clean-up effort that took 14 years, the plant owner went bankrupt, and the utility customers paid dearly for the accident.

The other unit on the same site, TMI-1, continues to operate well today under a different owner.

Although the orders for new nuclear power plants had already stopped several years before the accident, and there were already people writing off the nuclear industry’s chances for a recovery, the TMI accident’s emotional and financial impacts added another obstacle to new plant project development.

In the United States, it took more than 30 years to finally begin building new nuclear power plants. These plants incorporate some of the most important lessons in their design and operational concepts from the beginning of the project development process. During the new plant construction hiatus, the U.S. electricity industry remained as dependent as ever on burning coal and burning natural gas.

Aside: A description of the sequence of events at TMI is beyond the scope of this post. There is a good backgrounder—with a system sketch—about the event on the Nuclear Regulatory Commission’s web site. Another site with useful information is Inside TMI Three Mile Island Accident: Moment by Moment. End Aside.


The TMI event was the result of a series of human decisions, many of which were made long before the event or in places far from the control room. Of those decisions, there were some that were good, some that were bad, some that were reactions based on little or no information, and many made without taking advantage of readily available information.

One of the best decisions, made long before the event happened, was the industry’s adoption of a defense-in-depth approach to design. From the very beginning of nuclear reactor design, responsible people recognized that bad things could happen, that it was impossible to predict exactly which bad things could happen, and that the public should be protected from excess exposure to radioactive materials through the use of multiple barriers and appropriate reactor siting.

The TMI accident autopsy shows that the basic design of large pressurized water reactors inside sturdy containment buildings was fundamentally sound and adequately safe. As intended by the designers, the defense-in-depth approach and generous engineering margins allowed numerous things to go wrong while still keeping the vast majority of radioactive materials contained away from humans. Here is a quote from the Kemeny Commission report:

We are convinced that if the only problems were equipment problems, this Presidential Commission would never have been created. The equipment was sufficiently good that, except for human failures, the major accident at Three Mile Island would have been a minor incident.

Though it is not well-known, the NRC completed a study called the State of the Art Reactor Consequences Analysis (SOARCA aka NUREG-1935) that indicated that there would be few, if any, public casualties as the result of a credible accident at a U.S. nuclear power plant, even if there were a failure in the containment system.

One of the most regrettable aspects of TMI was that the heavy investment that the United States had made into the infrastructure for manufacturing components and constructing large nuclear power plants—factories, equipment, and people—was mostly lost, even though the large components and basic design did what they were supposed to do.

There were, however, numerous lessons learned about specific design choices, control systems, human machine interfaces, training programs, and information sharing programs.

Emergency core cooling

The Union of Concerned Scientists and Ralph Nader’s Critical Mass Energy Project had been warning about a hypothetical nuclear reactor accident for several years, though it turns out that they were wrong about why the emergency core cooling system did not work as designed.

The core damage at TMI was not caused by a failure of the cooling system to provide adequate water in the case of a worst case condition of a double-ended shear of a large pipe; it was caused by a slow loss of cooling water that went unnoticed for 2 hours and 20 minutes. The leak, in this case, was a stuck-open relief valve that had initially opened during a loss of feedwater accident.

While the slow leak was in progress, the operators purposely reduced the flow of water from the high pressure injection pumps, preventing them from performing their design task of keeping the primary system full of water when its pressure is low.

It’s worthwhile to understand that the operators did not reduce injection flow by mistake or out of malice. They did what they had been trained to do. Their instructors had carefully taught them to worry about the effects of completely filling the pressurizer with water because that would eliminate its cushioning steam bubble. Their instructors and the regulators that tested them apparently did not emphasize the importance of understanding the relationship between saturation temperature and saturation pressure.

The admonition to avoid “going solid” (filling the pressurizer with water instead of maintaining its normal steam bubble) was a clearly communicated and memorable lesson in both classroom and simulator training sessions. When TMI control room operators saw pressurizer level nearing or exceeding the top of its indicating range, they took action to slow the inflow of water. At the time, they had still not recognized that cooling water was leaving the system via the stuck open relief valve.

The physical system had responded as it had been designed, but the designers had neglected to ensure that their training department fully understood the system response to various conditions that might be expected to occur. It’s possible that the designers did not know that a pressurizer steam space leak could cause pressure to fall and the pressurizer level to rise at the time that they designed the system. There was not yet much operating experience; the large plants being built in the 1960s and 1970s could not be fully tested at scale, and computer models have always had their limitations, especially at a time when processing power was many orders of magnitude lower than it is today.
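The saturation relationship the operators had not been trained to apply can be sketched numerically. Here is a minimal illustration using the widely published Antoine correlation for water; the constants below are textbook approximations, not plant-grade steam-table data:

```python
def water_sat_pressure_atm(temp_c: float) -> float:
    """Approximate saturation pressure of water (atm) at temp_c (degC),
    using the Antoine correlation with constants for roughly 100-374 degC.
    Textbook approximation only -- not plant-grade steam-table data."""
    A, B, C = 8.14019, 1810.94, 244.485  # Antoine constants (P in mmHg, T in degC)
    p_mmhg = 10 ** (A - B / (C + temp_c))
    return p_mmhg / 760.0  # mmHg -> atm

# Sanity check: water boils at 100 degC at about 1 atm.
print(f"100 degC: {water_sat_pressure_atm(100.0):.2f} atm")

# Near typical PWR pressurizer temperature (~345 degC), saturation pressure
# is on the order of 150 atm (~2,250 psia). If primary pressure falls below
# the saturation pressure for the coolant temperature, water flashes to
# steam -- so falling pressure with rising pressurizer level points to a
# steam-space leak, not a system that is about to "go solid."
print(f"345 degC: {water_sat_pressure_atm(345.0):.0f} atm")
```

The point of the sketch: at a fixed coolant temperature there is exactly one pressure at which water and steam coexist, so pressure falling toward saturation warns of boiling in the primary system no matter what the pressurizer level indicator shows.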

There was also a generally accepted assumption that safety analysis could be simplified by focusing on the worst case accident.  If the system could be proven to respond safely to the worst case conditions, the assumption was that less challenging conditions would also be handled safely. The focus on worst case scenarios, emphasized by very public emergency core cooling system hearings, took some attention away from analyzing other possible scenarios.

Lessons learned

  • Following the TMI accident, there was a belated push to complete the loss of flow and loss of coolant testing program that the Atomic Energy Commission had initiated in the early 1960s. For a variety of political, financial, and managerial reasons, that program had received low priority and was chronically underfunded and behind schedule.
  • Today’s plant designs undergo far more rigorous testing programs and have better, more completely validated computer models.
  • Far more attention has been focused on the possible impact of events like “small break” loss of cooling accidents.
  • All new operators at pressurized water reactors learn to understand the importance of the relationship between saturation pressure and saturation temperature.

At the time of the accident, there was no defined system of sharing experiences gained during reactor plant operation with all the right people. TMI might have been a minor event if information about a similar event at Davis-Besse, a similar but not identical plant, that happened in September 1977 had made it to the control room staff at TMI-2.

Certain sections of the NRC knew about the Davis-Besse event, engineers at the reactor supplier knew about it, and even the Advisory Committee on Reactor Safeguards was aware of the event, but there was no established process for sharing the information with other operating units.

Lesson learned: After the accident, the industry invested a great deal of effort into a sustained program to share operating experience.

The plant designers also did not do their operators any favors in the design and layout of the control room. Key indicators were haphazardly arranged, there were thousands of different parameters that could cause an alarm if out of their normal range, and there was no prioritization of alarming conditions.

Lesson learned: After the accident, an extensive effort was made to improve the control rooms for existing plants and to devise regulations that increased the attention paid to human factors, man-machine interfaces, and other facets of control room design. All plants now have their own simulators that are designed to mimic the particular plant and are provided with the same operating procedures used in the actual plant. Operators are on a shift routine that puts them in the simulator for a week at a time every four to six weeks.

The initiating failures that started the whole sequence took place in the steam plant, a portion of the power plant that was not subject to as much regulatory or design scrutiny as the portions that were more closely associated with the nuclear reactor and its direct cooling systems.

Lesson still being learned: An increased level of attention is now paid to structures, systems, and components that are not directly related to a reactor, but there is still a confusing, expensive, and potentially vulnerable system that attempts to classify systems and give them an appropriate level of attention.

For at least 10 years prior to March 28, 1979, there had been an increasingly active movement focused on opposing the use of nuclear energy, while at the same time the industry was expanding near many major media markets and was one of the fastest growing employment opportunities, especially for people interested in technical fields. The technology was often in the spotlight, with the opposition claiming grave safety concerns and the industry—rather arrogantly, quite frankly—pointing to what had been a relatively unblemished record.

The industry did not do enough in the way of public outreach or routine advertising to explain the value of their product. They rarely compared the characteristics of nuclear energy against other possible electricity sources—mainly because there are no purely nuclear companies. In addition, the electric utility industry has a long tradition of preferring to be quiet and left alone.

The accident at TMI developed slowly over several days, but it became a major news story by mid-morning on the first day. Not only was it a “man bites dog” unusual event, but it was an event that the nuclear industry, the general public, the government, and the news media had been conditioned to take very seriously. Although nuclear experts from around the United States sprang into action to assist where they could at the plant itself, there was no established group of communications experts who could help reporters understand what was happening.

No reporter on a deadline is motivated or willing to wait for information to be gathered, evaluated, and verified. In the absence of real experts willing to talk, they turned to activists with impressive sounding credentials who were quite willing to speculate and spin tall tales designed to generate public interest and concern.

Lesson not yet learned: Although most decision makers in the nuclear industry understand the importance of planned maintenance systems to keep their equipment in top condition and the importance of a systematic approach to training to keep their employees performing at the top of their game, they have not yet implemented an effective, adequately resourced, planned communications program that helps to ensure that the public and the media understand the importance of a strong nuclear energy sector.

Planned communications efforts have a lot in common with planned maintenance systems. They might appear to be expensive with little immediate return on investment, but repairing a broken public image is almost as challenging and expensive as repairing a major plant component that failed due to a decision to reuse a gasket or postpone an oil change. As the guy in the commercial says, “You can pay me now or pay me later.”

That is probably the most tragic part of the TMI event. Despite being the subject of several expensively researched and documented studies, countless articles, thousands of documented training events, and more than a handful of books, the event could have—and should have—made the established nuclear industry stronger and the electric power generation system around the world cleaner and safer.

So far, however, TMI Unit 2’s destruction remains a sacrifice made partially in vain to the harsh master of human experience.

Note: I have purposely decided to avoid attempting to discuss the performance of the NRC or to judge their implementation of the lessons that were available to be learned. That effort would require a post at least twice as long as this one.

Additional Reading

General Public Utilities. Three Mile Island: One Year Later. March 28, 1980.

Gray, Mike, and Ira Rosen. The Warning: Accident at Three Mile Island, a Nuclear Omen for the Age of Terror. W. W. Norton, 1982.

Ford, Daniel. Three Mile Island: Thirty Minutes to Meltdown. Penguin Books, 1981.

Hampton, Wilborn. Meltdown: A Race Against Disaster at Three Mile Island, a Reporter's Story. Candlewick Press, 2001.

Report of the President's Commission on the Accident at Three Mile Island: The Need for Change: The Legacy of TMI. October 1979.

Three Mile Island: A Report to the Commissioners and to the Public. January 1980.

A slightly different version was originally published on March 24, 2014 at ANS Nuclear Cafe as What Did We Learn From Three Mile Island?. The only things changed here were the first paragraph and the placement of the TMI photo.

8 Responses to “Some lessons were learned from TMI. Others were not.”

  1. Robert Hargraves says:

    Have DOE and the NRC not yet learned the importance of timely, transparent communications about accidents? The recent failure to communicate facts about the “puff” at the Waste Isolation Pilot Plant allowed press speculation to abound.

  2. FermiAged says:

    My first involvement in nuclear power plant simulators came in the mid-1980s. Even then, the simulators were barely adequate but rapidly getting better. Going solid or drawing a bubble realistically in the pressurizer was not an easy task for a simulator model. Forget about draining the RCS or mid-loop operations.

    It must have been even worse in the 1970s. The state of the art is a lot better now.

    • cyril r says:

      New plant offerings have a superior approach. They have all the operator training and so on, but they also have automatic or even passive systems as a backup in case operators do make multiple mistakes. The use of special valves in the ESBWR, for example, means that the reactor will be fed with coolant even if the operators close down coolant valves and think there’s plenty of coolant around. The system can’t be overridden, and the type of valve is such that it requires no maintenance at all over the 60-year plant lifetime.

      This takes all operational human factors out of the equation.

      In my opinion we should not rely on human actions for nuclear plant safety. If we are to power the world with nuclear, we need to think about 10,000+ nuclear plants in almost every country in the world. Consider just the cultural differences as one example – some countries have a generally haphazard approach to safety as part of their culture.

      Designing in safety is much easier than teaching humans to behave flawlessly in hundreds of countries around the world.

  3. cyril r says:

    TMI had another important – and largely unlearned – lesson. The safety systems worked well enough that, even in the face of human error and a confusing (I would say flawed) plant control/electrical design, there was not a single casualty. The containment in particular worked well. Even by the silly LNT standard, only around 1 person would have died from radiation. By any scientific standard, it’s zero.

    Despite this important observation, nobody in charge seems to have learned this lesson. Instead, whenever there’s a problem at a nuclear plant, people are evacuated, traffic accidents kill many, people aren’t allowed to return to their homes, and some commit suicide (Fukushima).

    It is even sadder when you realize that there was another learning opportunity: Chernobyl. The evacuation and permanent displacement over a few millisieverts caused many times more deaths, damage, and economic harm than the radiation dose itself.

    Fukushima was just a repeat of history. No lesson learned. Just mindless evacuation of everybody in a 20 mile radius, without considering dose rates, and not allowing return (permanent displacement) even after 3 years now.

    Because of this mindless evacuation approach, not only has greater harm been done than without any evacuation at all, but also nuclear energy is considered very dangerous.

    The WHO recently estimated the death toll from fossil fuel and biomass burning to be 7 million per year. This is amazing news, and yet it got sparse mainstream media coverage. Instead we see so-called “environmentalists” suggesting we increase usage of biomass, even though biomass kills millions of people per year. And it’s not just old people; children are also greatly affected. And as if that weren’t bad enough, the “environmentalists” suggest we use this killer biomass to displace clean and safe nuclear plants.

    Whenever there’s a nuclear accident, people go out on the street crying for the end of the nuclear industry.

    But when we hear from the WHO that 7 million people are killed every year by fossil fuels and biomass, nobody goes out on the street. When there’s an aircraft crash, which happens about every month, nobody is out on the street calling for the end of the aircraft industry. When Volvo, a major car maker, publishes a report that 2 million people are killed by traffic accidents around the world, nobody is out on the street calling for an end to motor-powered transport. When there’s a major bus accident killing dozens of children, which seems really dramatic, yet again nobody is out on the street calling for the death of the bus industry.

    This nuclear exceptionalism is debilitating.

    • simple touriste says:

      “The WHO recently estimated the yearly death toll from fossil fuel and biomass burning to be 7 million per year.”

      Is that a LNT extrapolation?

  4. SolarEyes says:

    Stanford Plan for 100% Wind, Water & Solar (WWS) for all energy purposes (heating, cooling, transportation, industry) by 2030-2050

    • Engineer-Poet says:

      Jacobson glosses over the massive difficulties of space- and time-shifting the unreliable power flows of wind and solar to where and when they are most needed.  Among other things, the energy cost of the storage required to convert an average watt of PV into a watt of firm capacity lowers the EROI to barely above 2.