David Boettcher - Professional Engineer

Eur Ing D B Boettcher BSc(Hons) CEng MIET

Providing innovative solutions to engineering and business problems and opportunities.


Bad design of Man-Machine Interface (MMI)

One of the areas I became interested in and involved with during my time in nuclear engineering was the consideration of human factors and human performance in the design process, sometimes called ‘ergonomics’ or engineering psychology. Ergonomics should influence and guide the design of the points at which humans and machines meet and interact: the ‘Man-Machine Interface’ or MMI.

My interest was particularly in the design of nuclear reactor control rooms, but the techniques involved can be applied anywhere a human being has to interact with a machine. Good ergonomics makes it easier for humans to understand what is happening and to make decisions about what actions to take.

Too often in the past, designers neglected human performance, assuming that automatic systems would take care of situations and that the human operating staff were merely observers rather than part of the machine. As a result, designers did not give adequate consideration to the information the operators needed to understand and react to developing situations, and to take control after the automatic systems had responded to the initial event.

Two very significant nuclear accidents illustrate how lack of consideration of ergonomics and human performance factors resulted in disaster. Both of these accidents could have been avoided by very simple changes to the MMI, which were lessons that I and the team I was part of took very good note of during our work.

Three Mile Island

The accident at the Three Mile Island nuclear power station in Pennsylvania was a prime example of lack of attention to providing the operators with clear and unambiguous information. The operators were given misleading information and consequently they misunderstood what was happening in the reactor coolant circuit. As a result a minor event was turned into a major disaster by the operators taking the wrong actions.

The source of the misleading information was an indicator lamp in the control room that was wired to show whether power was being supplied to the pressuriser relief valve, a valve that was opened automatically to relieve excess pressure in the nuclear reactor's coolant system. The valve was designed to open when electrical power was applied to it, and close mechanically when the power was removed. The lamp was intended to show whether the pressuriser relief valve, an extremely important valve, was open or closed, and the presence or absence of electrical power was used as a proxy for this. This was a very bad design decision, but it was a few dollars cheaper than adding position switches that would read the actual position of the valve.

On the day of the accident a routine maintenance activity went wrong and caused the reactor to trip or scram, to be automatically shut down. This caused a temporary surge in pressure in the reactor coolant circuit, and the pressuriser relief valve opened automatically to relieve excess pressure. When the pressure in the reactor had dropped enough, power was automatically removed, the lamp in the control room went out, and the valve should have closed. But this time the valve stuck open and the pressure kept on dropping.

The control room operators knew that something was wrong, they were being bombarded with alarms and could see from pressure gauges that the reactor coolant pressure was falling rapidly. This was a great concern to them because it meant that coolant was being lost from the reactor. If the pressure continued to fall the nuclear fuel could overheat.

One of the first things they did was check whether the pressuriser relief valve was open or closed. When they saw that the indicator lamp was dark they assumed that the valve was closed. But the pressure in the reactor coolant system kept on dropping. Coolant injection pumps were automatically started to add coolant to the reactor, which caused the coolant level in the pressuriser to rise. The operators had been taught to never allow the pressuriser to become full of coolant because it could then no longer control the pressure in the reactor, so they switched off the pumps and the pressure in the reactor continued to fall.

Because they were being fed incorrect information, the operators became confused and locked into a ‘mind set’. They assumed that they knew what was happening and were unable to stand back and reappraise the situation when their actions only made things worse. They failed to work out that the coolant leak was from the pressuriser itself, and so they could not stop it.

The leak continued for several hours until a relief shift arrived and realised what was happening. A temperature gauge on the pressuriser relief valve tail pipe, a pipe that carried discharged coolant to a tank, was showing a high temperature. This indicated that hot coolant was flowing down the pipe even though the relief valve was supposedly shut. The relief shift closed a second valve in the pipe and the loss of coolant was stopped. But by then the reactor's core had severely overheated and partly melted; radioactive material was leaking from damaged fuel rods.

Although better display of the pressuriser relief valve tailpipe temperature would have helped considerably, the root cause of the accident was the use of the presence or absence of power to the pressuriser relief valve as a ‘proxy’ for the open or closed state of the valve. For an important valve like this, using a proxy indication is totally unacceptable; detection and display of the actual valve position, e.g. using limit switches on the stem, is essential.
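The difference between a proxy indication and direct position feedback can be sketched in a few lines of code. This is a hypothetical illustration, not the actual TMI control logic; the class and function names are invented for the example.

```python
# Hypothetical sketch: a proxy indication (lamp wired to the valve's power
# supply) versus direct indication (limit switches on the valve stem).
from dataclasses import dataclass

@dataclass
class ReliefValve:
    power_applied: bool = False   # solenoid energised -> valve commanded open
    stuck_open: bool = False      # mechanical fault: valve fails to reseat

    @property
    def actually_open(self) -> bool:
        # The valve is open if it is powered, or if it has stuck open.
        return self.power_applied or self.stuck_open

def proxy_lamp(valve: ReliefValve) -> str:
    # TMI-style indication: the lamp only shows whether power is applied.
    return "OPEN" if valve.power_applied else "CLOSED"

def limit_switch_lamp(valve: ReliefValve) -> str:
    # Direct indication: limit switches report the stem's real position.
    return "OPEN" if valve.actually_open else "CLOSED"

valve = ReliefValve(power_applied=False, stuck_open=True)
print(proxy_lamp(valve))         # CLOSED <- misleading, as at Three Mile Island
print(limit_switch_lamp(valve))  # OPEN   <- what the operators needed to see
```

With the power removed but the valve stuck open, the proxy lamp reads ‘CLOSED’ while the limit switches would have shown ‘OPEN’, which is precisely the disagreement that misled the TMI operators.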


Chernobyl

The accident at Chernobyl in the Ukraine was another event that could have been prevented by design that properly took account of human factors. The designers knew that the reactor would become unstable if it was operated in a particular way, and provided information systems in the control room to warn the operators of this. However, bad design of the safety systems and poor operator training resulted in disaster.

Ironically, it was during a test to find ways to improve the safety of the reactor that the operators took it into an unstable operating condition. In order to perform the test they had disabled safety systems and brought the reactor to conditions of temperature, pressure and reactivity that were outside the safe operating parameters established by the designers. When they attempted to shut down the reactor, the graphite tips of the slow-moving control rods caused the reactor to go supercritical, producing an uncontrolled power surge that blew the reactor, the attached safety systems, and the building that housed the reactor, to pieces.

Nothing had been designed to cope with a catastrophe of this magnitude. Many of the operators and emergency workers at the plant received lethal doses of radiation, and the only thing that could be done was to dump tons of sand onto the exposed reactor and then, when things had cooled down a little, build a tomb, the ‘sarcophagus’, around the smouldering remains.

A better reactor protection system that prevented the reactor from being operated in an unsafe way would have helped, but the root causes were the poor nuclear and thermodynamic design of the reactor and poor operator training. The operators thought that they were doing the right thing, but their lack of understanding of the reactor's characteristics allowed them to operate it in an unsafe way, something the design should never have permitted in the first place, and something properly trained operators would have known to avoid.

Prevent Accidents, Improve Performance, Increase Productivity

The accidents at Three Mile Island and Chernobyl could have been avoided if proper attention had been paid to human factors and human performance during the design of those two plants and in training the operating staff.

But it doesn't have to be a nuclear power plant for these problems to occur; many accidents and simple errors happen every day due to a lack of consideration of human factors and human performance during the design process, or during plant modifications.

Consideration of human performance and human factors during design and training can improve the performance of human beings everywhere they are involved, from using a screwdriver to operating a nuclear power station; in fact, everywhere that someone is required to make a decision or take an action. Where safety is involved, it is vital that this is done properly all the way through the design process in a structured and audited way, through training the people who are to use the system, and with regular monitoring and reviews during the life of the system.

Low Colour Distinction

Usually incorrectly called ‘colour blindness’, low colour distinction means difficulty in distinguishing shades of certain colours, most commonly reds and greens. An excellent page about this is Tips for designing scientific figures for color blind readers. Obviously, easily confused colours should be avoided. One way that this was resolved on an MMI I worked on was to make symbols different shapes as well as different colours, e.g. a status indicator on a computer display was green and round in normal operation, changing to red and triangular in an alarm condition.

Copyright © David Boettcher, 2006 - 2024 all rights reserved. Please feel free to contact me via the Contact me page.

This page updated April 2019.