
Limitations of Managing Safety by Numbers

25 Feb 2016, 14:00
30m
Boardroom B/M1 (IAEA HQ)

Vienna International Centre, Vienna, AUSTRIA

Speaker

Diana Engström (Sweden)

Synopsis

Work, especially in complex, dynamic workplaces, often requires subtle, local judgment about the timing of subtasks, relevance, importance, prioritization and so forth. Still, people in the nuclear industry seem to believe that safety results from error counts and from people simply following procedures. In the wake of failure it can be tempting to introduce new procedures and an even stricter “rule-following culture”, while little or no attention is given to tacit knowledge and individual skills. I aim to highlight the inadequacy of placing too much trust in formalization and in the assumption that reporting and trending of events will contribute to increased learning, increased nuclear safety and efficient use of operational experience. The ability to interpret a concrete situation depends on proven experience in similar situations, analogical thinking and tacit knowledge. I intend to problematize the introduction and use of the so-called Corrective Action Program (CAP) and the computerized reporting systems linked to CAP in the nuclear industry. Categorization and trending in computerized reporting systems are based only on the direct or triggering cause, not on any deeper analysis, so the question we have to ask ourselves is what the trends are really telling us, if anything at all.

During my master’s studies I began to realize that the whole industry, from regulators to licensees, seems to be stuck in the idea that the scientific perspective on knowledge is the only “true” perspective. This leads to an exaggerated belief that technology and formalized work processes and routines will create a safer business. The computerized reporting system is costly but will not, as originally intended, contribute to increased nuclear safety, since the reports are based on the triggering event rather than on underlying causes and in-depth analysis. Managing safety by numbers (incidents, error counts, safety threats and safety culture indicators) is very practical but has its limitations. Error counts uphold an illusion of rationality and control, but may offer neither real insight nor productive routes for progress on safety.

The question is why CAP, error counts and computerized reporting systems have had such a big impact in the nuclear industry, when they rest, after all, on such weak foundations. Part of the answer is that the scientific perspective on knowledge is the dominant one. What people do not seem to understand is that excessive use of computerized systems and increased formalization will actually create new risks, as people lose their skills and their ability to reflect, and place more trust in the system than in themselves.

This does not mean that people should stop reporting altogether; it means that organizations using these kinds of computerized reporting systems need to understand the limitations of the system and of the trending. Putting 5,000 to 10,000 reports into a system every year and trending them might, at best, help an organization discover concrete problems, but it will not help it discover the latent organizational weaknesses that will eventually lead to a severe nuclear accident. I fear that the nuclear industry puts too much trust in these reporting systems and trends, and that this may make organizations “blind” to the real threats to nuclear safety.

Country or International Agency: Sweden
Type "YES" to confirm submission of required Forms A and B via the official channels YES

Primary author

Diana Engström (Sweden)

Presentation materials