Skill - Improve Your Understanding of Human Error (Part 3)

Summary: This is part three in my series of posts on human error. This post is again based heavily on Chapter 3 of “To Err is Human” by the Institute of Medicine. This post reviews some ideas on the nature of safety, observes that some types of complex systems are more prone to error/failure than others, and introduces the term “naturalistic decision making.”

Introduction

Normal Accident Theory (NAT) was developed in the aftermath of the accident at the Three Mile Island nuclear power plant in 1979. Charles Perrow defined two related system dimensions (think of them as axes on a graph or sides of a 2 x 2 box)—interactive complexity and loose/tight coupling—that he claimed together determine a system's susceptibility to accidents.

Understanding the Nature of Safety

Charles Perrow's NAT concludes that accidents are inevitable in certain systems. Although they may be rare, accidents cannot be completely prevented, making them "normal" in complex, high-technology industries. In contrast to studying the causes of accidents and errors, other researchers have focused on the characteristics that make certain industries, such as aircraft carriers or chemical processing, highly reliable.22 High reliability theory holds that accidents can be prevented through good organizational design and management.23 Characteristics of highly reliable industries include an organizational commitment to safety, high levels of redundancy in personnel and safety measures, and a strong organizational culture for continuous learning and willingness to change.24 Correct performance and error can be viewed as "two sides of the same coin."25 Although accidents may occur, systems can be designed to be safer so that accidents are very rare.

Are Some Types of Systems More Prone to Accidents?

Accidents are more likely to happen in certain types of systems. When they do occur, they represent failures in the way systems are designed. The primary objective of systems design ought to be to make it difficult for accidents and errors to occur and to minimize the damage when they do.30

Perrow characterizes systems according to two important dimensions: complexity and tight or loose coupling. Systems that are more complex and tightly coupled are more prone to accidents and have to be made more reliable.32 In Reason's words, complex and tightly coupled systems can "spring nasty surprises."33

In complex systems, one component of the system can interact with multiple other components, sometimes in unexpected or invisible ways. Although all systems have many parts that interact, the problem arises when one part serves multiple functions: if this part fails, all of the dependent functions fail as well. Complex systems are characterized by specialization and interdependency. They also tend to have multiple feedback loops and to receive information indirectly, and because of specialization, there is little chance of substituting or reassigning personnel or other resources.

In contrast to complex systems, linear systems contain interactions that are expected in the usual and familiar production sequence. One component of the system interacts with the component immediately preceding it in the production process and the component following it. Linear systems tend to have segregated subsystems, few feedback loops, and easy substitutions (less specialization).

Complex, tightly coupled systems have to be made more reliable.39 One of the advantages of having systems is that it is possible to build in more defenses against failure. Systems that are more complex and tightly coupled, and therefore more prone to accidents, can reduce the likelihood of accidents by simplifying and standardizing processes, building in redundancy, developing backup systems, and so forth.
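Perrow's two dimensions can be thought of as a simple classification. The sketch below is illustrative only; the function and quadrant labels are my own shorthand for the framework, not terminology from the source.

```python
# Illustrative sketch of Perrow's 2 x 2: interactive complexity (linear vs.
# complex) on one axis, coupling (loose vs. tight) on the other. Normal
# Accident Theory flags the complex + tightly coupled quadrant as the one
# where accidents are "normal." Names here are my own, not Perrow's.

def accident_proneness(complex_interactions: bool, tightly_coupled: bool) -> str:
    """Return a rough label for a system's quadrant in Perrow's 2 x 2."""
    if complex_interactions and tightly_coupled:
        return "most accident-prone (e.g., nuclear power plants)"
    if complex_interactions:
        return "complex but loosely coupled"
    if tightly_coupled:
        return "linear but tightly coupled"
    return "linear and loosely coupled (least accident-prone)"

# A nuclear plant sits in the complex, tightly coupled quadrant:
print(accident_proneness(complex_interactions=True, tightly_coupled=True))
```

The point of the sketch is that proneness to accidents falls out of where a system sits on the two axes, not from any single component.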

Another aspect of making systems more reliable has to do with organizational design and team performance.

Conditions That Breed Errors

Factors can intervene between the design of a system and the production process that create conditions in which errors are more likely to happen. James Reason refers to these factors as psychological precursors or preconditions.40 Although good managerial decisions are required for safe and efficient production, they are not sufficient. There is also a need for the right equipment, well maintained and reliable; a skilled and knowledgeable workforce; reasonable work schedules; well-designed jobs; and clear guidance on desired and undesired performance. Factors such as these are the precursors or preconditions for safe production processes.

Research on Human Factors

Human factors research applies knowledge about human strengths and limitations to the design of interactive systems of people, equipment, and their environment to ensure their effectiveness, safety, and ease of use.

Research in the area of human factors is just beginning to be applied to health care. It borrows from the disciplines of industrial engineering and psychology. Human factors is defined as the study of the interrelationships between humans, the tools they use, and the environment in which they live and work.51

A human factors approach is used to understand where and why systems or processes break down. This approach examines the process of error, looking at the causes, circumstances, conditions, associated procedures and devices, and other factors connected with the event. Studying human performance can result in the creation of safer systems and the reduction of conditions that lead to errors. However, not all errors are related to human factors. Although the design of equipment and materials should take into account the way people use them, a human factors approach may not resolve instances of equipment breakdown or material failure.

Two approaches have typically been used in human factors analysis. The first is critical incident analysis. Critical incident analysis examines a significant or pivotal occurrence to understand where the system broke down, why the incident occurred, and the circumstances surrounding the incident.53 Analyzing critical incidents, whether or not the event actually leads to a bad outcome, provides an understanding of the conditions that produced an actual error or the risk of error and contributing factors.

Another analytic approach is referred to as "naturalistic decision making."54 This approach examines the way people make decisions in their natural work settings. It considers all of the factors that are typically controlled for in a laboratory-type evaluation, such as time pressure, noise and other distractions, insufficient information, and competing goals. In this method, the researcher goes out with workers in various fields, such as firefighters or nurses, observes them in practice, and then walks them through the events afterward to reconstruct various incidents. The analysis uncovers the factors weighed and the processes used in making decisions when faced with ambiguous information under time pressure.

For Further Reading

Sources of Power by Gary Klein

Naturalistic Decision Making (Expertise: Research and Applications Series) by Caroline E. Zsambok and Gary Klein

Making Decisions Under Stress: Implications for Individual & Team Training by Janis A. Cannon-Bowers and Eduardo Salas (1998)