High Reliability Healthcare: Applying CRM to High-Performing Teams

In this series, Steve Kreiser describes a model for applying aviation’s crew resource management (CRM) to healthcare. The model incorporates the elements common to most CRM programs and adds simple error prevention tools and techniques that help reduce human error. These seven tools, essentially a “people bundle” for making humans more reliable, help individuals make fewer errors while encouraging teams to catch and trap the errors that still occur in complex systems. The series will continue on Tuesdays and Thursdays through Jan. 12.

In 2006, Lauren Wargo, a 19-year-old from Shaker Heights, Ohio, went to an outpatient surgical center where a plastic surgeon was going to remove a mole from her eyebrow. The oxygen used during her surgery and an electrical device used to seal blood vessels combined to create a flash flame that left her face, neck, and ear badly burned. Four years later, the 23-year-old still has to wear make-up to cover the scars on her face and is unable to completely close one eyelid.1

In court the doctor testified that he turned on the electrical device after announcing that he was about to do so – and after he thought the anesthesiologist assistant had turned off the oxygen Wargo was receiving through a face mask. The assistant testified that she never heard the doctor say he was turning on the device. If she had, she told the court, she would have repeated the statement to the doctor and would have turned the oxygen off.

Unfortunately, this is not an isolated event. Ten years after the Institute of Medicine (IOM) estimated that medical errors cause 98,000 deaths per year, an epidemic of preventable harm in healthcare continues. A 2009 review found that preventable medical harm still accounts for more than 100,000 deaths per year.2

How do these events happen? They are usually not the result of a single error but rather a series of human errors and system failures, as described by James Reason’s Swiss Cheese Model3 (Figure 1). Complex, high-risk organizations design their systems so that no single human error or system failure can result in an accident or event of harm. They put in defensive barriers – technology, processes, policies, or other people – to catch or block errors before they end in tragedy. The use of other people to catch and trap human error is the essence of a concept that came out of the airline industry in the 1970s: crew resource management (CRM).

Figure 1. James Reason’s Swiss Cheese Model.
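A rough illustration of why layered defenses matter, under the simplifying assumption that barriers fail independently: if each of three barriers misses a given error 1 time in 10, the chance that the error slips past all three is

0.10 × 0.10 × 0.10 = 0.001, or about 1 in 1,000

versus 1 in 10 for any single barrier alone. Real barriers are never fully independent, which is why the model’s “holes” can occasionally line up and let an error through.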

CRM Background
The history of CRM dates back to the 1960s and 70s. In response to a number of tragic airline disasters, NASA convened a workshop in 1979 to examine the underlying causal factors in an attempt to improve air safety. Data presented there showed that in the majority of these crashes, the aircrews should have been able to overcome minor mechanical malfunctions, weather-related challenges, or other unforeseen issues. Instead, repeated failures in interpersonal communication, decision-making, and leadership were identified as the primary cause in over 70% of the accidents. It was at this workshop that the label crew resource management (originally cockpit resource management) was first applied to training programs designed to make better use of the human resources available in the cockpit. Dr. John Lauber, a member of the National Transportation Safety Board (NTSB), succinctly defined the goal of CRM at the time: “To use all available resources – information, equipment and people – to achieve safe and efficient flight operations.”4

Crew Resource Management as it applies to healthcare is essentially the use of a team approach to trap errors before they reach a patient. In this respect, CRM is a set of countermeasures designed to avoid the majority of errors, trap them when they inevitably occur, and – when they do get through a system’s defenses – mitigate the consequences with a rapid response.5 The key to effective CRM programs, then, is a commitment to avoiding errors through the consistent use of error prevention tools and techniques by individuals throughout an organization. When errors do occur, relying on teams to cross-check and coach one another is the best way to trap those errors before they reach the patient.

What is the best way to put these two concepts together in a busy, fast-paced healthcare environment? One answer can be found by looking at how both the airline industry and U.S. Naval Aviation have developed training programs to enhance information sharing while avoiding errors.

At United Airlines, one of the first commercial airlines to develop a CRM program after the 1979 NASA workshop, pilots focus on command, leadership, and resource management (C/L/R). This approach employs seven different elements to detect threats to safety, avoid errors, and effectively manage a team. The U.S. Navy refers to its CRM program as aircrew coordination training (ACT), with the stated goals of increasing mission effectiveness, minimizing preventable errors, maximizing crew coordination, and optimizing risk management. It, too, uses seven similar elements.

With this in mind, a model applicable to healthcare has been developed using these two programs as a reference. This model incorporates the elements common to most CRM programs and adds simple error prevention tools and techniques that help reduce human error. These tools, essentially a “people bundle” for making humans more reliable, help individuals make fewer errors while encouraging teams to catch and trap the errors that still occur in complex systems.

Figure 2 shows the seven elements as they apply to the environment of care. The elements start with leadership and move clockwise around the circle to enhance information sharing, all intended to build better situational awareness and decision-making across the entire team.

Figure 2. The seven elements of CRM as they apply to the environment of care.

This article is the first in a series by Steve Kreiser. In subsequent articles, he will describe each of the seven elements in the model. Watch for the next post in this series, Element #1 – Leadership, on Thursday, Dec. 22.

Steve Kreiser is a consultant with Healthcare Performance Improvement (HPI). Previously, Kreiser was an F/A-18 pilot with more than 21 years of experience in the U.S. Navy and a first officer for a major airline, where he worked extensively in the area of crew resource management. Mr. Kreiser can be contacted at steve@hpiresults.com.

References

  1. The Cleveland Plain Dealer. (2010, May 23). Burn victim hopes her story calls attention to dangers of surgical fires.
  2. Consumers Union. (2009, May). To err is human, to delay is deadly. The authors adopted the 100,000 annual estimate as the absolute minimum lower boundary of deaths due to medical harm in hospitals in the United States. This includes 99,000 annual deaths from hospital-acquired infections estimated by the CDC plus 2,039 deaths among Medicare patients alone from “accidental puncture or laceration.”
  3. In Managing the Risk of Organizational Accidents (1997), James Reason first depicted the “Swiss cheese model” to describe the role of active errors and latent system weaknesses in organizational accidents.
  4. Cooper, G. E., White, M. D., & Lauber, J. K. (1980). Resource management on the flightdeck: Proceedings of a NASA/industry workshop. (NASA CP-2120). Moffett Field, CA: NASA-Ames Research Center.
  5. Helmreich, R. L., Merritt, A. C., & Wilhelm, J. A. (1999). The evolution of crew resource management training in commercial aviation. International Journal of Aviation Psychology, 9(1), 19-32.