A guest column by the American Society of Anesthesiologists, exclusive to KevinMD.
A decade of working in quality and patient safety has taught me a painful lesson: Don’t say “quality” or “patient safety” to frontline health care workers. Too often, they become immediately defensive or evasive. How did q#^l&ty and s^f*ty become four-letter words? A key lies in understanding the safety model that medicine has embraced.
In 1978, the British statistician George Box wrote, “All models are wrong, but some are useful.” When tackling complex problems, we rely on mental models to simplify them. These models rest on assumptions and simplifications that usually do not significantly degrade their usefulness. Occasionally, though, a model’s shortcomings become the very problem that needs solving.
My college chemistry professor taught me to watch for a model’s inaccuracies. I had previously studied the Bohr atomic model, in which electrons circle the nucleus in fixed orbits. One simply had to find the proper elemental balance that filled those orbits to solve atomic equations. How hard could chemistry be? My naiveté was exposed when the professor wrote a simple chemical equation on the board. Despite my best efforts, I could not balance it. The professor explained that the Bohr model would never reach the solution; it was an inadequate model. He then introduced the quantum electron cloud model.
The health care industry is struggling with its own version of the Bohr model: the Bad Apple Theory. As described by Sidney Dekker in The Field Guide to Understanding “Human Error,” the Bad Apple Theory assumes 1) Complex systems are basically safe and 2) These systems need to be protected from unreliable people. According to the Bad Apple Theory, a system’s main failure point is human error. Proponents argue that the health care system is effectively defined by its policies, and that patient safety events could be avoided if people would stop going off-script.
Under the Bad Apple Theory, system failures are explained by identifying human deviation. Underlying motives and circumstances may be cursorily explored, but the final cause is the individual’s failure to follow policy, protocol, or guidelines. This conclusion can be reached even in the absence of plausible causation. When the focus is human deviation from expected behavior, solutions lean toward worker education, training, certification, two-person verifications, or more rules and policies. Sound familiar? A 2018 study identified training, education, and policy development or revision as the most frequent root cause analysis interventions. Not surprisingly, all of these interventions are considered “weaker actions” in the National Patient Safety Foundation (NPSF) RCA2 classification.
Numerous other signs suggest that a health care system is built on the Bad Apple Theory. Is “human error” the most common root cause identified by safety investigations? Has the name of the safety reporting system been adopted as a verb meaning “reporting a troublesome colleague”? Do frontline staff become nervous and defensive during safety investigations? In a culture where human deviation is the primary culprit, human correction becomes the primary intervention.
Safety experts have developed a more advanced model. In 1947, Army psychologists Paul Fitts and Richard Jones reviewed hundreds of aviation mishaps attributed to “human error.” Their report found that most “pilot errors” should have been attributed to deeper, systemic issues. In the B-17 “Flying Fortress” bomber, for example, the flap and landing gear controls were identical-style switches located next to one another. Pilots would accidentally retract the landing gear instead of the flaps during the crucial landing sequence. For years this was attributed to “pilot error,” yet these mishaps all but disappeared once the cockpit design was modified.
Fitts’ report captures what would later emerge as the Human Factors Theory: “It should be possible to eliminate a large proportion of so-called ‘pilot-error’ accidents by designing equipment in accordance with human requirements.” Human Factors Theory seeks to identify and fix the underlying dynamics that lead to “human errors.” Sidney Dekker explained, “I no longer approach investigations as a search to find where people went wrong. Instead, I seek to understand why their assessments and actions made sense to them at the time.”
Under the Human Factors Theory, failures are not satisfactorily explained by demonstrating human deviation from expected behavior. Instead, the circumstances and underlying pressures are meticulously explored, and systemic deficiencies are identified. Interventions lean toward design improvements, engineering controls, or process simplification and standardization. Staffing and workload balance may be adjusted, or software enhancements pursued. These interventions are considered “intermediate” and “stronger actions” in the NPSF RCA2 classification.
In contrast to the Bad Apple Theory, the Human Factors Theory assumes 1) Complex systems are not fundamentally safe and 2) People create safety while negotiating multiple system goals. In other words, complex systems are filled with vulnerabilities, competing demands, and latent failure points, and they constantly evolve. Despite attempts to codify operations, vulnerabilities inevitably align and catastrophic failure looms. Under these circumstances, people’s ability to perceive the situation and adapt is the primary source of safety. Indeed, the ability to deviate from expected behavior has recently been recognized as a major source of safety in complex systems, a paradigm referred to as “Safety-II.”
In a culture where systemic vulnerabilities are the primary culprit, system correction, not human correction, becomes the primary intervention. People act as the final safety net when the system inevitably threatens to fail. Imagine working in a health care system that views its workers as the solution, not the problem. Perhaps then we could start saying “quality” and “patient safety” again.
Disclaimer: The views expressed in this article are those of the author and do not necessarily reflect the official policy of the Department of Defense or the U.S. Government. The appearance of non-U.S. Government web links does not constitute endorsement by the Department of Defense (DoD) of the websites nor the information, products, or services contained therein. Such websites are provided consistent with the purpose of this publication.
R. Christopher Call, Michael O’Connor, and Keith Ruskin are anesthesiologists.