Physicians actually make good pilots. But in 1966 they received a bad rap from a report published by the FAA’s Chief of Aviation Medicine, Dr. Stanley Mohler. Mohler had a point. Physicians were crashing planes at a higher rate than the general aviation population. His report would become a cautionary tale for all of us. It would be three decades before we even began to understand why smart people do stupid things.
Aviation and medicine are highly regulated fields, but in the 1960s both were more of a Wild West show. The death rate for physician pilots in 1966 was four times that of the general pilot population. The rate was so concerning that a follow-up study examined the next six years.
Physician flight deaths trended downward in subsequent years, but the myth that physicians make bad pilots was already cemented into the culture, especially once the studies cited risk-taking attitudes and poor judgment as underlying causes of physician pilot crashes. Mixing limited flight experience with high-performance aircraft didn’t help either. But the myth ended up carrying a broader lesson that had merit: we would later discover the Dunning-Kruger effect.
In 1999, psychologist David Dunning and his graduate student Justin Kruger published research that, in their own words, revealed that “incompetent people … cannot recognize just how incompetent they are.” Now, this isn’t to suggest physician pilots are incompetent. That’s not the point. The point is that we all suffer from cognitive bias. We all believe we are smarter and more capable than we really are.
Smart people are not immune to this phenomenon. In fact, the effect shows that the most competent individuals actually tend to underestimate their ability. But most people (75 percent) increasingly overestimate their ability. The key word here is increasingly: the less skill people have, the more they overestimate it.
Incompetence doesn’t leave individuals empty-headed, disillusioned, or even cautious. Instead, Dunning explained, the incompetent are filled with an inappropriate confidence that feels to them like knowledge.
The result is that people tend to overestimate their skill, fail to recognize their own mistakes or deficits, and are poor judges of genuine skill or expertise in others. Only the most competent tend to underestimate their relative ability, and they are a small minority, not much better at self-assessment.
The effect manifests as a sense of false confidence. When confronted with contrary facts, the tendency is to become defensive, not introspective. This isn’t a mere curiosity of human psychology; the effect has been replicated in other fields and seems to be the default mode of human thought. It is a problem for both aviation and medicine.
It is important to note that the Dunning-Kruger effect is not synonymous with low IQ, though it is often misapplied that way, as if only the least intelligent believe they know more than they do. The reality is that everyone is affected, and those who think it only applies to people with lower IQs are merely demonstrating their own cognitive bias.
Dunning noted that the irony of the effect was that “the knowledge and intelligence that are required to be good at a task are often the same qualities needed to recognize that one is not good at that task — and if one lacks such knowledge and intelligence, one remains ignorant that one is not good at that task.”
What’s worse, we are terrible at assessing the competency of others. In “The Feedback Fallacy” (Harvard Business Review, March-April 2019), Marcus Buckingham and Ashley Goodall note that 40 years of studies have repeatedly shown people lack the objectivity to hold an abstract quality in their heads and then accurately assess someone against it. Our own cognitive biases filter our understanding. This broadens the effect even further: not only are we unable to accurately assess our own competence, we fail to accurately assess others as well.
In his book Deep Survival, originally published in 2003, Laurence Gonzales touched on many of these failures. An emergency medicine colleague recommended the book as an insight into not only why skilled pilots make simple mistakes but also why brilliant physicians make obvious errors. Gonzales cites Plato, who understood that emotion trumps reason and that to succeed we must use the reins of reason on the horse of emotion.
Reason and critical thinking seem to be the remedy for this disturbing phenomenon we all face. Gonzales described the idea as being “about what you know that you don’t know you know and about what you don’t know that you’d better not think you know.” In other words, none of us is as smart as we believe.
In 2001, Johns Hopkins critical care specialist Peter Pronovost attempted to short-circuit this knowledge deficit with the same technique that saved pilots once planes became too complicated for any human to manage from memory alone. He developed a simple checklist to ensure every step was taken to prevent infections when central lines were placed in patients. The simple checklist saved many lives, just as checklists saved pilots. Had Pronovost developed a drug that did the same thing, he likely would have won a Nobel Prize.
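To make the mechanism concrete, here is a minimal sketch of a checklist acting as a hard gate before a procedure, in the spirit of Pronovost’s approach. The item wording and the helper function are illustrative assumptions for this article, not his actual clinical protocol:

```python
# Illustrative sketch of a checklist as a hard gate before a procedure.
# The items paraphrase the kinds of steps in Pronovost's central line
# checklist; the wording and code are examples, not a clinical protocol.

CENTRAL_LINE_CHECKLIST = [
    "Wash hands with soap",
    "Clean the patient's skin with antiseptic",
    "Drape the entire patient with sterile drapes",
    "Wear a sterile mask, hat, gown, and gloves",
    "Put a sterile dressing over the insertion site",
]

def unconfirmed_steps(checklist, confirmed):
    """Return the steps not yet confirmed; an empty list means proceed."""
    return [step for step in checklist if step not in confirmed]

# Usage: the procedure stays blocked until every step is confirmed.
confirmed = {"Wash hands with soap", "Clean the patient's skin with antiseptic"}
missing = unconfirmed_steps(CENTRAL_LINE_CHECKLIST, confirmed)
if missing:
    print("Do not proceed. Unconfirmed steps:")
    for step in missing:
        print(" -", step)
else:
    print("All steps confirmed. Proceed.")
```

The point is not the code but the pattern: the gate does not trust memory or confidence; it demands explicit confirmation of every step before the work goes forward.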
The checklist was simply a tool. The real hero is the idea that none of us is as smart as we think. Accepting that is the first step in successfully flying a plane or practicing medicine. The next step is learning to apply critical thought and reason to rein in emotion, not the other way around.
Phillip Stephens is chief physician assistant, department of emergency medicine, Southeastern Regional Medical Center, Lumberton, North Carolina. He is the author of Winning Fights: 12 Proven Principles for Winning on the Street, in the Ring, at Life, and can be reached at his self-titled site, Dr. Phillip M. Stephens. Portions of this article are referenced from “Are Doctors Bad Pilots?” by G. Stuart Mendenhall, MD.