Doctors are intelligent people, but are we good thinkers? And how should we think?
There are two basic kinds of thinking: analytic and intuitive. (And maybe good and bad, so that’s four.) Within medicine, analytic thinking is perhaps best exemplified by the evidence-based movement, which began in the early 1990s. It was a gilded age, full of promise, and bolstered by the reality that computers would give physicians instant access to the most thoroughly researched standards of care. Within our specialty of internal medicine, we watched the sacred texts of medical wisdom — Cecil, Harrison’s, and Scientific American — get leapfrogged by electronic medical resources like UpToDate.
“A new paradigm for medical practice is emerging. Evidence-based medicine de-emphasizes intuition, unsystematic clinical experience, and pathophysiologic rationale as sufficient grounds for clinical decision making and stresses the examination of evidence from clinical research.”
– Journal of the American Medical Association, 1992
The beauty and allure of the new paradigm — practicing evidence-based medicine — was its elegant simplicity: We’ll do what works, and we won’t do what doesn’t work.
This accent on data and analytical thinking was a boon to younger doctors, who were more computer savvy and had little to no clinical experience (intuitive thinking) to fall back on. Older physicians who groused about these changes could be seen as backward Luddites, in love with a romantic and antiquated notion of the doctor-patient relationship. William Osler was dead.
But evidence-based medicine has its problems, including that it’s only as good as the evidence it’s based on, and much of that evidence comes from randomized controlled trials (RCTs). Behold the RCT, The-Dispenser-of-All-Clinical-Knowledge, our most powerful tool in the quest to practice evidence-based medicine, and perfect in so many ways.
And yet imperfect. We study groups, but we treat individuals. On the face of it, when an RCT concludes that a particular drug reduced cardiovascular mortality by 20 percent, we assume — most of us, at least — that the majority of the people in the study benefited from taking the drug, and that most of them had a 20 percent reduction in mortality. But RCTs suffer from a heterogeneity of benefit: often, a small minority of high-risk patients accounts for most of the positive outcomes seen in the trial. Many participants may have had no benefit. Some may even have been harmed.
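A quick back-of-the-envelope illustration shows why the headline number can mislead. The figures below are hypothetical, not drawn from any particular trial: a 20 percent relative risk reduction can coexist with a very small absolute benefit. A minimal sketch in Python:

# Hypothetical numbers for illustration only (not from any specific trial)
control_mortality = 0.05          # 5 percent of untreated patients die
relative_risk_reduction = 0.20    # the headline "20 percent reduction"

treated_mortality = control_mortality * (1 - relative_risk_reduction)
absolute_risk_reduction = control_mortality - treated_mortality
number_needed_to_treat = 1 / absolute_risk_reduction

print(f"Treated mortality:       {treated_mortality:.1%}")        # 4.0%
print(f"Absolute risk reduction: {absolute_risk_reduction:.1%}")  # 1.0%
print(f"Number needed to treat:  {number_needed_to_treat:.0f}")   # 100

In that scenario, roughly one patient in a hundred avoids the outcome; the other ninety-nine take the drug without any measurable benefit from it.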
And it isn’t just this heterogeneity of benefit that hobbles the RCT. A team at the University of Oxford’s Centre for Evidence-Based Medicine recently began monitoring clinical trials for switched outcomes. When they reviewed 67 trials published in the top five medical journals in 2015-2016, they found that 58 were methodologically flawed. Forty percent of the trials had negative outcomes that they chose not to report. On average, each trial quietly included five clinical outcomes that it was not designed to measure.
Of course, it would be ridiculous to suggest that we don’t need data or analytic thinking. But what makes the practice of medicine so unique is that we are attempting to apply objective science to our very human patients. And each patient is his or her own art form — both in their unique physiology and in their unique psychology (that is, how they express their illness).
And so, physicians must also rely on heuristic thinking — from the Greek word heuriskein, meaning “to discover.” A heuristic is a mental shortcut, a way of quickly and intuitively organizing disparate clues into something we can recognize and work with. If we are to embrace heuristic thinking, we’ll have to overcome our blind faith in anything with a p-value of less than 0.05, and our disdain for the idea of “acting on a hunch.” Yes, analytical thinking is deliberative, deductive, and rule-following, but it is not always as “logical” as we suppose it to be, and it is not intellectually superior to intuitive thinking. Rather, the two are intellectually complementary.
Heuristic thinking is not for dummies. It is an unconscious, context-sensitive, associative process that rapidly makes connections. It allows us to make pragmatic decisions, and to conclude that something is wrong even before we know what that something is. It is a highly skilled way of thinking, and it takes practice.
Yes, heuristic thinking can be flawed. Psychologist Daniel Kahneman won a Nobel Prize in economics for his research into how humans think, and he has catalogued the various cognitive biases that intuitive thinking is vulnerable to. The “gambler’s fallacy” demonstrates our tendency to see independent events (“luck”) as being streaky. We feel that five “heads” in a row means a “tails” is overdue, even though each coin toss is independent of the others. Diagnosing your first pheochromocytoma might lead you to think that you’ll never see one again, or it might lead you to overestimate the prevalence of the problem because, having finally seen a case, the diagnosis now seems more “real” and therefore more probable in your mind.
We all want to be good at what we do, and yet the pace of discovery in medical science continues to quicken. As writer/poet/farmer Wendell Berry has pointed out, “The radii of knowledge have only pushed back—and enlarged—the circumference of our mystery.”
The more we know, the more we realize how much we don’t know. And so as Deepika Mohan, MD, et al. noted in a recent Viewpoint article in JAMA, expert clinicians will have to have “… unparalleled ability to parse complexity and sift through uncertainty.” To do that, we’ll have to embrace both rational/analytic and intuitive/associative thinking. We’ll need our computers and our brains.
“Gosh, it would be awful pleasin’, to reason out the reason, for things I can’t explain,” sang Dorothy’s friend and confidant, the Scarecrow. He wanted what we want: “I’d unravel every riddle, for any individ’al, in trouble or in pain.”
Craig Bowron and Michael Cummings are internal medicine physicians. The article originally appeared in the Journal of the Minneapolis Heart Institute Foundation.