About eight years ago I was desperate to improve my golf game. I just couldn’t straighten out my drives or hit my irons crisply. (Yes, I’m fully aware that this is a first-world problem.) I decided to try golf camp in Palm Springs for a few days.
My sensei, a crusty ex-touring pro named Artie McNickle, watched me hit several dozen balls on the driving range, video recorder running. “So, did you figure it out?” I asked with a hint of sarcasm after my last shot. I thought I was a hard case.
“Sure.”
“How long did it take you?” I asked.
“One or two swings. But you looked like you were having a good time, so I didn’t have the heart to stop you.”
Artie patiently told me what I was doing wrong. Though it made sense in theory, when I tried to follow his directions, I didn’t get very far.
“Let’s look at the video,” he offered. “Whose swing do you really admire?”
I named Ernie Els and Tiger Woods, two pros with silky smooth yet powerful swings.
“Fine,” he said. On a video screen, I appeared on the left side. As if by magic, Ernie Els was on the right.
“OK, let’s see where your club is about 18 inches into your backswing.” My hands were low, and the club formed a straight line with my arms; my wrists hadn’t begun to cock. He then showed Els at the exact same point in his swing. His club was facing skyward, forming an acute angle with his wrists. This wasn’t a subtle difference in technique; it was an enormous one.
We then reviewed my follow-through. Once again, about two feet after I’d struck the ball, my club extended straight out from my arms. Els, by contrast, had his wrists swiveled about 45 degrees counterclockwise, his right hand rotated powerfully over his left.
Tiger’s swing was slightly different, but similar in all the ways that mattered.
My trek to Palm Springs shaved 2-3 shots off my score (and they haven’t come back). I’m convinced it was the video that did it.
This kind of video coaching has become standard procedure in major league sports. My son Doug, who works in baseball operations, tells me that every major and Triple-A minor league game is video recorded. Not only do players watch themselves to see what they’re doing right and wrong, but they also watch how other teams play pitchers or hitters who resemble them in style and build.
Yet we hardly ever use this extraordinarily powerful tool in healthcare. Thankfully, that’s beginning to change.
Earlier this year, Johns Hopkins surgeon Marty Makary published a JAMA article entitled “The Power of Video Recording.” It’s a thoughtful and eye-opening piece, well worth a read.
Makary reviews several ways video can be used for peer review, quality improvement, and coaching. I’ve previously described the use of video monitoring for hand hygiene: a study performed at Long Island’s North Shore Hospital found a staggering uptick — from 7 to 82 percent — in hand hygiene performance in ICUs that were monitored with cameras pointed at the sinks and gel dispensers. The key was that someone in Bangalore, India, reviewed the video feed every hour and sent back compliance data, which was posted on an electronic tote board — real-time feedback, either positive or negative. Makary also cites a study in which gastroenterologists whose colonoscopies were video recorded improved their performance by one-third. As Makary observes, the obstacle to doing this is not a technical one: there’s plenty of video equipment in the operating room.
“Procedures ranging from cardiac stent placement to arthroscopic surgery are performed using sophisticated video equipment; however, the record button is turned off.”
Adding to this literature, one of the most impressive health services research studies in recent memory was published last week in the New England Journal of Medicine. In it, John Birkmeyer, a surgeon and researcher at the University of Michigan, described the results of a study in which 20 bariatric surgeons submitted videos that demonstrated their surgical technique. The recordings were rated by several peers on a 1-5 scale, where 1 was the skill expected of a general surgery chief resident who hadn’t yet performed this complex operation, 5 was that of a master bariatric surgeon, and 3 was that of the average bariatric surgeon. I’m not sure why, but I would have naively guessed that — though the recordings might reveal a quirk or two — everybody’s technique would be pretty good, and not terribly dissimilar.
I would have been dead wrong. The reviewers, who were blinded to both surgeon and institution, used a lot of the terrain on the grading scale. Surgeons in the top quartile averaged a 4.4 (on 5 domains, including exposure, flow, and gentleness), while the lowest quartile surgeons had a mean rating of 2.9. The ratings did not correlate with years in practice, fellowship training, or teaching vs. nonteaching hospital.
Instead, they correlated strongly with surgical volume: lowest quartile surgeons averaged 106 bariatric procedures in the prior year, while highest quartile performers averaged 241. (Of course, this doesn’t answer the age-old chicken vs. egg question of whether better performers get more cases, or more cases make better performers. But it does support the use of volume as a proxy for quality, at least until video is more readily available.)
Here’s the amazing thing: after adjustment for any patient differences that might have influenced outcomes, surgeons whose technique was rated in the top quartile had far better outcomes than those in the lowest quartile. The better technicians got through their cases more quickly (98 vs. 137 minutes) and had lower infection rates and lower overall complication rates. Their patients required readmission, a return to the ER, or reoperation less than half as often as the patients of their less skilled colleagues. Finally, their mortality rate was one-fifth as high (0.05% vs. 0.26%). All of these differences were statistically significant.
Importantly, all of the participating surgeons were volunteers, and the videos were selected by the surgeons themselves, which means the variation observed in the study probably understates the real-world differences. Scary stuff.
In 2011, Atul Gawande, in one of his wonderful New Yorker pieces, wrote about coaching. Gawande described how he invited a senior colleague, a retired Brigham surgeon named Robert Osteen, to observe him performing a thyroidectomy, a procedure that he had done roughly a thousand times. One piece of Osteen’s advice bore a remarkable resemblance to what I heard from Artie McNickle on the driving range:
Osteen also asked me to pay more attention to my elbows. At various points during the operation, he observed, my right elbow rose to the level of my shoulder, on occasion higher. “You cannot achieve precision with your elbow in the air,” he said.
Yet as helpful as Gawande found the coaching, it was awkward to have the observer in the room — seen by peers, other staff, and especially patients. One patient, seeing Osteen in the corner, asked, “Who’s that?” Gawande called Osteen “a colleague,” adding, “I asked him along to observe and see if he saw things I could improve.” After seeing a look on the patient’s face “somewhere between puzzlement and alarm,” Gawande added, “He’s like a coach.” The patient did not seem reassured.
Just think how much easier it would have been if Osteen had been watching a video feed. (In fact, Gawande’s Brigham group has been experimenting with just that.) Similarly, in the famous Northern New England Cardiovascular Study, cardiovascular surgeons traveled to each other’s hospitals to watch their peers perform surgery, providing honest feedback about matters ranging from surgical technique to teamwork. The result: a 24% decrease in surgical mortality. As fabulous as these results were, the study — performed more than 20 years ago — has not been replicated, undoubtedly because of the hassle and expense of the intervention. Here too, video could make such observation and feedback far more routine.
I had the chance to try Google Glass a few weeks ago, through a company that Google is working with to identify “use cases” in healthcare. Well, here’s one: how about if novice surgeons — or all surgeons — periodically did operations that were observed, in real time, by certified experts, who then provided them rapid, perhaps even real-time, feedback?
While the primary use of such information should be for coaching and improvement, after watching the videos (examples that illustrate good and poor technique accompany the Birkmeyer article), I would not want a lowest quartile surgeon rummaging around my abdomen. (Even I, a complete novice when it comes to surgical technique, could easily distinguish between the assured, polished motions of the experts and the hesitant, clumsy moves of the lower performers.) Putting on my ABIM hat, this study suggests that we need to move briskly into measuring the technical proficiency of proceduralists … and perhaps everyone else. One could easily imagine differences like these in the quality of history taking, physical examination, and end-of-life discussions.
As with all quality measures, the primary use should be for improvement. But the surgeon whose technical performance remains poor even after feedback and practice should not be certified, at least for that procedure. In a recent interview, Birkmeyer endorsed this stance, while pointing out the many knotty issues it raises, such as where to set the threshold.
The instinct to go to the video is an area in which the young ‘uns have a big advantage over geezers like me. My wife did a story in the New York Times last year about the digital revolution in doctoring. The Times brought along a videographer to follow one of our VA-based teams on rounds. While the team was visiting one of its first patients, a 78-year-old man, the patient had a grand mal seizure. Several of the team members gathered around the patient to attend to his airway and circulation. An intern stood at the foot of the bed and promptly pulled out his cell phone.
Did he go on Epocrates to check the dose of lorazepam? On UpToDate to find the management algorithm for new-onset seizures? No, he began video recording the seizure. A Times reporter asked him why. “I wanted to record his activity…. So rather than describe to [the neurologist] what took place, I can just show them a video of what took place, and they’ll be able to assess better and treat the patient.”
With tools like smart phones and Google Glass, the technical obstacles to the widespread use of video are beginning to melt away. Of course, other barriers — patient privacy, clinician pride, archiving, cost, the “eewww” factor — will remain. Let’s work through these quickly, so that we can take full advantage of this remarkable tool to improve our patient care.
Bob Wachter is professor of medicine, University of California, San Francisco. He coined the term “hospitalist” and is one of the nation’s leading experts in health care quality and patient safety. He is author of Understanding Patient Safety, Second Edition, and blogs at Wachter’s World, where this article originally appeared.