It was a warm summer in Maine, but it felt like a cold winter in the confines of the hospital building. I was seated next to a physician who was quietly working at a big-screen monitor. It was a slow workday, and I was pondering tips and tricks to make physician workflows more efficient while training providers on electronic health records. I stared at the computer screen as the physician worked. She was ordering a controlled substance for a patient and, as required by law, reviewed the prescription drug monitoring program (PDMP) database. The patient had a high overdose risk score.
We both paused. Was the patient drug shopping? The physician overrode the warning and prescribed the medication. “These are false flags,” she exclaimed. “The patient has a complex medical history and needs this medication.” Here was a physician who thought outside algorithms, paid attention, and knew her patient. I started to feel uneasy about the processes for prescribing opioids.
Was the PDMP using a flawed algorithm to label patients?
My mind raced back to when I was under anesthesia for a tooth extraction. I had awakened unexpectedly in the middle of the procedure and, in my dreamlike state, needed to be restrained to undergo more sedation.
My wisdom teeth were out, and the team was ecstatic to find there wasn’t any permanent nerve damage, as it had been a complicated extraction. I was prescribed pain meds, but they weren’t enough to manage my pain. My provider wrote me another script the following day. I went to the local drugstore to fill it, and after 20 minutes, the pharmacist approached me. He explained that he couldn’t fill the prescription. I was bewildered, and he could see it in my eyes.
I hadn’t slept the night before because I was in severe pain; this medication would give me the relief I desperately needed. I couldn’t grasp what was going on. I didn’t recall ever being prescribed opioids in the past. Was I labeled as a drug seeker? I shrugged off the thought and tried to explain myself to the pharmacist, who was standing at a distance but directly in front of me. My face was swollen, and I had gauze in my mouth, which made it hard to communicate. Still, I managed to explain why I had been prescribed another medication to manage my pain. He wasn’t convinced and went back behind the counter. As I looked around, I could see shoppers staring at me. I urged the pharmacist to call the doctor’s office to confirm my story. After another 20 minutes, I received my script.
Was the pharmacist basing his analysis on a flawed algorithm?
The United States has statewide PDMPs whose opioid prescribing algorithms mine prescription data for drug-seeking patterns, an effort to reduce the opioid overdose crisis. In the Wired article “The Pain Was Unbearable. So Why Did Doctors Turn Her Away?” author Maia Szalavitz explained, “There are state-level prescription drug databases – electronic registries that track scripts for certain controlled substances in real-time; it gives authorities a set of eyes onto the pharmaceutical market. The monitoring process follows the number of pharmacies a patient has visited, combinations of prescriptions received, and health care organizations visited.”
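To make that mechanism concrete, here is a minimal Python sketch of how a pattern-based risk score of this kind might work. The weights, thresholds, and field names are my own illustrative assumptions, not the scoring model of any actual PDMP; the point is that counting pharmacies and prescribers, with no clinical context, is enough to flag a legitimate patient.

```python
from dataclasses import dataclass

# Hypothetical sketch of a naive PDMP-style risk score.
# The weights, caps, and field names below are illustrative
# assumptions, not the model used by any real state PDMP.

@dataclass
class PrescriptionHistory:
    pharmacies_visited: int        # distinct pharmacies in the lookback window
    prescribers_seen: int          # distinct prescribers/organizations visited
    overlapping_sedatives: bool    # e.g., an opioid-benzodiazepine combination

def overdose_risk_score(history: PrescriptionHistory) -> int:
    """Return a crude 0-100 risk score built from prescription patterns alone."""
    score = 0
    score += min(history.pharmacies_visited, 5) * 10   # more pharmacies -> higher score
    score += min(history.prescribers_seen, 5) * 8      # more prescribers -> higher score
    if history.overlapping_sedatives:
        score += 20                                    # flagged drug combination
    return min(score, 100)

# A cancer patient seeing an oncologist, a surgeon, and a palliative-care
# specialist, each near a different pharmacy, looks "high risk" here even
# though every prescription is legitimate.
patient = PrescriptionHistory(pharmacies_visited=3,
                              prescribers_seen=4,
                              overlapping_sedatives=True)
print(overdose_risk_score(patient))  # 82 -- flagged, with no clinical context
```

Under a score like this, a patient juggling several specialists can look identical to a doctor shopper, which is exactly the false-flag problem the physician beside me recognized.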
But the system overrides physician decisions and experience. It also creates a big-brother apparatus more akin to a scene from the book 1984 than to what we have come to expect from interactions with our medical providers. What generates a flag is unknown, and the results can create a confusing playing field for those in genuine need of medication. According to the article, “20% of patients that were flagged by the algorithm as drug seekers are cancer patients with complex medical treatment.”
Do we really need to put cancer patients on the same list as Walter White from Breaking Bad?
The future of medicine will rely more and more on algorithm-driven artificial intelligence. Such technologies make medicine more efficient, but they can also lead to false conclusions if doctors become too dependent on them. After all, even though it is a computer algorithm, it was devised by a human. Like a social media platform, an artificial intelligence built on an extensive data system constantly grows and changes depending on its input. This means that physicians need to stay aware, not rely solely on system conclusions, and keep looking for zebras.
Note: The anecdote in the first two paragraphs is a fictionalization of my real-world experiences. It is not based on a specific case, and I am respectful of patients’ stories and HIPAA regulations. I have never worked in Maine or even been there, though it would be nice to visit.
Afua Aning is a physician informaticist.
Image credit: Shutterstock.com