Reducing medical errors from health care AI: lessons from Claude Shannon and Max Planck on precision in medicine

Neil Anand, MD
Tech
November 17, 2024

In 1948, Claude Shannon revolutionized the world of communication with his theory of information, showing that precision and efficiency could emerge from chaos. Nearly half a century earlier, Max Planck had done something similar in physics by introducing the quantum hypothesis, reducing uncertainty in an unpredictable universe. These two minds, though working in entirely different fields, shared a common vision: to bring order out of entropy. Today, their legacies hold surprising relevance in one of the most advanced frontiers of modern medicine: artificial intelligence (AI) in health care.

AI has become an essential tool in diagnosing diseases, predicting patient outcomes, and guiding complex treatments. Yet, despite the promise of precision, AI systems in health care remain susceptible to a dangerous form of entropy—a creeping disorder that can lead to systemic errors, missed diagnoses, and faulty recommendations. As more hospitals and medical facilities rely on these technologies, the stakes are as high as ever. The dream of reducing medical error through AI has, in some cases, transformed into a new breed of error, one rooted in the uncertainty of the machine’s algorithms.

In Shannon’s world, noise was the enemy: any interference that could distort or corrupt a message as it moved from sender to receiver. To combat this, Shannon developed systems of redundancy and error correction, ensuring that even in the presence of noise, the message could still be received with clarity. The application of his ideas to health care AI is strikingly direct: (1) the “message” is the patient’s medical data, a series of symptoms, imaging results, and historical records; (2) the “noise” is everything that distorts the AI’s translation of that data into accurate diagnoses and treatment plans.
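Shannon’s redundancy idea can be made concrete with the simplest error-correcting scheme, a repetition code. This is purely an illustrative sketch, not code from any medical system: each bit is transmitted three times, and a majority vote at the receiver corrects any single flipped copy.

```python
from collections import Counter

def encode(bits, r=3):
    """Repetition code: send each bit r times (Shannon-style redundancy)."""
    return [b for bit in bits for b in [bit] * r]

def decode(received, r=3):
    """Majority vote over each block of r copies corrects isolated flips."""
    return [Counter(received[i:i + r]).most_common(1)[0][0]
            for i in range(0, len(received), r)]

message = [1, 0, 1, 1]
sent = encode(message)
sent[4] ^= 1                     # "noise" flips one transmitted copy
assert decode(sent) == message   # majority vote recovers the message
```

The cost of the redundancy is bandwidth, which is exactly the tradeoff Shannon formalized: a system can tolerate noise only by carrying more information than the bare message requires.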

In theory, these health care artificial intelligence programs have the capacity to process vast amounts of data, identifying even the most subtle patterns while filtering out irrelevant noise, ultimately making valuable predictions about future behaviors and outcomes. Even more impressive, the software becomes smarter with each use. The fact that machine learning algorithms aren’t more prevalent in modern medical practice likely has more to do with limitations in data availability and computing power than with the validity of the technology itself. The concept is solid, and if machine learning isn’t fully integrated now, it’s certainly on the horizon.

Health care professionals must realize that machine learning researchers grapple with the constant tradeoff between accuracy and intelligibility. Accuracy refers to how often the algorithm provides the correct answer. Intelligibility, on the other hand, relates to our ability to understand how or why the algorithm reached its conclusion. As machine learning software grows more accurate, it often becomes less intelligible because it learns and improves without relying on explicit instructions. The most accurate models frequently become the least understandable, and vice versa. This forces machine learning developers to strike a balance, deciding how much accuracy they’re willing to sacrifice to make the system more understandable.

The problem emerges when health care AI, fed vast amounts of data, begins to lose clarity in its predictions. One case involved an AI diagnostic system used to predict patient outcomes for those suffering from pneumonia. The system performed well, except in one critical instance: it concluded that pneumonia patients with asthma were at lower risk of death. The correlation was real in the historical data, because asthma patients had been treated more aggressively, but as a statement about underlying risk it was dangerously wrong. Here, health care AI created informational noise, a confounded pattern that led to a critical misinterpretation of risk.
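The mechanism behind that failure can be sketched with a toy simulation. All numbers here are hypothetical and `simulate_patient` is an invented model, but the structure matches the case: asthma truly raises risk, yet because asthma patients are routinely escalated to aggressive care, their observed death rate in the historical record ends up lower.

```python
import random

random.seed(0)

def simulate_patient(has_asthma):
    """Hypothetical toy model: asthma raises baseline pneumonia mortality,
    but asthma patients always receive aggressive care, which cuts risk."""
    base_risk = 0.25 if has_asthma else 0.15
    aggressive_care = has_asthma or random.random() < 0.10
    risk = base_risk * (0.3 if aggressive_care else 1.0)
    return random.random() < risk  # True means the patient died

cohort = [(a, simulate_patient(a)) for a in [True] * 2000 + [False] * 8000]

def death_rate(asthma):
    group = [died for has, died in cohort if has == asthma]
    return sum(group) / len(group)

# A model trained naively on this record "learns" that asthma is protective.
print(f"asthma: {death_rate(True):.3f}, no asthma: {death_rate(False):.3f}")
```

The observed asthma death rate comes out lower than the non-asthma rate, even though the simulated biology says the opposite: the treatment policy, not the disease, drives the pattern the algorithm absorbs.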

Shannon’s solution to noise was error correction, ensuring that the system could detect when something was wrong. In the same way, health care AI needs robust feedback loops, automated methods of identifying when its conclusions stray too far from reality. Just as Shannon’s redundancy codes can correct transmission errors, health care AI systems should be designed with self-correction capabilities that can recognize when predictions are distorted by data biases or statistical outliers.
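What such a feedback loop might look like in miniature: a monitor that tracks recent prediction errors in a sliding window and raises an alarm when the error rate drifts past a threshold. The class and its parameters are invented for illustration; real clinical monitoring would be far more involved.

```python
from collections import deque

class DriftMonitor:
    """Flag when the recent prediction error rate drifts past a threshold,
    a crude analogue of Shannon-style error detection for a deployed model."""

    def __init__(self, window=100, threshold=0.2):
        self.errors = deque(maxlen=window)
        self.threshold = threshold

    def record(self, predicted, actual):
        """Log one prediction/outcome pair and report the alarm state."""
        self.errors.append(predicted != actual)
        return self.alarm()

    def alarm(self):
        if len(self.errors) < self.errors.maxlen:
            return False  # not enough evidence to judge yet
        return sum(self.errors) / len(self.errors) > self.threshold

monitor = DriftMonitor(window=50, threshold=0.1)
for _ in range(50):
    monitor.record(1, 1)   # predictions match observed outcomes
assert not monitor.alarm()
for _ in range(10):
    monitor.record(1, 0)   # the model starts missing
assert monitor.alarm()     # 10/50 = 20% error rate trips the threshold
```

The point is not the specific threshold but the architecture: the system compares its own output against reality continuously, rather than trusting its training-time accuracy forever.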

Max Planck likewise brought precision to the unpredictable world of subatomic particles. His quantum theory rested on the understanding that the universe, at its smallest scales, is not chaotic but governed by discrete laws. That insight transformed physics, allowing scientists to predict outcomes with extraordinary accuracy. In health care AI, precision is equally important, yet the unpredictability of machine learning algorithms often mirrors the chaotic picture of the universe that Planck sought to tame. Planck’s brilliance was recognizing that if complex systems are broken down into small, manageable units, precision can be achieved.

In the case of health care AI, precision can be achieved by ensuring that the training data is representative of all patient demographics. If health care AI is to reduce medical entropy, it must be trained and retrained on diverse datasets, ensuring that its predictive models apply equally across racial, ethnic, and gender lines. Just as Planck’s discovery of quantum “packets” brought precision to physics, diversity in AI data can bring precision to health care AI’s medical judgments.
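One concrete way to act on this is a subgroup audit: compute the model’s accuracy separately for each demographic group and flag large gaps. The helper functions and the audit records below are hypothetical, a minimal sketch of the idea rather than any established fairness toolkit.

```python
def subgroup_accuracy(records):
    """Per-group accuracy; records are (group, predicted, actual) triples."""
    totals, correct = {}, {}
    for group, pred, actual in records:
        totals[group] = totals.get(group, 0) + 1
        correct[group] = correct.get(group, 0) + (pred == actual)
    return {g: correct[g] / totals[g] for g in totals}

def parity_gap(accuracies):
    """Spread between the best- and worst-served groups."""
    return max(accuracies.values()) - min(accuracies.values())

# Hypothetical audit records: (group, model prediction, true outcome).
records = ([("A", 1, 1)] * 90 + [("A", 1, 0)] * 10 +
           [("B", 1, 1)] * 70 + [("B", 1, 0)] * 30)

acc = subgroup_accuracy(records)
if parity_gap(acc) > 0.15:
    print(f"parity gap {parity_gap(acc):.2f}: retrain on more diverse data")
```

A single aggregate accuracy number would hide exactly the disparity this audit surfaces; the per-group breakdown is what tells you which populations the model is underserving.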

Medical AI errors are unlike the traditional human errors of misdiagnosis or surgical mistakes. They are systemic errors often rooted in the data, algorithms, and processes that underpin the AI systems themselves. These errors arise not from negligence or fatigue but from the very foundation of AI design. It is here that Shannon and Planck’s principles become vital. Take, for example, a health care AI system deployed to predict which patients in the ICU are at the highest risk of death. If the AI system misinterpreted patient data to such an extent that it predicted lower-risk patients would die sooner than high-risk ones, the AI would prompt doctors to focus attention on the wrong individuals. One could envision how uncontrolled AI-driven medical entropy would cause increasing disorder in our health care system, leading to catastrophic results.
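An inverted risk ranking of the kind described above is detectable before it does harm. A simple sanity check is pairwise concordance, in the spirit of the C-statistic: across all (died, survived) patient pairs, how often did the model assign the higher risk to the patient who died? The cohorts below are invented toy data.

```python
def concordance(pairs):
    """Fraction of (died, survived) pairs where the model gave the patient
    who died the higher predicted risk; ties count as half credit."""
    died = [risk for risk, outcome in pairs if outcome]
    lived = [risk for risk, outcome in pairs if not outcome]
    scores = [(d > s) + 0.5 * (d == s) for d in died for s in lived]
    return sum(scores) / len(scores)

# Hypothetical ICU cohorts: (predicted risk, died?).
good_model = [(0.9, True), (0.8, True), (0.3, False), (0.2, False)]
inverted   = [(0.2, True), (0.3, True), (0.8, False), (0.9, False)]

assert concordance(good_model) == 1.0  # ordering matches reality
assert concordance(inverted) == 0.0    # model ranks patients backwards
```

A concordance near 0.5 means the model ranks no better than chance, and anything well below it is the catastrophic inversion described above: the model is systematically steering attention toward the wrong patients.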

Human lives are on the line, and each misstep in the AI algorithm represents a potential catastrophe. Much like quantum systems that evolve based on probabilities, health care AI systems must be adaptive, learning from their errors, recalibrating based on new data, and continuously refining their predictive models. This is how entropy is reduced in an environment where the potential for chaos is ever-present. While AI in health care promises to revolutionize medicine, the cost of unmanaged entropy is far too high. When AI systems fail, it is not just a matter of missed phone calls or dropped internet connections—it is the misdiagnosis of cancer, the incorrect assignment of priority in the ICU, or the faulty prediction of survival rates.

Health care AI systems must be designed with real-time feedback that mimics Shannon’s error-correcting codes. These feedback loops can identify when predictions deviate from reality and adjust accordingly, reducing the noise that leads to AI misdiagnoses or improper AI treatment plans. Just as Planck achieved precision through a detailed understanding of atomic behavior, health care AI must reach its potential by accounting for the diversity of human biology. The more diverse the data, the more precise and accurate the health care AI becomes, ensuring that its predictions hold true for all patients.

Claude Shannon and Max Planck taught us that accuracy matters. The health care AI systems we build must reflect their commitment to precision. Just as Shannon fought against noise and Planck sought order from chaos, health care AI must strive to reduce the entropy of errors that currently plague it. It is only by incorporating robust error correction, embracing data diversity, and ensuring continuous learning that health care AI can fulfill its promise of improving patient outcomes without introducing new dangers. The future of medicine, like the future of communication and physics, depends on our ability to tame uncertainty and bring order to complex systems. Shannon and Planck showed us how, and now it’s time for health care AI to follow their lead. In the end, reducing health care AI entropy is not just about preventing miscommunication or miscalculation—it’s about saving human lives.


Neil Anand is an anesthesiologist.

Tagged as: Health IT
