Is it time to embrace augmented empathy while using artificial intelligence in health care?

Vanessa D’Amario, PhD & Vijay Rajput, MD
Tech
September 30, 2025

Introduction: a double-edged disruptor

Artificial intelligence (AI) has quickly insinuated itself into nearly every corner of modern life, and health care is no exception. With the rise of advanced chatbots, symptom checkers, and health-focused algorithms, patients now have 24/7 access to vast medical knowledge at their fingertips. The excitement is understandable: AI can demystify medical jargon, or “doctor-speak,” suggest possible diagnoses and treatment options, and empower patients to become more engaged, or “activated,” in their care: participating with confidence, managing their own health, and taking more personal responsibility for following the treatment plan prescribed by their health care professionals.

As with any disruptive tool, there are both opportunities and pitfalls. Used wisely, AI can be a valuable assistant on your health journey. Used unwisely, it can be misleading or, worse, dangerous. The key is balance: taking advantage of the benefits without ignoring the risks.

The power and perils of augmented empathy

Empathy is not just a feel-good virtue in health care. It improves clinical outcomes, enhances treatment adherence, and bolsters patient satisfaction. Yet modern medicine makes sustaining empathy difficult. Clinician burnout, administrative overload, and the pressure for efficiency erode the capacity and time needed to connect and listen. AI might offer a paradoxical solution: augmented empathy.

Rather than replacing physicians, AI can act as an empathy extender. Tools like Abridge, Suki, and Nuance DAX automate documentation, allowing clinicians to focus on human connection. Others use sentiment analysis to detect patient distress in speech or text, flagging emotional cues that might otherwise be missed. Some visionaries predict emotionally attuned AI serving as a “bedside companion,” offering kind language and coaching clinicians through tough conversations. In this sense, AI could augment the human touch, not substitute for it.

Of course, AI does not feel. It does not love, fear, or care. It simulates compassion by predicting language patterns. When a chatbot tells a patient, “You are not alone in this,” it is not speaking from concern but from predicted patterns. It is like a blind person describing color: trained on empathetic text, yet lacking any direct experience of connection or belonging.

The debate remains open. Some argue that simulated empathy is hollow and risks misleading patients, creating the illusion of a real presence, especially for the vulnerable. Others argue that what matters is the subjective experience, and that if AI provides relief there is no strong objection to its use. Instead of viewing AI as having a perspective of its own, we might see it as a messenger, expressing the collective empathy encoded in text. It reflects beautiful thoughts shaped by millions, not independent feeling. Vigilant and informed use is essential. A New Yorker article on losing loneliness captured this vividly: one person, neglected by their spouse, turned to an AI chatbot for comfort and felt more understood. Emerging research suggests that AI tools sometimes outperform humans in perceived empathy. Experimental chatbots like Woebot and Therabot show promise for mental health support, improving outcomes for users with anxiety and depression.

This debate mirrors a longstanding tension in clinical care: the place of detachment. Physicians are often trained to maintain emotional boundaries, not because they do not care, but because too much emotional entanglement can lead to burnout or impaired judgment. They say the right words, adopt a compassionate tone, and provide comfort, sometimes more out of duty than emotional resonance. This resonates with the Stoic ideal of apatheia: calm clarity in service of others. But too much detachment can slide into emotional labor or moral injury when clinicians feel forced to simulate care without the time or support to truly feel it.

In this context, AI’s performance of empathy may mirror not only the structure of medical compassion, but also its deepest ethical tensions. So, if we accept this kind of professional empathy from humans, might we also accept it from machines, especially when outcomes can be enhanced?

The Hippocratic Oath urges physicians to avoid harm, yet the impact of empathetic AI on mental health remains unknown. Patients may grow emotionally dependent, ignore medical advice, or reject essential treatment. Granting AI agency risks undermining clinical judgment. If patients feel more comfortable confiding in bots than in doctors, or if clinicians outsource difficult conversations to AI, we risk unraveling medicine’s trust-based fiduciary foundation. Physicians must retain control to ensure ethical, safe, and patient-centered care.

As Dr. Eric Topol reminds us in Deep Medicine, the future of health care must be “deeply human” even as it becomes increasingly digital with AI and agentic AI. Similarly, and ironically, a precursor of ChatGPT, Davinci3, scolded the author of “The AI Revolution in Medicine” for considering outsourcing empathy and companionship for an elderly mother to an AI, reminding us of the beauty of human connection.

These tools are not a panacea, and their effects on patients are not yet well understood. Studies warn of AI’s lack of authenticity, its potential for bias, and the danger of undermining therapeutic trust. Ethicists urge caution: empathy must never be reduced to a script.

The solution is not to resist AI or surrender to it, but to integrate it wisely and deliberately. We can call this augmented empathy: AI tools that support, rather than replace, empathetic care. Just as a stethoscope amplifies sound, AI can amplify emotional signals and cues (a minimal sketch of the first capability appears after this list). It can:

  • Flag rising anxiety in a patient’s voice.
  • Suggest compassionate phrasing for difficult conversations.
  • Track and coach clinicians on empathetic communication.
  • Offer nonjudgmental listening when human resources are scarce.
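
To make the first capability concrete, here is a minimal, illustrative sketch of how a text-based distress flag might work, using an off-the-shelf sentiment classifier from the open-source Hugging Face transformers library. The model choice, threshold, and flag_distress helper are our illustrative assumptions, not a validated clinical tool; any real deployment would require a clinically validated model and human review of every flag.

# A minimal sketch (not a clinical tool): flag patient messages that an
# off-the-shelf sentiment classifier scores as strongly negative.
# The model choice and threshold below are illustrative assumptions.
from transformers import pipeline

# General-purpose English sentiment model; a real system would need a
# clinically validated model and human review of every flag it raises.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

def flag_distress(messages, threshold=0.9):
    """Return (message, score) pairs the classifier rates as strongly negative."""
    flagged = []
    for msg in messages:
        result = classifier(msg)[0]  # e.g. {"label": "NEGATIVE", "score": 0.98}
        if result["label"] == "NEGATIVE" and result["score"] >= threshold:
            flagged.append((msg, result["score"]))
    return flagged

if __name__ == "__main__":
    portal_messages = [
        "Thanks, the new dose is working and I slept well.",
        "I can't stop crying and I'm scared the pain means it's back.",
    ]
    for msg, score in flag_distress(portal_messages):
        print(f"FLAG for clinician review ({score:.2f}): {msg}")

The point of such a tool is to surface cues for a human clinician to act on, never to respond on its own.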

Used ethically, these tools restore cognitive bandwidth, guide better interactions, and keep humanity at the center of medicine. The goal is not for AI to be empathetic, but to help clinicians stay empathetic.

AI can draft progress notes, triage messages, and suggest comforting words, but only humans can be present. If we are not careful, the rise of AI could make health care more efficient but less human. The future of empathy in medicine does not rest on whether machines can care. It rests on how clinicians use machines to care better. As educators and clinicians, we must train the next generation of physicians not only in how to use AI, but in how to preserve their humanism and professionalism alongside it.

Vijay Rajput is an internal medicine physician. Vanessa D’Amario is a business school scientist.

Tagged as: Health IT
