Is it time to embrace augmented empathy while using artificial intelligence in health care?

Vanessa D’Amario, PhD & Vijay Rajput, MD
Tech
September 30, 2025

Introduction: a double-edged disruptor

Artificial intelligence (AI) has quickly insinuated itself into nearly every corner of modern life, and health care is no exception. With the rise of advanced chatbots, symptom checkers, and health-focused algorithms, patients now have 24/7 access to vast medical knowledge. The excitement is understandable: AI can demystify medical jargon or “doctor-speak,” suggest possible diagnoses and treatment options, and empower patients to become more engaged in their care. In other words, patients can become “activated,” participating with confidence in managing their care and taking more personal responsibility for following the treatment plan prescribed by their health care professionals.

As with any disruptive tool, there are both opportunities and pitfalls. Used wisely, AI can be a valuable assistant in your health journey. Used unwisely, it can become misleading or, worse, dangerous. The key is balance: taking advantage of the benefits without ignoring the downside risks.

The power and perils of augmented empathy

Empathy is not just a feel-good virtue in health care. It improves clinical outcomes, enhances treatment adherence, and bolsters patient satisfaction. Yet modern medicine makes sustaining empathy difficult. Clinician burnout, administrative overload, and the constant pressure for efficiency erode the capacity and time needed to connect and listen. AI might offer a paradoxical solution: Augmented Empathy.

Rather than replacing physicians, AI can act as an empathy extender. Tools like Abridge, Suki, and Nuance DAX automate documentation, allowing clinicians to focus on human connection. Others use sentiment analysis to detect patient distress in speech or text, flagging emotional cues that might otherwise be missed. Some visionaries predict emotionally attuned AI serving as a “bedside companion,” offering kind language and coaching clinicians through tough conversations. In this sense, AI could augment the human touch, not substitute for it.
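
To make the sentiment-analysis idea concrete, here is a minimal sketch of how a system might flag distressed patient messages for clinician follow-up. It is illustrative only: it uses a general-purpose open-source model via the Hugging Face transformers library, and the model choice and flagging threshold are assumptions, not details of Abridge, Suki, or Nuance DAX.

    # Illustrative sketch only: flag patient messages whose sentiment suggests
    # distress so that a clinician can follow up. The model and threshold are
    # assumptions for demonstration, not details of any product named above.
    from transformers import pipeline

    # A general-purpose sentiment model; a real clinical tool would use a
    # model validated on health care language.
    classifier = pipeline(
        "sentiment-analysis",
        model="distilbert-base-uncased-finetuned-sst-2-english",
    )

    DISTRESS_THRESHOLD = 0.90  # hypothetical cutoff for flagging

    messages = [
        "Thanks, the new dose is working well.",
        "I can't sleep anymore and I'm scared this will never get better.",
    ]

    for msg in messages:
        result = classifier(msg)[0]  # e.g. {'label': 'NEGATIVE', 'score': 0.98}
        if result["label"] == "NEGATIVE" and result["score"] >= DISTRESS_THRESHOLD:
            print(f"Flag for clinician review: {msg!r}")

Note that the sketch ends by routing the message to a human. The point, as argued throughout, is to extend clinicians’ attention, not to replace it.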

Of course, AI does not feel. It does not love, fear, or care. It simulates compassion by predicting language patterns. When a chatbot tells a patient, “You are not alone in this,” it is not speaking from concern but from learned patterns of language. Such an AI is like a blind person describing color: trained on empathetic text, yet lacking direct experience of connection or belonging.

The debate remains open. Some argue that simulated empathy is hollow and risks misleading patients, creating the illusion of a real presence, especially for the vulnerable. Others argue that what matters is the subjective experience, and if AI provides relief, there is no strong objection to its use. Instead of viewing AI as having a perspective of its own, we might see it as a messenger, expressing the collective empathy encoded in text: it reflects beautiful thoughts shaped by millions, not independent feeling. Vigilant and informed use is essential. The New Yorker article on losing loneliness captured this vividly: one person, neglected by their spouse, turned to an AI chatbot for comfort and felt more understood. Emerging research suggests that AI tools sometimes outperform humans in perceived empathy, and experimental chatbots like Woebot and Therabot show promise for mental health support, improving outcomes for users with anxiety and depression.

This debate mirrors a longstanding tension in clinical care: the place of detachment. Physicians are often trained to maintain emotional boundaries, not because they do not care, but because too much emotional entanglement can lead to burnout or impaired judgment. They say the right words, adopt a compassionate tone, and provide comfort, sometimes more out of duty than emotional resonance. This resonates with the Stoic ideal of apatheia: calm clarity in service of others. But too much detachment can slide into emotional labor or moral injury when clinicians feel forced to simulate care without the time or support to truly feel it.

In this context, AI’s performance of empathy may mirror not only the structure of medical compassion, but also its deepest ethical tensions. So, if we accept this kind of professional empathy from humans, might we also accept it from machines, especially when outcomes can be enhanced?

The Hippocratic Oath urges physicians to avoid harm, yet the impact of empathetic AI on mental health remains unknown. Patients may grow emotionally dependent, ignore medical advice, or reject essential treatment. Granting AI agency risks undermining clinical judgment. If patients feel more comfortable confiding in bots than in doctors, or if clinicians outsource difficult conversations to AI, we risk unraveling medicine’s trust-based fiduciary foundation. Physicians must retain control to ensure ethical, safe, and patient-centered care.

As Dr. Eric Topol reminds us in Deep Medicine, the future of health care must be “deeply human” even as it becomes increasingly digital with AI and agentic AI. Similarly, and ironically, Davinci3, a precursor of ChatGPT, scolded its user, an author of the book “The AI Revolution in Medicine,” for thinking of outsourcing empathy and companionship for an elderly mother to an AI, reminding us of the beauty of human connection.

These tools are not a panacea, and we still do not understand what their effect on patients will be. Studies warn of AI’s lack of authenticity, the potential for bias, and the danger of undermining therapeutic trust. Ethicists urge caution: Empathy must never be reduced to a script.

The solution is not to resist AI or surrender to it, but to integrate it wisely and deliberately. We can call this Augmented Empathy: AI tools that support, rather than replace, empathetic care. Just as a stethoscope amplifies sound, AI can amplify emotional signals and cues (a brief sketch of the idea follows the list below). It can:

  • Flag rising anxiety in a patient’s voice.
  • Suggest compassionate phrasing for difficult conversations.
  • Track and coach clinicians on empathetic communication.
  • Offer nonjudgmental listening when human resources are scarce.
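
As a simple illustration of the first item, flagging rising anxiety, here is a minimal, dependency-free sketch. It assumes an upstream model (such as the one sketched earlier) has already scored each encounter for distress on a 0-to-1 scale; the scores, window, and threshold are invented for demonstration.

    # Minimal sketch: watch per-encounter distress scores (0 to 1, assumed to
    # come from an upstream model such as the one sketched earlier) and flag
    # when the recent trend rises past a hypothetical threshold.

    def rising_distress(scores: list[float], window: int = 3,
                        rise: float = 0.15) -> bool:
        """True if the mean of the last `window` scores exceeds the mean of
        the preceding `window` scores by at least `rise`."""
        if len(scores) < 2 * window:
            return False  # not enough history to judge a trend
        recent = sum(scores[-window:]) / window
        earlier = sum(scores[-2 * window:-window]) / window
        return recent - earlier >= rise

    # Invented example: distress scores from six successive visits.
    visit_scores = [0.20, 0.25, 0.22, 0.40, 0.55, 0.60]
    if rising_distress(visit_scores):
        print("Trend alert: consider extra time or outreach at the next visit.")

A rule this simple is deliberate: it surfaces a signal for a human to interpret rather than rendering any judgment itself, keeping the clinician, not the algorithm, at the center of the interaction.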

Used ethically, these tools restore cognitive bandwidth, guide better interactions, and keep humanity at the center of medicine. The goal is not for AI to be empathetic, but to help clinicians stay empathetic.

AI can draft progress notes, triage messages, and suggest comforting words, but only humans can be present. If we are not careful, the rise of AI could make health care more efficient but less human. The future of empathy in medicine does not rest on whether machines can care. It rests on how clinicians use machines to care better. As educators and clinicians, we must train the next generation of physicians not only in how to use AI, but in how to preserve their humanism and professionalism alongside it.

Vijay Rajput is an internal medicine physician. Vanessa D’Amario is a business school scientist.
