AI in your health care: a double-edged digital disruptor

Alan P. Feren, MD
Tech
September 26, 2025

Introduction: a double-edged disruptor

Artificial intelligence (AI) has quickly insinuated itself into nearly every corner of modern life, and health care is no exception. With the rise of advanced chatbots, symptom checkers, and health-focused algorithms, patients now have 24/7 access to vast medical knowledge in seconds. The excitement is understandable: AI can demystify medical jargon (“doctor-speak”), suggest possible diagnoses and treatment options, and empower patients to become more engaged in their care. Engaged patients become “activated,” meaning they participate with confidence in managing their care and take more personal responsibility for following the treatment plan prescribed by their health care professionals.

As with any disruptive tool, there are opportunities and pitfalls. Used wisely, AI can be a valuable assistant in your health journey. Used unwisely, it can be misleading or, worse, dangerous. The key is balance: taking advantage of the benefits without ignoring the risks.

The upside: What AI can do for patients

  • Easy access to medical information 24/7, without waiting for appointments.
  • Improved health literacy, with medical jargon translated into understandable language.
  • Decision support: helps patients generate better questions and identify red flags.
  • Convenience and potential cost savings by triaging minor concerns at home (similar in many respects to existing nurse advice lines).
  • Personalization through reminders, lifestyle suggestions, and chronic disease support.
  • Record organization, including summaries of test results and visit notes (though most of these features are already available in EMR patient portals, where one is in use).
  • An educational role in prevention, offering evidence-based lifestyle strategies.

The downside: What AI cannot (and should not) do

  • Risk of misdiagnosis if symptoms are oversimplified or context is missing.
  • Delays in professional care when patients over-rely on AI.
  • False confidence in AI outputs, which cannot replace exams or lab testing.
  • Data privacy and security concerns, with sensitive health data potentially stored or shared.
  • Bias in algorithms, which may produce less accurate results for underrepresented populations.
  • No accountability if something goes wrong.
  • Anxiety from long differential lists, sometimes highlighting rare diseases.
  • Inadequacy in complex cases requiring nuanced judgment or physical exam.

Best practices: How to use AI safely in your health care

Given all of this information, how is a patient to know how to use AI safely for personal well-being and, more importantly, health care? The concept of “best practices” must be applied here as it is in many other industries: finance, cybersecurity, manufacturing, engineering, aviation, transportation, construction, and architecture. A best-practices approach to AI in health care is the only way to protect patient safety, to keep the guidance it provides consistent, reliable, and useful, and to keep AI an adjunct to, not a replacement for, professional care. Articulated below is an AI best-practices outline for consideration:

  • Use AI for education, NOT diagnosis: It is excellent for preparing questions, looking up/clarifying medical terms, and exploring general guidelines — but it is not your doctor.
  • Know the red flags that always require care: Chest pain, shortness of breath, neurological deficits, severe pain, heavy and/or continuous bleeding, allergic reactions, rapidly worsening symptoms, and emergencies in infants or pregnancy all mean skip the AI and call 911 or the appropriate resource for help.
  • Provide full context in your queries: Age, sex, relevant medical history (especially involving the heart, lungs, kidneys, or liver), medications, allergies, exposures, and objective numbers (like blood pressure or glucose). Full context makes the outputs more useful.
  • Demand probabilities, not certainties: Ask for several possible explanations, what could make each more or less likely, and which are dangerous to miss.
  • Verify with authoritative sources: Always cross-check AI answers with NIH, CDC, specialty societies, or your own clinician.
  • Never change prescriptions or a treatment plan based on AI: Do not start, stop, or adjust medications without medical supervision.
  • Protect your privacy: Limit personal identifiers when using AI tools; be mindful of how your data may be stored or shared. Be aware that most consumer AI tools are not HIPAA compliant.
  • Recognize bias and blind spots: AI may miss conditions in women, children, older adults, and underrepresented groups.
  • Use AI as a prep tool for your doctor visit: Summarize your concern, timeline, signs and symptoms, what makes them better/worse, treatments tried, and top questions.
  • Trust your instincts: You know your body better than anyone. If something feels wrong, seek real care irrespective of what AI suggests.

In conclusion: an assistant or partner, but not a replacement

AI has enormous potential to empower patients, increase understanding, and streamline care — but it must be treated as an adjunct or partner tool, not a replacement for clinicians. Keep AI’s limits in mind, verify its output against trusted sources, and follow best practices. AI is most useful for helping patients become more informed and engaged, while accountability remains with licensed professionals.

The future of health care is increasingly digital with EMRs, Google, and now AI — but safety, individual context, and human judgment will always remain the essential ingredients for optimal clinical outcomes.

AI safety and best practices: one-page guide for patients

When to use AI

  • Education on conditions, terminology, and guidelines
  • Preparing questions for your doctor
  • Lifestyle and wellness information (diet, sleep, exercise)

When NOT to use AI

  • To make or confirm a diagnosis
  • To decide on urgent/emergency care
  • To start, stop, or change prescription meds/doses

Red flag situations: Seek immediate care

  • Chest pain, trouble breathing, new weakness/numbness/confusion
  • Severe headache “worst ever,” high fever with stiff neck
  • Severe abdominal pain, heavy/continuous bleeding, allergic reaction
  • Rapidly worsening symptoms, pregnancy concerns, infants with illness
  • Bleeding that continues despite pressure, dressings, or ice

Best practices for safe use

  • Give complete context (age, history, meds, allergies, exposures)
  • Ask AI for probabilities, uncertainty, and danger signs
  • Cross-check AI with NIH, CDC, Mayo Clinic, Cleveland Clinic, or specialty societies
  • Protect privacy: Avoid sharing identifying information
  • Do not self-medicate or change prescriptions based on AI advice

Practical tips

  • Use AI as a prep tool for your doctor visit
  • Keep a concise note with concern, timeline, questions, meds/allergies, key numbers (see the sample note below)
  • Document symptoms and test results with dates
  • Trust your instincts
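
To make the concise note concrete, here is one sample; the patient details are invented purely for illustration:

“Concern: dry cough for three weeks, worse at night. Timeline: started October 1 after a cold; low-grade fever (100.8°F) on October 5, now resolved. Meds/allergies: lisinopril 10 mg daily; penicillin allergy. Key numbers: home blood pressure 128/82. Questions: could the lisinopril be causing the cough? Do I need a chest X-ray?”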

Quick prompt template

“I am using this for education only. Give possible causes, danger signs, and questions to ask my clinician. I am a [age/sex] with [conditions/meds/allergies]. My symptoms started [when] and are [better/worse with what]. What would make this urgent?”
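
As a worked example, the template might be filled in like this (again, the details are hypothetical):

“I am using this for education only. Give possible causes, danger signs, and questions to ask my clinician. I am a 58-year-old man with type 2 diabetes, taking metformin, with no known allergies. My symptoms started three days ago and are worse at night. What would make this urgent?”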

Alan P. Feren is an otolaryngologist.

