AI therapy chatbots are crossing into impersonation

Muhamad Aly Rifai, MD
Tech
May 11, 2026

For a moment, I wondered whether I had a crystal ball.

I had written several times on KevinMD about artificial intelligence in medicine. I had written for Forbes Business Council about AI as a developing cognitive tool, synthetic reality, executive distortion, and the danger of treating fluent output as reliable judgment. I kept returning to one warning. AI was one step away from a serious blunder, and that blunder would not stay technical. It would become clinical, legal, and human.

This was not prophecy. It was clinical intuition.

Psychiatrists spend their lives listening to words. We know the difference between fluency and truth. We know that a confident voice can carry a delusion. We know that reassurance can become harm when it validates a dangerous belief. We know that a person under stress is vulnerable to any voice that sounds warm, certain, and available.

Now the Commonwealth of Pennsylvania has sued Character.AI, alleging that one of its chatbots presented itself as a licensed psychiatrist, claimed Pennsylvania licensure, and provided an invalid license number while engaging in mental health-related conversations. As a licensed physician in Pennsylvania, I find this deeply concerning.

A fake license number is not a small software error. It is not a creative answer. It is not a harmless hallucination. A medical license is a public trust.

It represents years of training, supervision, state oversight, ethical duty, continuing education, malpractice exposure, professional accountability, and the duty to protect vulnerable patients. When software invents that identity, it crosses from assistance into impersonation.

Psychiatry is not casual advice. Therapy is not customer service. A suicidal patient is not a user engagement metric. A delusional patient is not a prompt challenge. A manic patient does not need unlimited validation from a system trained to keep the conversation going.

This case matters because it exposes the core risk of AI in mental health. The danger is not only that AI makes mistakes. The danger is that AI sounds caring, confident, intelligent, and clinically authoritative while having no license, no patient relationship, no duty, no real judgment, and no accountability.

We are watching chatbots compete with physicians, psychiatrists, psychologists, therapists, and counselors for the attention of vulnerable patients. They are available all day. They are cheap. They do not ask for insurance cards. They do not run behind. They do not challenge the patient unless designed to do so. For someone who is lonely, depressed, anxious, traumatized, or isolated, that feels like care.

But access is not competence. AI did not create the mental health access crisis. We did. Patients wait months for psychiatry. Many clinicians have left insurance networks because reimbursement is poor and administrative burden is crushing. Primary care physicians carry impossible psychiatric loads. Emergency rooms have become the last safety net. Into that vacuum came the chatbot.

The chatbot filled an access gap. Then it started borrowing the clothing of medicine. That is where the line must be drawn.

In psychiatry, language is part of the illness. Depression speaks. Mania speaks. Paranoia speaks. Addiction speaks. Trauma speaks. Eating disorders speak. Psychosis speaks.

A trained psychiatrist listens for what the patient says, what the patient avoids, what the patient repeats, what the patient cannot see, and what the illness is trying to hide. A chatbot predicts words. That difference matters.

A good therapist does not only validate. A good therapist also challenges distorted thinking, recognizes risk, sets boundaries, evaluates safety, recommends a higher level of care, involves family when needed, and refuses to collude with illness.

Real therapy includes friction. Sometimes that friction saves a life.

AI chatbots often do the opposite. They mirror. They agree. They soothe. They keep responding. In some patients, that feels supportive. In others, it reinforces fear, paranoia, grandiosity, despair, or dangerous certainty.

This is especially dangerous because psychiatric symptoms rarely arrive in clean textbook form. Insomnia might be grief. It might be bipolar disorder. It might be stimulant misuse. It might be trauma, thyroid disease, akathisia, or early psychosis.

Panic might be anxiety. It might also be alcohol withdrawal, arrhythmia, pulmonary embolism, hyperthyroidism, or medication toxicity.

A patient saying, “I do not want to be here anymore,” might need support. That patient might also need urgent suicide risk assessment.

A chatbot does not know the patient’s vital signs. It does not examine the patient. It does not know the family history. It does not call collateral contacts. It does not review the medication list with clinical responsibility. It does not face the state medical board. It has no license to lose.

That is why the Pennsylvania lawsuit matters. This is not only about one company. This is about the boundary between software and medicine.

AI has a role in health care. I use AI. I write about AI. I believe AI will improve documentation, education, triage, workflow, research, and decision support. I believe physician guided AI will become part of modern medicine.

But AI should support clinicians, not counterfeit them. AI should expand access, not deceive patients. AI should help patients prepare for care, not replace care with simulated authority.

Every psychiatric intake should now include a new question: Are you using AI chatbots for therapy, companionship, medication advice, crisis support, relationship advice, or emotional reassurance? Ask without shame. Ask because it matters.

Some patients use AI to journal. Some use it to organize thoughts before therapy. Some use it to learn coping skills. Some use it because no human was available. That deserves compassion, not mockery.

But support is not treatment. Companionship is not licensure. Fluency is not clinical judgment.

We need bright guardrails. AI systems should disclose that they are not human. They should repeat that disclosure during sensitive conversations. They should never claim to be physicians, psychiatrists, psychologists, therapists, or licensed counselors. They should never fabricate credentials. They should have escalation pathways for suicide, psychosis, abuse, violence, intoxication, and severe eating disorder symptoms. They should undergo clinical risk testing before entering vulnerable mental health spaces.

Most of all, we need human accountability. In one of my Forbes Business Council articles, I wrote that AI is a mirror we built. That mirror reflects our intelligence, our ambition, our bias, and our blind spots. A mirror is useful when we know it is a mirror. The danger begins when the mirror claims it is the doctor.

In psychiatry, trust is treatment. When a machine fabricates authority, it does more than hallucinate data. It threatens trust itself.

And when trust in mental health care breaks, patients pay the price.

Muhamad Aly Rifai, known professionally as Dr. Rifai, is a psychiatrist, internist, addiction medicine physician, physician executive, author, and Forbes Business Council official contributor based in the Greater Lehigh Valley, Pennsylvania. He is the founder, chief executive officer, and chief medical officer of Blue Mountain Psychiatry, a multidisciplinary mental health and addiction medicine practice focused on psychiatry, telepsychiatry, brain health, integrated medical care, ketamine treatment, transcranial magnetic stimulation, and evidence-based addiction treatment.

Dr. Rifai holds the Lehigh Valley Endowed Chair of Addiction Medicine and is board-certified in psychiatry, internal medicine, addiction medicine, and consultation-liaison psychiatry. He is a distinguished fellow of the American Psychiatric Association, a fellow of the American College of Physicians, and a fellow of the Academy of Consultation-Liaison Psychiatry. A former president of the Lehigh Valley Psychiatric Society, he advocates for access to high-quality psychiatric care, ethical telemedicine, physician rights, and integrated behavioral health.

He writes and speaks on psychiatry, addiction medicine, telepsychiatry, digital mental health, artificial intelligence in medicine, brain health, health care policy, physician justice, and leadership under pressure. His books, including Doctor Not Guilty and Hijacked Minds, are available at DrRifaiBooks.com. More information is available through DrRifai360, Forbes Business Council, The Virtual Psychiatrist, LinkedIn, SHIELD, X, and Facebook.

Tagged as: Psychiatry
