In medicine and law, professions that society relies upon for accuracy

Muhamad Aly Rifai, MD
Tech
May 20, 2025

In medicine and law, integrity and trust are foundational. But today, that trust is under assault—not from human error or negligence, but from the sophisticated yet disturbingly unreliable outputs of artificial intelligence (AI). What the media euphemistically calls “AI hallucinations” are not benign mistakes—they are dangerous fabrications that systematically undermine both clinical and legal standards.

As a psychiatrist, I confront the reality of hallucinations regularly. Patients vividly describe voices, visions, and sensations with profound distress. Hallucinations, in medical terms, represent profound and involuntary perceptual disturbances. In stark contrast, the inaccuracies spewed forth by large language models like ChatGPT are not involuntary misperceptions—they are systematic, plausible-sounding falsifications generated by probabilistic algorithms, devoid of ethical accountability.

A recent editorial by Robin Emsley in the journal Schizophrenia reveals the alarming truth: ChatGPT-generated references, presented with confidence and eloquence, are often wholly fabricated. Emsley described asking ChatGPT for literature references to support his research into structural brain changes with antipsychotic treatment. His initial enthusiasm quickly turned to dismay: of the five citations provided, one real reference was irrelevant to the topic, and three others simply did not exist. This was not an isolated case. Further research revealed even more disturbing statistics: of 115 medical citations generated by AI, 47 percent were entirely fabricated and another 46 percent were inaccurate, leaving only 7 percent both accurate and authentic.

Emsley emphasizes that calling these inaccuracies “hallucinations” is misleading and diminishes the gravity of real clinical hallucinations. Instead, these are “fabrications and falsifications,” a term that correctly assigns moral and professional weight to the AI-generated falsehoods.

Medicine isn’t the only domain suffering from AI’s plausible fabrications.

Consider the notorious incident involving attorney Steven Schwartz, who submitted legal arguments citing six court cases invented entirely by ChatGPT. The court was stunned, Schwartz was sanctioned, and the incident sent shockwaves through the legal community. Schwartz’s defense—that he didn’t know AI could produce false references—highlights a deeper systemic problem. Legal professionals, trained rigorously in verifying sources, were caught off guard by the convincing fabrications of AI, endangering justice and undermining public trust. Some federal courts now ban AI-drafted pleadings outright.

We see a disturbing parallel between these AI fabrications and the phenomenon of clinical hallucinations following sensory alterations.

This is vividly exemplified by the case of musical hallucinations after cochlear implantation. One woman, following cochlear implantation, experienced persistent and increasingly intrusive musical hallucinations. Initially gentle and non-intrusive, these hallucinations eventually became overwhelming. Interestingly, the music continued even when the implant was inactive, driven perhaps by the brain’s “parasitic memory” phenomenon—a desperate neurological attempt to fill sensory voids.

AI-generated fabrications, though algorithmic rather than neurological, similarly fill “informational voids,” creating plausible but false data to satisfy user queries. However, unlike the involuntary neurological processes, AI-generated falsehoods arise from inherent limitations of machine learning models—specifically, their probabilistic approach that blends bits of factual and false data seamlessly.
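The “probabilistic blending” described above can be made concrete with a toy sketch (all names, titles, and citation fragments below are invented for illustration, not drawn from any real model or database): a generator that samples each field of a citation independently produces output that looks like a reference but, taken as a whole, refers to nothing.

```python
import random

# Toy illustration: a "model" that has absorbed fragments of real-looking
# citations and recombines them probabilistically. Each field is plausible
# on its own, but the specific combination of author, title, journal, year,
# and pages almost never corresponds to an actual paper.
AUTHORS = ["Emsley R", "Smith J", "Garcia M"]
JOURNALS = ["Schizophrenia", "JAMA Psychiatry", "Lancet Psychiatry"]
TITLES = [
    "Structural brain changes with antipsychotic treatment",
    "Citation accuracy of large language models",
]

def fabricate_citation(rng: random.Random) -> str:
    """Sample each field independently, roughly as a next-token predictor
    does: plausible parts, unverified whole."""
    start = rng.randint(1, 900)  # plausible starting page
    return "{}. {}. {}. {};{}:{}-{}.".format(
        rng.choice(AUTHORS),
        rng.choice(TITLES),
        rng.choice(JOURNALS),
        rng.randint(2010, 2023),  # plausible year
        rng.randint(1, 60),       # plausible volume
        start,
        start + rng.randint(2, 15),
    )

print(fabricate_citation(random.Random(0)))
```

Every output passes a glance test for citation format, which is exactly why such fabrications slip past readers who equate fluency with accuracy.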

These issues raise critical ethical and professional questions.

How much can we safely rely on AI in professions built on verifiable truth? How do we hold an algorithm accountable when it deceives? Unlike human malpractice, AI errors currently have no clear accountability. ChatGPT cannot be sanctioned, sued, or disbarred. The responsibility and risk fall solely on the humans who use it.

The Psychiatric News piece “Moving from ‘hallucinations’ to ‘fabrications’” emphasizes the ethical imperative to shift terminology. Labeling AI outputs as hallucinations inadvertently trivializes real psychiatric experiences. By explicitly calling them “fabrications,” we highlight the deliberate and structured nature of these errors, making clear that these inaccuracies must trigger rigorous verification and accountability.

The potential consequences of continuing to overlook these fabrications are severe.


Imagine a scenario where medical students, pressed for time and resources, rely on AI-generated references without verification, inadvertently propagating false information. Clinical guidelines could become contaminated by inaccuracies. Physicians may unwittingly compromise patient safety, guided by fictitious evidence.

Similarly, in legal contexts, fabrications could erode foundational precedents and jurisprudence. Cases relying on AI-generated content could jeopardize justice, sending innocent individuals to prison or letting guilty parties escape accountability based on phantom precedents.

What steps should the medical and legal communities take?

  • Rigorous educational initiatives must inform professionals about AI limitations. Training should explicitly teach skepticism and the essential skill of cross-verifying AI-generated content.
  • Stringent standards for disclosing AI use must become mandatory in publications and court submissions, akin to conflict-of-interest disclosures. Such transparency will foster accountability and caution.
  • The AI industry must develop and enforce verification tools that flag fabricated citations or case law proactively, integrating ethical oversight directly into the technological framework.
  • We must advocate for clear terminology, distinguishing neurological “hallucinations” from intentional AI “fabrications,” maintaining both scientific integrity and ethical clarity.
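The verification-tool recommendation above can be sketched as a simple lookup against a trusted bibliographic index. The in-memory index and helper names here are hypothetical stand-ins for a real service such as PubMed or Crossref (network lookups omitted):

```python
# Sketch of the verification step: every AI-supplied citation title is
# checked against a trusted bibliographic index before it is accepted.
# A small in-memory set stands in for the real index; titles are
# normalized (lowercased, whitespace-collapsed) before matching.
KNOWN_TITLES = {
    "structural brain changes with antipsychotic treatment",
    "citation accuracy of large language models",
}

def normalize(title: str) -> str:
    return " ".join(title.lower().split())

def flag_fabrications(titles: list[str]) -> list[str]:
    """Return the titles that cannot be found in the index."""
    return [t for t in titles if normalize(t) not in KNOWN_TITLES]

suspect = flag_fabrications([
    "Structural brain changes with antipsychotic treatment",
    "Neural correlates of imaginary syndrome X",  # fabricated
])
print(suspect)  # prints ['Neural correlates of imaginary syndrome X']
```

The design point is that the burden of proof is inverted: a citation is treated as fabricated until the index confirms it, mirroring the skepticism the article urges professionals to adopt.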

Our trust in medicine and law has always hinged on accuracy and ethical standards. AI, for all its promises, currently challenges both. The damage already inflicted demands immediate response. We cannot passively accept AI-generated fabrications under the comforting illusion that they are harmless “hallucinations.” Real hallucinations cause genuine human suffering; fabricated evidence leads directly to real-world harm and injustice.

We stand at an ethical crossroads. We must actively decide whether we are willing to surrender critical standards of truth and accountability to convenience and technological expediency. As medical professionals sworn to uphold truth and justice, we must resist this erosion vigorously. Let’s reclaim our responsibility. Let’s clearly distinguish hallucinations from fabrications. Let’s remain vigilant guardians of the truth in the age of AI. Our patients, our clients, and our professions depend on it.

Muhamad Aly Rifai is a nationally recognized psychiatrist, internist, and addiction medicine specialist based in the Greater Lehigh Valley, Pennsylvania. He is the founder, CEO, and chief medical officer of Blue Mountain Psychiatry, a leading multidisciplinary practice known for innovative approaches to mental health, addiction treatment, and integrated care. Dr. Rifai currently holds the prestigious Lehigh Valley Endowed Chair of Addiction Medicine, reflecting his leadership in advancing evidence-based treatments for substance use disorders.

Board-certified in psychiatry, internal medicine, addiction medicine, and consultation-liaison (psychosomatic) psychiatry, Dr. Rifai is a fellow of the American College of Physicians (FACP), the American Psychiatric Association (FAPA), and the Academy of Consultation-Liaison Psychiatry (FACLP). He is also a former president of the Lehigh Valley Psychiatric Society, where he championed access to community-based psychiatric care and physician advocacy.

A thought leader in telepsychiatry, ketamine treatment, and the intersection of medicine and mental health, Dr. Rifai frequently writes and speaks on physician justice, federal health care policy, and the ethical use of digital psychiatry.

You can learn more about Dr. Rifai through his Wikipedia page, connect with him on LinkedIn, X (formerly Twitter), Facebook, or subscribe to his YouTube channel. His podcast, The Virtual Psychiatrist, offers deeper insights into topics at the intersection of mental health and medicine. Explore all of Dr. Rifai’s platforms and resources via his Linktree.
