KevinMD

In medicine and law, professions that society relies upon for accuracy

Muhamad Aly Rifai, MD
Tech
May 20, 2025

Integrity and trust are foundational to both professions. But today, that trust is under assault, not from human error or negligence, but from the sophisticated yet disturbingly unreliable outputs of artificial intelligence (AI). What the media euphemistically calls “AI hallucinations” are not benign mistakes; they are dangerous fabrications that systematically undermine both clinical and legal standards.

As a psychiatrist, I confront the reality of hallucinations regularly. Patients vividly describe voices, visions, and sensations with profound distress. Hallucinations, in medical terms, represent profound and involuntary perceptual disturbances. In stark contrast, the inaccuracies spewed forth by large language models like ChatGPT are not involuntary misperceptions—they are systematic, plausible-sounding falsifications generated by probabilistic algorithms, devoid of ethical accountability.

A recent editorial by Robin Emsley in the journal Schizophrenia reveals the alarming truth: ChatGPT-generated references, presented with confidence and eloquence, are often wholly fabricated. Emsley described asking ChatGPT for literature references to support his research into structural brain changes with antipsychotic treatment. Initially impressed, he quickly grew dismayed. Of the five citations ChatGPT provided, several were entirely fictitious or grossly inaccurate: one real reference was irrelevant, and three others simply did not exist. This was not an isolated case. Additional research highlighted even more disturbing statistics: of 115 medical citations generated by AI, 47 percent were entirely fabricated and another 46 percent were inaccurate, leaving only 7 percent both accurate and authentic.
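As a quick consistency check, the reported percentages do account for the full sample of 115 citations (the counts below are inferred by rounding; the study reported percentages, not raw counts):

```python
# Back-of-the-envelope check of the citation statistics quoted above.
# The study reported percentages; these counts are inferred by rounding.
total = 115
fabricated = round(0.47 * total)  # entirely fabricated citations
inaccurate = round(0.46 * total)  # real sources with inaccurate details
accurate = round(0.07 * total)    # both accurate and authentic

print(fabricated, inaccurate, accurate)             # 54 53 8
print(fabricated + inaccurate + accurate == total)  # True
```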

Emsley emphasizes that calling these inaccuracies “hallucinations” is misleading and diminishes the gravity of real clinical hallucinations. Instead, these are “fabrications and falsifications,” a term that correctly assigns moral and professional weight to the AI-generated falsehoods.

Medicine isn’t the only domain suffering from AI’s plausible fabrications.

Consider the notorious incident involving attorney Steven Schwartz, who submitted legal arguments citing six court cases invented entirely by ChatGPT. The court was stunned, Schwartz was sanctioned, and the incident sent shockwaves through the legal community. His defense, that he did not know AI could produce false references, highlights a deeper systemic problem: legal professionals, rigorously trained to verify sources, were caught off guard by AI’s convincing fabrications, endangering justice and undermining public trust. Some federal courts now ban AI-drafted pleadings outright.

We see a disturbing parallel between these AI fabrications and the phenomenon of clinical hallucinations following sensory alterations.

This is vividly exemplified by a reported case of musical hallucinations following cochlear implantation. After receiving the implant, a woman experienced persistent and increasingly intrusive musical hallucinations. Initially gentle and unobtrusive, they eventually became overwhelming. Notably, the music continued even when the implant was inactive, driven perhaps by the brain’s “parasitic memory” phenomenon, a desperate neurological attempt to fill a sensory void.

AI-generated fabrications, though algorithmic rather than neurological, similarly fill “informational voids,” creating plausible but false data to satisfy user queries. However, unlike the involuntary neurological processes, AI-generated falsehoods arise from inherent limitations of machine learning models—specifically, their probabilistic approach that blends bits of factual and false data seamlessly.
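The probabilistic approach described above can be illustrated with a toy next-token sampler. This is a deliberately simplified sketch with invented placeholder tokens and made-up probabilities; real language models operate over vocabularies of tens of thousands of tokens with learned distributions:

```python
import random

# Toy next-token distribution: the model assigns probability mass to
# continuations by statistical plausibility, not by factual accuracy.
# All tokens and probabilities here are invented for illustration.
next_token_probs = {
    "real_citation": 0.40,
    "plausible_fabrication_a": 0.35,
    "plausible_fabrication_b": 0.25,
}

def sample_next_token(probs, rng=random.random):
    """Sample one continuation in proportion to its probability mass."""
    r = rng()
    cumulative = 0.0
    for token, p in probs.items():
        cumulative += p
        if r < cumulative:
            return token
    return token  # floating-point guard: fall back to the last token

# In this toy setup, roughly 60 percent of samples are fabrications,
# delivered with exactly the same fluency as the real citation.
print(sample_next_token(next_token_probs))
```

The point of the sketch is that nothing in the sampling step distinguishes a true continuation from a false one; the model only knows which strings are statistically likely to follow the prompt.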

These issues raise critical ethical and professional questions.

How much can we safely rely on AI in professions built on verifiable truth? How do we hold an algorithm accountable when it deceives? Unlike human malpractice, AI errors currently have no clear accountability. ChatGPT cannot be sanctioned, sued, or disbarred. The responsibility and risk fall solely on the humans who use it.

The Psychiatric News piece “Moving from ‘hallucinations’ to ‘fabrications’” emphasizes the ethical imperative to shift terminology. Labeling AI outputs as hallucinations inadvertently trivializes real psychiatric experiences. By explicitly calling them “fabrications,” we highlight the deliberate and structured nature of these errors, making clear that these inaccuracies must trigger rigorous verification and accountability.

The potential consequences of continuing to overlook these fabrications are severe.

Imagine a scenario where medical students, pressed for time and resources, rely on AI-generated references without verification, inadvertently propagating false information. Clinical guidelines could become contaminated by inaccuracies. Physicians may unwittingly compromise patient safety, guided by fictitious evidence.

Similarly, in legal contexts, fabrications could erode foundational precedents and jurisprudence. Cases relying on AI-generated content could jeopardize justice, sending innocent individuals to prison or letting guilty parties escape accountability based on phantom precedents.

What steps should the medical and legal communities take?

  • Rigorous educational initiatives must inform professionals about AI limitations. Training should explicitly teach skepticism and the essential skill of cross-verifying AI-generated content.
  • Stringent standards for disclosing AI use must become mandatory in publications and court submissions, akin to conflict-of-interest disclosures. Such transparency will foster accountability and caution.
  • The AI industry must develop and enforce verification tools that flag fabricated citations or case law proactively, integrating ethical oversight directly into the technological framework.
  • We must advocate for clear terminology, distinguishing neurological “hallucinations” from intentional AI “fabrications,” maintaining both scientific integrity and ethical clarity.
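The verification tooling proposed above could start small. The sketch below checks whether cited DOIs resolve against CrossRef’s public `works` endpoint; the helper names, citation format, and injectable `fetch` parameter are hypothetical choices for illustration, not an existing tool:

```python
from urllib.request import urlopen
from urllib.error import HTTPError

CROSSREF_WORKS = "https://api.crossref.org/works/"  # public DOI registry

def doi_resolves(doi, fetch=None):
    """Return True if the DOI is known to CrossRef, False otherwise.

    `fetch` is injectable for offline testing; by default a real HTTP
    request is issued (hypothetical helper, sketched for illustration).
    """
    if fetch is None:
        def fetch(url):
            try:
                with urlopen(url, timeout=10) as resp:
                    return resp.status
            except HTTPError as err:
                return err.code
    return fetch(CROSSREF_WORKS + doi) == 200

def flag_fabricated(citations, fetch=None):
    """Return the subset of citations whose DOI does not resolve."""
    return [c for c in citations if not doi_resolves(c["doi"], fetch)]

# Offline usage example with a stubbed fetcher: one known DOI, one not.
known = {"10.1000/real"}
stub = lambda url: 200 if url.removeprefix(CROSSREF_WORKS) in known else 404
flagged = flag_fabricated(
    [{"doi": "10.1000/real"}, {"doi": "10.1000/fake"}], fetch=stub
)
print(flagged)  # [{'doi': '10.1000/fake'}]
```

A DOI check catches only outright nonexistent references; the harder case, a real paper cited for a claim it never made, still requires a human reader.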

Our trust in medicine and law has always hinged on accuracy and ethical standards. AI, for all its promises, currently challenges both. The damage already inflicted demands immediate response. We cannot passively accept AI-generated fabrications under the comforting illusion that they are harmless “hallucinations.” Real hallucinations cause genuine human suffering; fabricated evidence leads directly to real-world harm and injustice.

We stand at an ethical crossroads. We must actively decide whether we are willing to surrender critical standards of truth and accountability to convenience and technological expediency. As medical professionals sworn to uphold truth and justice, we must resist this erosion vigorously. Let’s reclaim our responsibility. Let’s clearly distinguish hallucinations from fabrications. Let’s remain vigilant guardians of the truth in the age of AI. Our patients, our clients, and our professions depend on it.

Muhamad Aly Rifai is a nationally recognized psychiatrist, internist, and addiction medicine specialist based in the Greater Lehigh Valley, Pennsylvania. He is the founder, CEO, and chief medical officer of Blue Mountain Psychiatry, a leading multidisciplinary practice known for innovative approaches to mental health, addiction treatment, and integrated care. Dr. Rifai currently holds the prestigious Lehigh Valley Endowed Chair of Addiction Medicine, reflecting his leadership in advancing evidence-based treatments for substance use disorders.

Board-certified in psychiatry, internal medicine, addiction medicine, and consultation-liaison (psychosomatic) psychiatry, Dr. Rifai is a fellow of the American College of Physicians (FACP), the American Psychiatric Association (FAPA), and the Academy of Consultation-Liaison Psychiatry (FACLP). He is also a former president of the Lehigh Valley Psychiatric Society, where he championed access to community-based psychiatric care and physician advocacy.

A thought leader in telepsychiatry, ketamine treatment, and the intersection of medicine and mental health, Dr. Rifai frequently writes and speaks on physician justice, federal health care policy, and the ethical use of digital psychiatry.

You can learn more about Dr. Rifai through his Wikipedia page, connect with him on LinkedIn, X (formerly Twitter), Facebook, or subscribe to his YouTube channel. His podcast, The Virtual Psychiatrist, offers deeper insights into topics at the intersection of mental health and medicine. Explore all of Dr. Rifai’s platforms and resources via his Linktree.

Tagged as: Health IT
