In medicine and law, professions that society relies upon for accuracy

Muhamad Aly Rifai, MD
Tech
May 20, 2025

Integrity and trust are foundational. But today, that trust is under assault—not from human error or negligence, but from the sophisticated yet disturbingly unreliable outputs of artificial intelligence (AI). What the media euphemistically calls “AI hallucinations” are not benign mistakes—they are dangerous fabrications, systematically undermining both clinical and legal standards.

As a psychiatrist, I confront the reality of hallucinations regularly. Patients vividly describe voices, visions, and sensations with profound distress. Hallucinations, in medical terms, represent profound and involuntary perceptual disturbances. In stark contrast, the inaccuracies spewed forth by large language models like ChatGPT are not involuntary misperceptions—they are systematic, plausible-sounding falsifications generated by probabilistic algorithms, devoid of ethical accountability.

A recent editorial by Robin Emsley in the journal Schizophrenia reveals the alarming truth: ChatGPT-generated references, presented with confidence and eloquence, are often wholly fabricated. Emsley described requesting literature references from ChatGPT to support his research into structural brain changes with antipsychotic treatment. Initially impressed, his enthusiasm quickly turned to dismay. Of the five citations provided, one pointed to a real but irrelevant paper, three simply did not exist, and the rest were grossly inaccurate. This was not an isolated case. Additional research highlighted even more disturbing statistics: of 115 medical citations generated by AI, 47 percent were entirely fabricated and another 46 percent were inaccurate, leaving only 7 percent both accurate and authentic.

Emsley emphasizes that calling these inaccuracies “hallucinations” is misleading and diminishes the gravity of real clinical hallucinations. Instead, these are “fabrications and falsifications,” a term that correctly assigns moral and professional weight to the AI-generated falsehoods.

Medicine isn’t the only domain suffering from AI’s plausible fabrications.

Consider the notorious incident involving attorney Steven Schwartz, who submitted legal arguments citing six court cases invented entirely by ChatGPT. The court was stunned, Schwartz was sanctioned, and the incident sent shockwaves through the legal community. Schwartz’s defense—that he didn’t know AI could produce false references—highlights a deeper systemic problem. Legal professionals, trained rigorously in verifying sources, were caught off-guard by the convincing fabrications of AI, endangering justice and undermining public trust. Some federal courts now outright ban AI-drafted pleadings.

We see a disturbing parallel between these AI fabrications and the phenomenon of clinical hallucinations following sensory alterations.

This is exemplified vividly by a reported case of musical hallucinations after cochlear implantation. The patient experienced persistent and increasingly intrusive musical hallucinations. Initially gentle and non-intrusive, these hallucinations eventually became overwhelming. Interestingly, the music continued even when the implant was inactive, driven perhaps by the brain’s “parasitic memory” phenomenon—a desperate neurological attempt to fill sensory voids.

AI-generated fabrications, though algorithmic rather than neurological, similarly fill “informational voids,” creating plausible but false data to satisfy user queries. However, unlike the involuntary neurological processes, AI-generated falsehoods arise from inherent limitations of machine learning models—specifically, their probabilistic approach that blends bits of factual and false data seamlessly.
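This blending can be sketched with a toy example (purely illustrative; real language models are vastly more complex, and the author names, journals, and years below are placeholders): when plausible fragments are sampled independently, every field of the output looks real even though the combination as a whole was never published.

```python
import random

# Toy illustration only -- not a real language model. Each field is sampled
# independently, so the resulting citation is locally plausible even though
# the combination as a whole is invented.
AUTHORS = ["Emsley R", "Smith J", "Chen L"]          # placeholder fragments
JOURNALS = ["Schizophrenia", "JAMA Psychiatry", "The Lancet"]
YEARS = [2018, 2021, 2023]

def fabricate_citation(rng: random.Random) -> str:
    """Stitch independently sampled fragments into a plausible-looking reference."""
    return f"{rng.choice(AUTHORS)}. {rng.choice(JOURNALS)}, {rng.choice(YEARS)}."

rng = random.Random(0)
print(fabricate_citation(rng))
```

Each fragment comes from real-looking material, which is precisely why the seamless result passes a casual read while failing any actual lookup.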

These issues raise critical ethical and professional questions.

How much can we safely rely on AI in professions built on verifiable truth? How do we hold an algorithm accountable when it deceives? Unlike human malpractice, AI errors currently have no clear accountability. ChatGPT cannot be sanctioned, sued, or disbarred. The responsibility and risk fall solely on the humans who use it.

The Psychiatric News piece “Moving from ‘hallucinations’ to ‘fabrications’” emphasizes the ethical imperative to shift terminology. Labeling AI outputs as hallucinations inadvertently trivializes real psychiatric experiences. By explicitly calling them “fabrications,” we highlight the deliberate and structured nature of these errors, making clear that these inaccuracies must trigger rigorous verification and accountability.

The potential consequences of continuing to overlook these fabrications are severe.


Imagine a scenario where medical students, pressed for time and resources, rely on AI-generated references without verification, inadvertently propagating false information. Clinical guidelines could become contaminated by inaccuracies. Physicians may unwittingly compromise patient safety, guided by fictitious evidence.

Similarly, in legal contexts, fabrications could erode foundational precedents and jurisprudence. Cases relying on AI-generated content could jeopardize justice, sending innocent individuals to prison or letting guilty parties escape accountability based on phantom precedents.

What steps should the medical and legal communities take?

  • Rigorous educational initiatives must inform professionals about AI limitations. Training should explicitly teach skepticism and the essential skill of cross-verifying AI-generated content.
  • Stringent standards for disclosing AI use must become mandatory in publications and court submissions, akin to conflict-of-interest disclosures. Such transparency will foster accountability and caution.
  • The AI industry must develop and enforce verification tools that flag fabricated citations or case law proactively, integrating ethical oversight directly into the technological framework.
  • We must advocate for clear terminology, distinguishing neurological “hallucinations” from intentional AI “fabrications,” maintaining both scientific integrity and ethical clarity.
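The verification tooling proposed above could, in minimal form, look like the following sketch. Everything here is a hypothetical placeholder: the in-memory index, the DOIs, and the `flag_fabricated` function are invented for illustration, and a production tool would instead query a live bibliographic registry such as Crossref or PubMed.

```python
# Minimal sketch of a citation checker. TRUSTED_INDEX, the DOIs, and
# flag_fabricated are hypothetical placeholders; a real tool would query a
# live bibliographic registry rather than an in-memory set.
TRUSTED_INDEX = {
    "10.1000/real.0001",  # placeholder DOI standing in for a genuine paper
    "10.1000/real.0002",
}

def flag_fabricated(citations):
    """Return the (title, doi) pairs whose DOI is absent from the trusted index."""
    return [(title, doi) for title, doi in citations if doi not in TRUSTED_INDEX]

submitted = [
    ("Genuine study", "10.1000/real.0001"),
    ("Plausible but invented reference", "10.1000/fake.9999"),
]
print(flag_fabricated(submitted))  # flags only the invented reference
```

The design point is that verification is a lookup against ground truth, not a judgment of how plausible a citation sounds—exactly the check that confident AI prose tempts readers to skip.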

Our trust in medicine and law has always hinged on accuracy and ethical standards. AI, for all its promises, currently challenges both. The damage already inflicted demands immediate response. We cannot passively accept AI-generated fabrications under the comforting illusion that they are harmless “hallucinations.” Real hallucinations cause genuine human suffering; fabricated evidence leads directly to real-world harm and injustice.

We stand at an ethical crossroads. We must actively decide whether we are willing to surrender critical standards of truth and accountability to convenience and technological expediency. As medical professionals sworn to uphold truth and justice, we must resist this erosion vigorously. Let’s reclaim our responsibility. Let’s clearly distinguish hallucinations from fabrications. Let’s remain vigilant guardians of the truth in the age of AI. Our patients, our clients, and our professions depend on it.

Muhamad Aly Rifai is a nationally recognized psychiatrist, internist, and addiction medicine specialist based in the Greater Lehigh Valley, Pennsylvania. He is the founder, CEO, and chief medical officer of Blue Mountain Psychiatry, a leading multidisciplinary practice known for innovative approaches to mental health, addiction treatment, and integrated care. Dr. Rifai currently holds the prestigious Lehigh Valley Endowed Chair of Addiction Medicine, reflecting his leadership in advancing evidence-based treatments for substance use disorders.

Board-certified in psychiatry, internal medicine, addiction medicine, and consultation-liaison (psychosomatic) psychiatry, Dr. Rifai is a fellow of the American College of Physicians (FACP), the American Psychiatric Association (FAPA), and the Academy of Consultation-Liaison Psychiatry (FACLP). He is also a former president of the Lehigh Valley Psychiatric Society, where he championed access to community-based psychiatric care and physician advocacy.

A thought leader in telepsychiatry, ketamine treatment, and the intersection of medicine and mental health, Dr. Rifai frequently writes and speaks on physician justice, federal health care policy, and the ethical use of digital psychiatry.

You can learn more about Dr. Rifai through his Wikipedia page, connect with him on LinkedIn, X (formerly Twitter), Facebook, or subscribe to his YouTube channel. His podcast, The Virtual Psychiatrist, offers deeper insights into topics at the intersection of mental health and medicine. Explore all of Dr. Rifai’s platforms and resources via his Linktree.





Founded in 2004 by Kevin Pho, MD, KevinMD.com is the web’s leading platform where physicians, advanced practitioners, nurses, medical students, and patients share their insight and tell their stories.
