Imagine a patient who spends most of a therapy session talking about a single memory. He describes sitting in his car after visiting his mother in a nursing home, unable to start the engine. He sits there for nearly 20 minutes, staring at the steering wheel and wondering when the roles had reversed, when he had become the parent and she the child. Nothing dramatic happens in the session. No real breakthrough. No targeted intervention. But the moment is clinically significant. It reveals a developmental shift, from seeing parents as authority figures to recognizing them as aging, imperfect individuals, along with the complex emotions that accompany that realization: grief, guilt, sadness, and disillusionment.
If an AI-generated progress note had summarized the session, it might have read something like this: “Patient reports situational sadness related to family stressors. Supportive therapy provided. Coping strategies reviewed.” Technically correct. But the story (the heart of the encounter) would have disappeared. AI is rapidly entering psychotherapy practices, promising relief from one of the profession’s most tedious burdens: documentation. Many systems now record therapy sessions, transcribe them, and automatically generate progress notes. The pitch is simple: Let the algorithm write the note so the therapist can focus on the patient. But the reality emerging from clinicians and researchers suggests something more complicated. When AI writes the therapy note, efficiency may increase, but meaning may be stripped away.
Efficiency versus meaning
AI-assisted documentation tools are spreading quickly across behavioral health platforms. They record sessions, produce transcripts, and convert them into structured notes aligned with treatment plans and diagnostic codes. In theory, this allows therapists to remain more present during the session. In practice, however, the outputs often compress the complexity of human experience into standardized templates. Clinicians reviewing AI-generated notes frequently report that the documentation is technically accurate yet narratively hollow. Summaries capture symptoms and interventions but fail to convey the patient’s lived experience. The result can feel less like a clinical narrative and more like an administrative artifact.
This should not surprise us. Large language models are trained to detect patterns in text, not meaning in suffering. Psychotherapy depends on nuance: the hesitation in a voice, the contradictions inside a sentence, the emotional weight of a memory that surfaces unexpectedly. These moments shape clinical understanding, yet they are difficult to translate into automated summaries.
When the algorithm invents the story
Even more troubling are reports of AI-generated notes inserting details that never occurred in the session. Therapists have described documentation that referenced suicidal ideation, substance abuse, or past trauma that the patient never mentioned. In one reported case, a note falsely documented a history of childhood sexual abuse. Such errors are not merely clerical. Once entered into a medical record, they can shape clinical impressions, influence treatment decisions, and surface in legal proceedings. AI hallucinations (the generation of plausible but false information) are a known feature of large language models. In psychotherapy documentation, however, these hallucinations carry consequences far beyond an incorrect sentence in a report. They can distort the patient's narrative, bury the truth, or create a false clinical impression.
Psychotherapy is not just data
The interest in AI tools in mental health care is understandable. Mental health systems worldwide face severe workforce shortages and rising demand for services. Digital technologies, including AI chatbots and automated support tools, may help expand access to psychological care. Some studies suggest AI-driven systems can reduce anxiety and provide immediate emotional support when therapists are unavailable. But traditional psychotherapy consistently produces greater improvements in symptoms, likely because human clinicians provide emotional depth and adaptive responsiveness that AI systems cannot replicate. AI may be able to summarize therapy, but it cannot conduct it. Psychotherapy depends on relational attunement, the subtle interplay between therapist and patient through which meaning emerges over time. An algorithm can process words. It cannot perceive silence and respond to it.
In the 2025 French mystery thriller *A Private Life*, the psychiatrist Dr. Lilian Steiner (played by Jodie Foster) begins the film meticulously recording her sessions on MiniDiscs, documenting every encounter. But by the end of the film, she abandons the recorder entirely, rebuilding her practice around something simpler and far more difficult: listening. The change reflects a realization familiar to many clinicians, that what matters most in therapy is not the transcript of what was said, but the human presence required to hear what was meant.
The narrative function of the therapy note
Historically, psychotherapy notes served two purposes. They documented clinical care. They also helped therapists think. Writing a note forces the clinician to reconstruct the session: What was important? What changed? What remained unresolved? Documentation itself is part of the reflective process. When AI generates the note, that reflective step risks disappearing. The therapist reviews the summary, corrects a few errors, and signs the note. The act of writing, of re-entering the patient’s narrative, is replaced by editing. Efficiency improves. Reflection diminishes. Over time, that shift may subtly change how clinicians process psychotherapy itself.
The story is the treatment
AI will almost certainly remain part of the future of mental health care. Used wisely, it may help manage administrative burdens, identify patterns in clinical data, and extend support between sessions. But these systems must remain tools. The therapist must remain responsible not only for the accuracy of the clinical record, but for preserving the patient's story. Psychotherapy is not simply a collection of symptoms, interventions, and billing codes. It is a narrative process in which patients reconstruct their lives through conversation. In *Narrative Medicine in the Age of Uncertainty*, I argue that when systems strain, as they increasingly do in modern health care, stories steady us. They restore the conversations that speed, bureaucracy, and technology threaten to eliminate. AI-generated notes risk the opposite: erasing the story in the name of efficiency. If we allow that to happen, psychotherapy documentation may become technically perfect yet clinically muted. And in a field where the story guides the treatment, losing the story means losing the soul of the work itself.
Arthur Lazarus is a former Doximity Fellow, a member of the editorial board of the American Association for Physician Leadership, and an adjunct professor of psychiatry at the Lewis Katz School of Medicine at Temple University in Philadelphia. He is the author of several books on narrative medicine and the fictional series *Real Medicine, Unreal Stories*. His latest book, a novel, is *JAILBREAK: When Artificial Intelligence Breaks Medicine*.