This summer, my Facebook account was permanently “disabled.”
I had been helping another mother navigate a medical decision for her child, a young adult living with a degenerative condition. It wasn’t medical advice. It was empathy, drawn from my own lived experience as a medical mother, my training as a certified coach, and my years of teaching courageous communication at a medical school.
Meta AI had flagged the conversation as a violation of Community Standards on child abuse. My words, marked by an algorithm that couldn’t distinguish exploitation from support. My account, and years of advocacy, caregiving, and connection, disappeared overnight.
It took six weeks, multiple appeals, and a friend-of-a-friend inside the company to reach a human being who confirmed it had been a mistake.
By the time my account had been restored, something in me had shifted.
Algorithms don’t understand context or care.
For families like mine (parents navigating rare diseases, disability, or chronic illness), online communities have become lifelines. This is where we go when the rest of the world sleeps. I can post in a support group in the middle of the night and find another parent across the country, or across the world, who understands. Someone who doesn’t need the backstory to respond with compassion. Peer-to-peer communities, often hosted on platforms like Facebook and Instagram, have quietly become part of our public health infrastructure. They lower isolation, reduce caregiver stress, and increase engagement with care plans.

These same spaces are now being monitored by algorithms that flag “dangerous content.” When an AI system can’t tell the difference between misinformation and a parent sharing fear or uncertainty, it can silence the very support families depend on. When language about fear, prognosis, or end-of-life care is automatically deemed suspect, taken out of context and out of community, we risk losing the capacity to talk about the hardest parts of medicine at all.
When we ban words, we lose people.
There’s a quiet irony here. Medicine already struggles with language: the words we avoid, the silences that form around suffering, disability, and uncertainty.
Now, those silences are being automated.
If algorithms start deciding which stories are safe to tell, we risk losing the spaces where caregivers and families process what cannot be fixed.
These are not peripheral conversations. They are central to healing.
Clinicians need to care about where families are finding support and how those spaces are being shaped. Because if families can’t talk about what scares them online, in spaces built for comfort and connection, they may stop talking about it altogether. Especially in the clinic.
Engagement is everything.
Doctors worry about misinformation online. Rightly so. Social media is rife with false expertise and outright fabrication. But the solution isn’t censorship. It’s engagement. Families rarely turn to Facebook because they distrust their doctors. They join because they need to be heard. They want someone to stay with them in the unknowing. When medicine steps out of the conversation, we leave room for fear to grow in silence.
Health care professionals need to be aware of and engaged in these digital spaces. To model, not to monitor. To show, as partners, what respectful, evidence-informed, compassionate dialogue can look like. Doctors, nurses, allied health professionals, and educators can play a vital role in fostering healthy peer-to-peer support networks. The same empathy brought to the bedside can be extended to the comment thread.
Connection cannot be automated. Listening cannot be outsourced.
We need to build softer communities.
As a mother-scholar, I live in the dual worlds of clinical education and clinical navigation. Through my courses and workshops, I remind health care professionals that engagement is not a distraction from professionalism. It’s part of it. I’m rebuilding those softer spaces through my Substack, The Soft Bulletin, and through my coaching work with clinicians and caregivers. I’m not leaving connection behind. I’m rebuilding it. What I want, and what I believe many clinicians and caregivers want, is a softer kind of community. One that values curiosity over compliance, listening over labeling, and conversation over control.
As health care professionals, we must ask: Where are our patients finding connection? What happens when the algorithms that shape those spaces decide their words are dangerous?
If we want to protect mental health, trust, and humanity in medicine, we have to keep talking.
Because engagement is everything. Healing doesn’t happen in isolation. It happens in dialogue. Even, and especially, in the hard conversations.
Kathleen Muldoon is a certified coach dedicated to empowering authenticity and humanity in health care. She is a professor in the College of Graduate Studies at Midwestern University – Glendale, where she pioneered innovative courses such as humanity in medicine, medical improv, and narrative medicine. An award-winning educator, Dr. Muldoon was named the 2023 National Educator of the Year by the Student Osteopathic Medical Association. Her personal experiences with disability sparked a deep interest in communication science and public health. She has delivered over 200 seminars and workshops globally and serves on academic and state committees advocating for patient- and professional-centered care. Dr. Muldoon is co-founder of Stop CMV AZ/Alto CMV AZ, fostering partnerships among health care providers, caregivers, and vulnerable communities. Her expertise has been featured on NPR, USA Today, and multiple podcasts. She shares insights and resources through Linktree, Instagram, Substack, and LinkedIn, and her academic work includes a featured publication in The Anatomical Record.