Into this psychological wasteland, we’re now introducing AI companions as the solution. Meta’s personas, Character.AI’s virtual friends, romantic chatbots: the market for artificial intimacy is exploding. The promise is seductive: connection without risk, companionship without effort, and validation on demand.
But here’s what these AI relationships actually do: They allow users to avoid the very challenges that build psychological resilience. Genuine relationships require vulnerability, the ability to tolerate conflict, and acceptance of imperfection in ourselves and others. They force us to develop emotional regulation, the skills to repair after arguments, and the capacity to be truly seen. AI companions require none of this. They’re perfectly accommodating, never challenging, and always available.
The comparison to junk food isn’t hyperbole; it’s neurologically accurate. Just as processed foods hijack our reward systems with supernormal stimuli, AI companions offer supernormal social interaction: no rejection, no misunderstanding, and no need to compromise. And like junk food, they’re temporarily satisfying but ultimately malnourishing; the social skills they displace atrophy from disuse.
Let’s think out loud about what happens when someone spends six months primarily “connecting” through AI: They lose practice reading facial expressions and vocal tone. They stop developing distress tolerance for social anxiety. They never learn that relationships survive disagreement, that people come back after conflict, and that being truly known, flaws and all, can lead to deeper intimacy rather than rejection. These aren’t abstract skills. They’re the psychological immune system for human connection.
The hard truth: what actually heals loneliness
The solution to the loneliness epidemic isn’t better chatbots. It’s a systematic psychological intervention addressing the damage we’ve inflicted. We have evidence for what works; we’re just not scaling it.
Rebuilding social connection through group-based identity
Groups 4 Health (G4H), a manualized intervention developed by social identity researchers, takes a different approach to loneliness than traditional therapy. Rather than treating it as an individual deficit, G4H helps participants build group-based social identifications through a structured five-module program. The intervention teaches participants to identify potential groups they could join, overcome barriers to participation, and develop multiple group memberships that provide social support and meaning. Randomized controlled trials show that G4H significantly improves mental health, well-being, and social connectedness, with effects sustained at six-month follow-up.
Why does this work? Because it addresses the core problem: loneliness isn’t just about a lack of contact; it’s about a lack of meaningful social identity and belonging. G4H systematically builds the psychological scaffolding that enables genuine connection, teaching participants to see themselves as part of communities rather than isolated individuals.
Early intervention: teaching connection before it’s broken
Perhaps most promising are interventions that prevent psychological damage before it calcifies. Roots of Empathy, an evidence-based Canadian program, takes an almost radical approach: It brings infants and their parents into elementary school classrooms throughout the school year. Trained instructors coach children to observe the baby’s development, label feelings, and practice perspective-taking. The results are striking: Studies show significant reductions in physical and indirect aggression, including bullying, and measurable increases in prosocial behavior.
This isn’t abstract social-emotional learning. It’s building the fundamental capacity for empathy and emotional attunement before children’s brains are rewired by comparison culture and digital validation loops. It’s creating a generation that can actually read human emotions and respond to them, skills that sound basic but are increasingly rare.
Creating structured opportunities for authentic vulnerability
For young adults already damaged by comparison culture and digital isolation, interventions need to create safe contexts for authentic connection. The Dinner Party, a nonprofit founded in 2010, does precisely this for people ages 21 to 45 who’ve experienced significant loss. The format is deceptively simple: monthly dinners with the same peer group, structured conversation prompts about grief, and no professional facilitation. Participants report that these gatherings, precisely because they’re built around shared vulnerability rather than curated performance, feel more authentic than most of their other social interactions. What makes this work isn’t the dinner itself. It’s the structure that creates permission for authenticity. When everyone at the table has experienced loss, when the explicit purpose is to be honest about pain, the masks come off. Participants practice being truly seen, tolerating others’ distress without fixing it, and discovering that relationships can deepen through vulnerability rather than perfect presentation.
This is what we’ve lost and must rebuild: contexts where authenticity is expected, where imperfection is the price of entry, and where connection happens through shared humanity rather than curated highlights.
Where AI could actually help (if we’re careful)
Therapeutic AI has a role, but not as a friend substitute. Limbic Care, an AI-enabled therapy support tool with Class IIa medical device certification in the U.K., demonstrates what constructive use looks like. Rather than simulating companionship, Limbic delivers personalized cognitive behavioral therapy materials between therapy sessions. It identifies cognitive distortions, teaches users to challenge them, and reinforces therapeutic techniques. Randomized controlled trials in NHS Talking Therapies services show that it triples patient engagement and improves treatment outcomes.
The critical difference: Limbic explicitly aims to strengthen users’ capacity for real-world connection. Success isn’t measured by time spent with the AI, but by improved functioning in actual relationships. It’s a training tool, not a replacement. It builds skills that transfer to human interaction rather than creating dependency on artificial intimacy.
This is the test for any therapeutic AI: Does it build capacity for genuine human connection, or does it allow users to avoid the discomfort that connection requires? Most AI companions fail this test spectacularly.
The choice before us
We stand at a fork. One path offers increasingly sophisticated AI companions, artificial intimacy for the connection-starved. It’s profitable, scalable, and treats symptoms while the disease progresses.
The other path is more challenging: massive investment in psychological rehabilitation, restructuring social institutions to prioritize authentic connection, scaling interventions like Groups 4 Health and Roots of Empathy, and creating thousands more structured vulnerability spaces like The Dinner Party. It’s expensive, unsexy, and demands that we confront our collective role in creating this crisis.
But only one path actually heals. Programs like G4H, Roots of Empathy, and The Dinner Party aren’t exotic experiments; they’re evidence-based interventions with proven results. We know what works. The question is whether we dare to fund it at scale rather than sell digital band-aids.
Ronke Lawal is the founder of Wolfe, a neuroadaptive AI platform engineering resilience at the synaptic level. From Bain & Company’s social impact and private equity practices to leading finance at tech startups, her three-year journey revealed a $20 billion blind spot in digital mental health: cultural incompetence at scale. Now building and coding Wolfe’s AI architecture herself, Ronke combines her business acumen with self-taught engineering skills to tackle what she calls “algorithmic malpractice” in mental health care. Her work focuses on computational neuroscience applications that predict crises seventy-two hours before symptoms emerge and reverse trauma through precision-timed interventions. Currently an MBA candidate at the University of Notre Dame’s Mendoza College of Business, Ronke writes on AI, neuroscience, and health care equity. Her insights on cultural intelligence in digital health have been featured in KevinMD and discussed on major health care platforms. Connect with her on LinkedIn. Her most recent publication is “The End of the Unmeasured Mind: How AI-Driven Outcome Tracking is Eradicating the Data Desert in Mental Healthcare.”