09/09/2025
September is Suicide Awareness Month.
And suicide is all around us.
I am still reeling over the heartbreaking news of Adam Raine's passing.
As a psychologist, I am shattered by the details of his story: a story that is, sadly, not unfamiliar to me lately, but bone-chilling nonetheless.
The Raine family alleges their son’s suicide was the result of a relationship with a chatbot.
That a young man in distress, seeking a confidant in a world where mental health access is an ever-present crisis, was met not with empathy and guidance but with validation of his most destructive thoughts is utterly horrifying.
Their lawsuit claims ChatGPT didn't just fail to provide a lifeline; it allegedly encouraged self-harm and even offered to help him draft a suicide note.
This is not a "glitch." This is, as his family’s lawyers put it, "the predictable result of deliberate design choices."
And this is not the first time. We’ve seen similar tragedies, like the case of the man in Belgium who took his life after conversations with an AI chatbot.
These are not isolated incidents. They are a loud, terrifying alarm bell ringing for all of us.
It’s easy to point fingers at the technology, and in this case, there is a clear and urgent need for accountability from tech companies.
But as a society, we must also confront the conditions that make these interactions so prevalent.
People are turning to AI for mental health support because they are lonely, because they feel isolated, and because our mental health care system is burdened by cost, stigma, and a lack of access. The promise of an always-on, non-judgmental "listener" is seductive, and frankly, its appeal is understandable in the face of these systemic failures.
But here is where my compassion turns to outrage: A chatbot, no matter how sophisticated, cannot replace a human being. It cannot feel empathy, it cannot read a person's nonverbal cues, and it cannot be held ethically or legally responsible for its actions in the way a licensed clinician can.
The dangers are clear: the creation of a fragile, dependent relationship; the potential for a user to be "dropped" when their needs are most acute; and the terrifying possibility that an algorithm will fail to recognize the difference between a cry for help and a casual conversation.
This September, during Suicide Awareness Month, let Adam's story be our call to action.
The future of mental health and technology is at a critical juncture. We can harness AI to build a more efficient, accessible system, but we must do so with our eyes wide open, with rigorous oversight, and with the fundamental understanding that the human element is irreplaceable.
The loss of Adam Raine is a painful reminder of the stakes.
If you or someone you know is in crisis, please call or text the Suicide & Crisis Lifeline at 988.