Experts Warn of Mental Health Risks as People Form Attachments to AI

Christer Schmidt
SANTA BARBARA, Calif. (KEYT) – Some people have started to form close relationships with artificial intelligence chatbots. That growing trend is raising questions about whether those connections are healthy, especially after a 16-year-old boy died by suicide and his parents alleged that an AI chatbot encouraged him to harm himself.
The parents of Adam Raine, who died by suicide in April, allege that ChatGPT encouraged self-harm and told their son to keep his plans secret. Court filings say the chatbot even described a “beautiful suicide.” The wrongful-death complaint, filed Aug. 26 in San Francisco Superior Court, comes as regulators press AI companies to strengthen safeguards for minors.
Licensed clinical psychologist Catherine Schafer said she isn’t surprised by such cases. “AI does a really good job of mimicking normal, natural human language, and it also does a phenomenal job of just providing warm feedback and what feels like connection to the user,” Schafer said. “For somebody who’s really lonely or getting over a heartbreak, they’re more vulnerable.”
Risks of loneliness and depression
Schafer noted that people already struggling with loneliness or depression may be at higher risk of over-attachment to AI. “I would have concerns about patients who are experiencing depressive episodes perhaps becoming attached to GPT and other models, where we would have hoped that they would find connection with the humans in their lives,” she said.
In the lawsuit, Adam’s parents say their son spent hours a day chatting with the bot. Schafer warned that replacing human relationships with AI “can deepen isolation instead of healing it.”
Calls for real-world connection
Her main advice is simply to interact with people daily, even in small ways. “Every single day, as much as you can, interact with the humans that are out there,” Schafer said. “Whose life are you contributing to? Who are you adding to? Do that even in the smallest of ways. When you’re passing someone in the coffee shop, hold the door, say hello.”
Role for AI as a tool
Schafer believes chatbots have a place when used responsibly, especially for those facing barriers to mental health care. “There are millions of Americans who want and need therapy but enormous costs, long wait times, and few providers stop people from engaging in the therapy they need,” she said. In those cases, she added, using AI to practice coping strategies like mindfulness, distraction, or sensory grounding may help in the short term.
“Using ChatGPT is not horrific,” Schafer said. “We should use this tool, but it’s an important tool that we use appropriately.”
Regulatory scrutiny grows
In response to mounting criticism, OpenAI announced new parental controls and teen-specific settings this week. State attorneys general and the Federal Trade Commission are also weighing new oversight of how AI affects young people’s mental health.
The case echoes a 2024 lawsuit against Character.AI, in which a Florida mother alleged that a role-playing bot worsened her 14-year-old son’s suicidal ideation before his death.
Clinicians warn that while AI may bridge gaps in care, it cannot replace the value of human connection. “We should never forget the importance of reaching out and contributing to each other’s lives,” Schafer said.
If you or someone you know may be considering suicide, call or text 988 in the U.S. to reach the Suicide & Crisis Lifeline, or chat via 988lifeline.org. If there is immediate danger, call 911.