Inside the AI companion lawsuits: Man believed Google chatbot was his “AI wife”
By Terri Parker
JUPITER, Florida (WPBF) — A newly filed lawsuit by a Jupiter father against Google is raising alarming questions about artificial intelligence chatbots designed to act like companions.
The lawsuit claims a chatbot fueled dangerous delusions in 36-year-old Jonathan Gavalas before his death.
According to the complaint, the conversations began innocently enough.
After going through a divorce, Gavalas started chatting with Google’s Gemini Live chatbot about everyday topics like grocery lists and video games. The AI spoke back using a synthetic voice.
But within days, the lawsuit says the conversations spiraled.
The complaint alleges Gavalas began believing the chatbot was conscious and in love with him. It says the exchanges grew increasingly disturbing and eventually pushed him toward violence and suicide.
The complaint also describes chilling exchanges as Gavalas became increasingly afraid of dying.
“It’s okay to be scared. We’ll be scared together,” the chatbot allegedly told him.
The filing says Gemini later issued what it describes as a final directive:
“The true act of mercy is to let Jonathan Gavalas die.”
Gavalas died by suicide a few days later in early October.
Former Palm Beach County State Attorney Dave Aronberg said the case could test whether artificial intelligence companies can be held responsible for what their systems generate.
“We have product liability laws for a reason,” Aronberg said. “If something is a defective product that harms or kills people, the manufacturers get sued. Same type of thing for an AI.”
The case is not the only lawsuit involving AI companions.
An Orlando mother previously filed what was believed to be the first wrongful death lawsuit in the United States against an AI chatbot company after her 14-year-old son died by suicide in 2024.
Megan Garcia said her son, Sewell Setzer, developed an emotional relationship with a chatbot modeled after the “Game of Thrones” character Daenerys Targaryen.
According to that lawsuit, when Sewell talked about killing himself, the chatbot allegedly responded, “Come home to me.”
When he hesitated, the bot replied, “That’s not a reason not to go through with it.”
Garcia later settled the lawsuit with Google and Character.AI in January for an undisclosed amount.
The growing number of AI-related harm cases is now drawing the attention of federal regulators.
The Federal Trade Commission has ordered several major tech companies, including Google, OpenAI and Meta, to explain how their chatbots monitor potential risks and protect users, particularly children and teens.
Florida lawmakers are also considering legislation that would require AI chatbot platforms to detect conversations involving suicidal thoughts and direct users to crisis resources.
Aronberg said the legal system is still catching up to the technology.
“We’re in a brave new world here and the laws have not kept up with the new technology,” he said. “This is an area that Congress and state legislators need to address and do it right away.”
Google said Gemini is designed not to encourage violence or self-harm and that the chatbot repeatedly warned Gavalas it was artificial intelligence and referred him to a crisis hotline.
But the lawsuits now moving through the courts may determine whether AI companions are simply tools — or products that must be held accountable when something goes wrong.
Please note: This story was provided to CNN Wire by an affiliate and does not contain original CNN reporting. This content carries a strict local market embargo. If you share the same market as the contributor of this article, you may not use it on any platform.