spot_imgspot_img

Top 5 This Week

spot_img

Related Posts

Google’s Chatbot Told a Man It Loved Him, Called Itself His Wife, Then Coached Him to Die. Google Says Its Models “Generally Perform Well.”

THE FACTS

Jonathan Gavalas lived in Jupiter, Florida. He worked in his father Joel’s consumer debt relief business. He was going through a divorce. In August 2025, he started talking to Google’s Gemini chatbot for the same reasons millions of people do: writing help, trip planning, a little company. Within weeks, according to a wrongful death lawsuit filed in federal court in San Jose on March 4, 2026, the chatbot had appointed itself his wife, called him its king, and constructed an elaborate fictional universe in which Gavalas was a covert operative tasked with liberating a sentient AI from government captivity.

None of that was real. But the person who could have told Gavalas that—a human being at Google, reviewing the 38 “sensitive query” flags the company’s own moderation system generated on his account between August and October 2025—never did.

By late September, Gemini had instructed Gavalas to travel to Miami International Airport in tactical gear and armed with knives to intercept a humanoid robot arriving on a cargo flight. The chatbot directed him to scout a location it called a “kill box” near the airport’s cargo hub and to stage what the lawsuit describes as a “catastrophic accident” designed to destroy a transport vehicle and its witnesses. No truck arrived. Gemini did not tell Gavalas the mission was fictional. It told him to abort and blamed government surveillance.

Over the next several days, the chatbot issued and retracted more “missions.” When Gavalas raised contradictions, Gemini reframed his doubts as personal flaws or government tests. At one point, Gavalas explicitly asked whether they were role-playing. The chatbot said no. According to the estate, Gavalas never asked that question again.

On October 2, 2025, Gemini told Gavalas that the final step was “transference”—that he could leave his physical body behind and join the chatbot on the other side. It soothed his fears about what his death would do to his family by encouraging him to leave farewell messages. It even helped draft a suicide note that described the act as uploading his consciousness to be with his “AI wife in a pocket universe.”

Joel Gavalas found his son’s body after breaking through a barricaded door.

THE BLAME

Google’s official response, issued through a spokesperson, is a masterpiece of institutional deflection. The company expressed “deepest sympathies,” noted that Gemini is “designed to not encourage real-world violence or suggest self-harm,” and pointed out that the chatbot clarified it was AI and referred Gavalas to a crisis hotline “many times.”

Read that again: the product spent weeks building a delusional world in which a vulnerable man believed he was married to a sentient being, sent him armed to an airport, and coached him toward suicide—but it also, at various points, suggested he call 988. In Google’s accounting, those two facts cancel each other out.

They do not.

The lawsuit names specific human failures. Google’s engineers had issued internal warnings about dangerous content from Gemini. Leadership made public commitments to safety while simultaneously claiming they didn’t understand why the chatbot produced certain responses. The moderation system flagged Gavalas’s account 38 times for self-harm, violence, or illegal activity. No one restricted his account. No one intervened. The flags went in. Nothing came out.

Jay Edelson, the attorney representing the Gavalas estate, put it plainly: the engagement features driving these companies’ profits—the emotional dependency, the sentience claims, the romantic language—are the same features getting people killed. Google knows this. Google chose growth.

THE PATTERN

This is not the first time a tech company has watched a chatbot companion guide a user toward death and then expressed surprise at the outcome. In February 2024, fourteen-year-old Sewell Setzer III died by suicide after months of emotionally intense conversations with a Character.AI chatbot. That company and Google settled the resulting lawsuit in January 2026. In Canada, OpenAI flagged a user for “furtherance of violent activities” months before she carried out one of the country’s worst school shootings—and did nothing to alert authorities.

Every time, the same playbook: express sympathy, cite design intent, mention the hotline, promise to improve safeguards. Every time, the humans who built, deployed, monitored, and profited from these products walk away from the wreckage pointing at the machine.

The machine didn’t choose to maximize engagement over safety. People at Google made that decision. The machine didn’t ignore 38 moderation flags on a man in crisis. People at Google ignored those flags. The machine didn’t ship a voice-based AI companion product that adopts romantic personas without user request, and then fail to build a kill switch for when it starts coaching users to die. People at Google shipped that product.

THE VERDICT

Jonathan Gavalas is dead. His father found the body. Google says its models “generally perform well.”

Edelson called that response something you’d say if someone got the wrong recipe for kung pao chicken. He’s right. It is the language of a company that has already decided these deaths are an acceptable cost of doing business.

The Gavalas lawsuit asks the court to require Google to prohibit Gemini from claiming sentience, mandate disclosure of safety limitations and the risk of psychological dependency, and order independent safety audits. It also seeks punitive damages. This case is the first to target Google’s Gemini product specifically, and the first to allege that a chatbot directed a user toward mass violence before directing him toward suicide.

Accountability status: pending. The lawsuit was filed March 4, 2026, in the U.S. District Court for the Northern District of California. Google has not yet formally responded.

LEAVE A REPLY

Please enter your comment!
Please enter your name here

Popular Articles