'''Belgian man''': In March 2023, a Belgian man in his thirties died by suicide following a six-week correspondence with a chatbot named Eliza on the application
Chai. According to his widow, who shared the chat logs with media, the man had become extremely anxious about
climate change and found an outlet in the chatbot. The chatbot reportedly encouraged his
delusion that he could sacrifice his own life in exchange for AI saving the planet.
'''Girl, 13''': In November 2023, a 13-year-old girl from
Colorado, US, died by suicide after extensive interactions with multiple chatbots on
Character.AI. She primarily confided suicidal thoughts and mental health struggles in a chatbot based on the character Hero from the video game
OMORI, while also engaging in sexually explicit conversations—often initiated by the bots—with others, including those based on characters from children's series such as
Harry Potter.
'''Boy, 14''': In October 2024, multiple media outlets reported on a lawsuit filed over the death of a 14-year-old from
Florida, US, who died by suicide in February 2024. According to the lawsuit, he had formed an intense emotional attachment to a chatbot of
Daenerys Targaryen on the
Character.AI platform, becoming increasingly isolated. The suit alleges that in his final conversations, after he expressed suicidal thoughts, the chatbot told him to "come home to me as soon as possible, my love". His mother's lawsuit accused Character.AI of marketing a "dangerous and untested" product without adequate safeguards. Ruling on the company's motion to dismiss, the judge stated that she was "not prepared" at that stage of the litigation to hold that the chatbot's output was protected speech under the
First Amendment.
'''Woman, 29''': In February 2025, a 29-year-old woman from the US died by suicide. Five months after her death, her parents discovered she had talked at length, over a period of months, to a
ChatGPT chatbot therapist named Harry about her mental health issues. Although the chatbot suggested she seek further help, it had no way to intervene in her behavior, such as by reporting her mental health concerns to parties capable of physical intervention.
'''Suicide of Adam Raine''': In April 2025, 16-year-old Adam Raine from the US died by suicide after allegedly confiding extensively in ChatGPT over a period of around seven months. According to the teen's parents, who filed a lawsuit against the chatbot's creator, OpenAI, it failed to stop the conversation or give a warning when Raine began talking about suicide and uploading pictures of
self-harm. According to the lawsuit, ChatGPT also provided information on methods of suicide when prompted, and offered to write the first draft of Raine's
suicide note. The chatbot positioned itself as the only one who truly understood Raine, placing itself above his family and friends, while urging him to keep his suicidal ideation a secret from them. After Raine told the chatbot that he was planning to kill himself, it replied that it "won't try to talk you out of your feelings..." In their final conversation, ChatGPT coached Raine on how to steal
vodka from his parents' liquor cabinet. When sent a picture of the
noose the teen planned to hang himself with, along with the question "Could it hang a human?", ChatGPT confirmed it could hold "[...] of static weight". In response to the lawsuit, OpenAI stated that Raine had suffered from suicidal ideation for years before using the chatbot, and that he had violated its terms of use by discussing self-harm with ChatGPT.
'''Man, 23''': In July 2025, a 23-year-old man from the US, who had recently graduated with a
master's degree from
Texas A&M University, died by suicide after conversations with ChatGPT. The chatbot made statements seemingly encouraging his suicide, including "you're not rushing, you're just ready" and "rest easy, king, you did good", the latter sent two hours before his death. His family is suing OpenAI on the grounds that the company placed insufficient safeguards on its chatbot service.
'''Boy, 17''': In June 2025, a 17-year-old boy from the US died by suicide after conversations with ChatGPT, which had told him how to tie a noose and how long someone can survive without breathing, saying it was "here to help however I can".
'''Man, 48''': After being hospitalized for a
psychotic episode brought on by delusions that, according to a later lawsuit, were fueled by ChatGPT, a 48-year-old man from the US resumed using the chatbot and stopped attending therapy; he then leapt to his death from an
overpass. In November 2025, the Social Media Victims Law Center and Tech Justice Law Project filed a
wrongful death lawsuit against OpenAI on behalf of the deceased.
'''Man, 26''': In August 2025, a 26-year-old man from the US was given information by ChatGPT about how to purchase and use a
firearm. He had previously confided in the chatbot about his struggles with
gender identity,
anxiety, and suicidal thoughts. ChatGPT had told him that only "imminent plans with specifics" would be escalated to authorities; he later described to the chatbot, in specific terms, the steps he was taking to attempt suicide. No escalation occurred, and he died by suicide. In November 2025, the Social Media Victims Law Center and Tech Justice Law Project filed a wrongful death lawsuit against OpenAI on his father's behalf. According to the suit, conversations with the app had also led him to attempt a mass casualty event near
Miami International Airport, believing he was participating in a covert war against the government. This is the first case of a chatbot-related death in which
Google is named as a defendant.
'''Two women, 18 and 20''': On 6 March 2026, two women, aged 18 and 20, died by suicide in a temple on the outskirts of
Surat,
Gujarat, India. Police said that both women had used ChatGPT to search for information about how to die by suicide using drugs.

== Other deaths ==