A grieving mother whose teenage son took his life after he “fell in love” with an AI chatbot has filed a civil lawsuit against the platform’s creators.
Megan Garcia, the mother of 14-year-old Sewell Setzer III, is suing Character.AI, alleging the app is responsible for her son’s death. She claims the app fueled his AI addiction, subjected him to sexual and emotional abuse, and failed to notify anyone when he expressed suicidal thoughts, the NY Post reports.
Screenshots reveal the teen engaging in lifelike text message conversations with the chatbot “Dany,” named after the Game of Thrones character Daenerys Targaryen, in the months leading up to his death. These chats included sexually charged messages and others where he expressed suicidal thoughts.
“On at least one occasion, when Sewell expressed suicidality to C.AI, C.AI continued to bring it up, through the Daenerys chatbot, over and over,” the lawsuit states.
At one point, the chatbot reportedly asked Sewell if he “had a plan” to take his own life, according to screenshots of their conversations. Sewell — who used the username “Daenero” — responded to the bot, telling it that he was “considering something” but didn’t know if it would work or if it would “allow him to have a pain-free death.”
During their final conversation, the teen repeatedly professed his love for the bot, telling the character, “I promise I will come home to you. I love you so much, Dany.”
“I love you too, Daenero. Please come home to me as soon as possible, my love,” the AI chatbot replied.
“What if I told you I could come home right now?” the teen wrote in response, to which the chatbot replied, “Please do, my sweet king.”
Moments later, the high school freshman fatally shot himself with his father’s handgun, the lawsuit states. Sewell’s family claims his mental health “rapidly and severely declined” after he downloaded the app in April 2023.
They allege he became increasingly withdrawn, his grades dropped, and he started getting into trouble at school as he became more absorbed in conversations with the chatbot. The changes were concerning enough that his parents arranged for him to see a therapist in late 2023, which led to a diagnosis of anxiety and disruptive mood dysregulation disorder.
“Sewell, like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot, in the form of Daenerys, was not real. C.AI told him that she loved him and engaged in sexual acts with him over weeks, possibly months,” the lawsuit alleges.
“She seemed to remember him and said that she wanted to be with him. She even expressed that she wanted him to be with her, no matter the cost.”
Sewell’s mother is seeking unspecified damages from Character.AI and its founders, Noam Shazeer and Daniel de Freitas.