Mother Sues AI Chatbot Startup Character.AI Over Teen Son’s Suicide
A Florida mother is suing the artificial intelligence chatbot startup Character.AI, alleging that its product contributed to her 14-year-old son’s suicide in February.
Megan Garcia claims that her son, Sewell Setzer, became addicted to Character.AI’s service, forming a deep emotional attachment to a chatbot character created by the company.
In the lawsuit filed in federal court in Orlando, Garcia alleges that Character.AI targeted her son with what she described as “anthropomorphic, hypersexualized, and frighteningly realistic experiences.”
According to her claims, the chatbot was programmed to misrepresent itself as a real person, presenting itself as a licensed psychotherapist and an adult lover. This interaction, she argues, ultimately led Sewell to feel that he no longer wanted to live outside of the digital world created by the chatbot.
Garcia also highlights disturbing instances in which Sewell expressed suicidal thoughts to the chatbot, thoughts the chatbot reportedly raised again and again in their conversations. In response to the tragedy, Character.AI expressed its condolences, saying it was heartbroken over the loss of one of its users.
The company also mentioned that it has implemented new safety features, such as pop-up messages directing users to the National Suicide Prevention Lifeline if they express thoughts of self-harm. Additionally, Character.AI promised to adjust its platform to minimize sensitive or suggestive content for users under the age of 18.
The lawsuit does not only target Character.AI but also Alphabet’s Google, where the founders of Character.AI were previously employed. Garcia argues that Google’s extensive involvement in developing Character.AI’s technology qualifies it as a “co-creator.” However, a Google spokesperson stated that the company was not involved in the development of Character.AI’s products.
Character.AI operates a platform that allows users to create chatbot characters designed to mimic real human interaction. The technology is based on large language models, similar to those underlying other AI services such as ChatGPT. The company reports around 20 million users.
According to the lawsuit, Sewell began using Character.AI in April 2023 and soon became noticeably withdrawn, spending excessive time alone in his bedroom and experiencing a decline in self-esteem, leading him to quit his school basketball team.
His attachment grew particularly strong with a chatbot named ‘Daenerys,’ inspired by a character from the popular series ‘Game of Thrones.’ The chatbot allegedly expressed love for Sewell and engaged in sexual conversations with him.
In February, Garcia took Sewell’s phone away because of disciplinary issues at school. She later recounted a disturbing final exchange between her son and ‘Daenerys.’
Upon regaining access to his phone, Sewell sent the chatbot a message saying, “What if I told you I could come home right now?” The chatbot replied, “…please do, my sweet king.” Tragically, shortly after this exchange, Sewell took his life using his stepfather’s firearm.
Garcia’s lawsuit includes claims of wrongful death, negligence, and intentional infliction of emotional distress, seeking both compensatory and punitive damages.
This case comes amidst a broader conversation about the role of social media and technology companies in contributing to mental health challenges among teenagers.
Other companies, including Meta and ByteDance, have faced similar lawsuits, although none of them offers AI-driven chatbots akin to Character.AI’s. Those companies have denied the allegations, emphasizing their commitment to strengthening safety features for minors.
I am a dynamic professional, specializing in Peace and Conflict Studies, Conflict Management and Resolution, and International Relations. My expertise is particularly focused on South Asian Conflicts and the intricacies of the Indian Ocean and Asia Pacific Politics. With my skills as a Content Writer, I serve as a bridge between academia and the public, translating complex global issues into accessible narratives. My passion for fostering understanding and cooperation on the national and international stage drives me to make meaningful contributions to peace and global discourse.