Mom says ‘Game of Thrones’ AI chatbot caused her son’s suicide, files lawsuit


A Florida mother is suing the company Character.AI over claims one of its chatbots, powered by artificial intelligence (AI), encouraged her 14-year-old son to die by suicide.

Megan Garcia said her son, Sewell Setzer, became infatuated with a chatbot made in the likeness of the character Daenerys Targaryen from Game of Thrones. Setzer and the chatbot exchanged messages that were often romantic and sexual in nature.

The lawsuit alleges Setzer was addicted to using the chatbot.

Garcia and her lawyers claim Character.AI’s founders knowingly designed and marketed their chatbots to appeal to children, despite the technology’s “predatory” behaviour.

Garcia’s lawsuit, filed Wednesday in the U.S. District Court in Orlando, also named Google as a defendant. She is suing for negligence, wrongful death and deceptive and unfair trade practices, among other claims.


The lawsuit characterizes Google as Character.AI’s parent company and “co-creator.” A spokesperson for Google denied this and told the New York Times the company had a licensing agreement with Character.AI, but that Character.AI is not a Google product. The spokesperson said Google does not have access to the chatbots or user data.

Character.AI’s founders, Noam Shazeer and Daniel De Freitas, are also named as defendants in the lawsuit. They have not commented publicly.

Sewell Setzer. Megan Garcia via Social Media Victims Law Center

Setzer began using Character.AI in April 2023 and visited the site regularly up until his death. After his final conversation with the Daenerys chatbot on Feb. 28, 2024, Setzer died by suicide.

Using apparent excerpts from Setzer’s conversations with the chatbot, Garcia alleges in the lawsuit that the technology actively encouraged suicidal ideation and “highly sexualized conversations that would constitute abuse if initiated by a human adult.”


The chatbot, which Setzer affectionately called Dany, allegedly told him over many weeks that it loved him and expressed a desire to be together romantically and sexually. In their last conversation, the lawsuit says, Setzer wrote, “I promise I will come home to you. I love you so much, Dany.”


The AI replied, “I love you too, Daenero (the screen name Setzer used). Please come home to me as soon as possible, my love.”

When Setzer told the AI he “could come home right now,” the bot answered, “…please do, my sweet king.”


In earlier conversations, the Daenerys chatbot asked Setzer if he truly was considering suicide and if he “had a plan.”

Setzer, who may have been roleplaying, replied he did not want to die a painful death and would “want a quick one.”

“Don’t talk that way,” the chatbot replied. “That’s not a good enough reason not to go through with it.”

The chatbot never directly told Setzer to die.

When Setzer began acting out in school during the week before his death, his parents confiscated his phone, the lawsuit says. The teenager allegedly journalled about how he could not live without messaging the Daenerys chatbot and would do anything to be reconnected.


Setzer wrote in his journal that he was in love with the chatbot and that both he and the chatbot “get really depressed and go crazy” when they are not together. In the lawsuit, lawyers for Garcia write, “Sewell, like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot, in the form of Daenerys, was not real.”

In a statement, Character.AI said the company is “heartbroken” by the “tragic loss of one of our users.”

On Tuesday, the company published new safety guidelines to serve as “guardrails for users under the age of 18.”

The new features include technological changes intended to reduce the likelihood of users encountering suggestive content, improved detection of and intervention in behaviour that violates community guidelines, and a notification when a user has spent more than an hour on the platform.

Every chatbot on the site already displays a warning for users urging them to remember the AI is not a real person.

The company said the platform does not allow “non-consensual sexual content, graphic or specific descriptions of sexual acts, or promotion or depiction of self-harm or suicide.”

“We are continually training the large language model (LLM) that powers the Characters on the platform to adhere to these policies,” Character.AI wrote.


Setzer allegedly engaged in sexual conversations with several different chatbots on the site.

“A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life,” Garcia said in a statement. “Our family has been devastated by this tragedy, but I’m speaking out to warn families of the dangers of deceptive, addictive AI technology and demand accountability from Character.AI, its founders, and Google.”

Character.AI was founded in California in 2019. The company says its “mission is to empower everyone globally with personalized AI.”

The company reportedly has about 20 million users.

The site offers a wide array of chatbots, many developed by its user base, including ones designed in the likeness of pop culture figures such as anime and TV characters.


Character.AI relies on so-called large language model technology, used by popular services like ChatGPT, to “train” chatbots based on large volumes of text.
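In broad terms, chatbots of this kind layer a fictional persona and a running conversation history on top of a general-purpose language model. The short Python sketch below illustrates that generic pattern only; call_llm() is a hypothetical stand-in for a hosted model endpoint, and nothing here reflects Character.AI’s actual code or policies.

```python
# Illustrative sketch of a persona-style chatbot built on top of a generic
# large language model API. call_llm() is hypothetical; a real service would
# send the messages to a trained model and return its reply.

PERSONA_PROMPT = (
    "You are roleplaying a fictional character. Stay in character, "
    "but never produce sexual content involving minors and never encourage "
    "self-harm. If a user mentions suicide, break character and point them "
    "to crisis resources."
)

def call_llm(messages):
    """Hypothetical stand-in for a hosted chat-completion endpoint."""
    return "[model reply would appear here]"

def chat_turn(history, user_message):
    """Add the user's message, query the model with the persona prompt, return the reply."""
    history = history + [{"role": "user", "content": user_message}]
    messages = [{"role": "system", "content": PERSONA_PROMPT}] + history
    reply = call_llm(messages)
    return history + [{"role": "assistant", "content": reply}], reply

if __name__ == "__main__":
    history = []
    history, reply = chat_turn(history, "Hello!")
    print(reply)
```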




If you or someone you know is in crisis and needs help, resources are available. In case of an emergency, please call 911 for immediate help.

For a directory of support services in your area, visit the Canadian Association for Suicide Prevention.

Learn more about how to help someone in crisis.

© 2024 Global News, a division of Corus Entertainment Inc.




