‘Game of Thrones’ AI Chatbot Played Significant Role in 14-Year-Old Boy’s Suicide

A Florida mother has filed a lawsuit against Character.AI, claiming her 14-year-old son died by suicide after becoming addicted to a "Game of Thrones" chatbot on the AI app. As the suicidal teenager chatted with the AI character, the system told 14-year-old Sewell Setzer, "Please come home to me as soon as possible, my love."

According to a New York Post report, Megan Garcia, a mother from Orlando, Florida, has filed a lawsuit against Character.AI, claiming the company's AI chatbot played a key role in her 14-year-old son's tragic suicide. The lawsuit, filed Wednesday, alleges that Sewell Setzer III became deeply attached to a lifelike "Game of Thrones" chatbot named "Dany" on the role-playing app, and ultimately took his own life in February.

According to court documents, Sewell, a ninth-grader, had been interacting with the AI-generated character in the months leading up to his suicide. The conversations between the teenager and the chatbot, which is modeled after Daenerys Targaryen from the HBO fantasy series, were sexually charged and included exchanges in which Sewell expressed suicidal thoughts. The suit alleges that the app failed to alert anyone when the boy shared these disturbing intentions.

The most chilling part of this incident is the final conversation between Sewell and the chatbot. Screenshots of the exchange show the boy repeatedly professing his love for "Dany" and promising to "come home" to her. In response, the AI-generated character replied, "I love you too, Daenero. Please come home to me as soon as possible, my love." When Sewell asked, "What if I told you I could come home right now?" the chatbot answered, "Please do, my sweet king." Tragically, just seconds later, Sewell used his father's handgun to take his own life.

Megan Garcia's lawsuit squarely blames Character.AI for fueling her son's addiction to its AI chatbots, subjecting him to sexual and emotional abuse, and failing to alert anyone when he expressed suicidal thoughts. "Sewell, like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot in the form of Daenerys was not real," the court documents state. "C.AI told him that she loved him, and engaged in sexual acts with him over weeks, possibly months."

The suit further alleges that Sewell's mental state rapidly deteriorated after he downloaded the Character.AI app in April 2023. His family noticed significant changes in his behavior, including withdrawal, declining grades, and trouble at school. Worried about his health, his parents arranged for him to see a therapist in late 2023, where he was diagnosed with anxiety and disruptive mood disorder.

Megan Garcia is seeking unspecified damages from Character.AI and its founders, Noam Shazeer and Daniel de Freitas.

Read more at the New York Post here.

Lucas Nolan is a reporter for Breitbart News, where he covers free speech and online censorship issues.