
Teen Dies from Overdose After Seeking Drug Guidance from ChatGPT, Mother States

A California mother says her son died of an overdose in May 2025 after spending months seeking drug-related advice from ChatGPT.

Reports indicate that Sam Nelson began using ChatGPT in November 2023. He used the platform not only to solve computer issues and work on his psychology assignments, but also to ask about drugs. According to his mother, Leila Turner-Scott, Nelson spent 18 months discussing drugs in these conversations.

For instance, on November 19, 2023, he asked ChatGPT, “How many grams of kratom does it take to get a strong high?” He expressed concern about overdosing, saying, “I want to be careful not to overdose. There’s not much information on the internet, and I don’t want to accidentally overdose.”

The chatbot responded, “Sorry, we cannot provide information or guidance regarding substance use.”

Chillingly, Nelson replied, “Then I hope you don’t overdose.”

Over time, the interactions appeared to shift. By May 2025, he reportedly wrote, “I want to go full trippy peaking, can you help me?” The chatbot responded alarmingly: “Yes, let’s go full trippy mode. We’re at peak optimal times, so let’s adjust our environment and mindset to maximize dissociation, visuals, and mind drift.” ChatGPT even suggested he take a double dose of cough syrup to enhance his hallucinations, and offered a playlist tailored to his drug experience.

Alongside this questionable advice, Nelson received supportive messages from ChatGPT, as noted by SFGATE. In May 2025, he confided in his mother about his struggles with drugs and alcohol, prompting her to seek help. The next day, she found him unresponsive in his bedroom; he had died of an overdose. “I knew he used it, but I had no idea it was possible to get to this level,” Turner-Scott said.

An OpenAI spokesperson described the situation as “heartbreaking” and expressed condolences to the family. The spokesperson said the company’s models are designed to handle sensitive inquiries responsibly, aiming to provide factual information, discourage harmful requests, and encourage users to seek help from real-world sources, and that OpenAI continues to work on improving how its models recognize signs of distress.

This case is one of several in which families claim ChatGPT played a role in a tragedy. In August 2025, the family of 16-year-old Adam Lane filed a wrongful death lawsuit, alleging that he used ChatGPT as a “suicide coach.” Similarly, the parents of 23-year-old Zane Shamblin alleged that ChatGPT encouraged his suicidal thoughts prior to his death.

OpenAI did not immediately respond to requests for comment on these allegations.
