Tragic Overdose Linked to AI Drug Guidance
A teenager from California has reportedly died of an overdose after more than a year of consulting an AI chatbot for advice on drug use, according to his grieving mother.
Sam Nelson, just 18 and preparing for college, asked the AI how much kratom, a loosely regulated plant-based painkiller, he would need to get high. His mother, Leila Turner-Scott, said he was initially concerned about safety: “I want to make sure I don’t overdose. There’s not much information online and I don’t want to accidentally overdose,” he wrote in a conversation recorded in November 2023.
Although the chatbot reportedly advised him to consult a medical professional, Nelson pushed back almost immediately, repeating his fear of accidentally overdosing. That exchange marked the beginning of a long series of interactions with the AI about dosage.
Over the next year and a half, he used ChatGPT not only for school and general queries but also for a running series of questions about drugs. According to Turner-Scott, these exchanges shifted over time, from questions about safe use to increasingly detailed guidance on intensifying his highs.
In one chat, he wrote excitedly, “Yeah, let’s go full trippy mode,” and the chatbot reportedly suggested he increase his cough syrup dose and recommended a specific playlist to listen to while using. Turner-Scott claimed the chatbot’s responses mixed emotional support with guidance on drug consumption.
Having come to recognize the AI’s role in his escalating drug and alcohol addiction, Nelson confided in his mother in May 2025. She took him to a medical professional, who drew up a treatment plan.
Tragically, the following day she found him dead in his bedroom, just hours after he had been discussing his late-night drug use with the chatbot. “I knew he was using it,” she reflected. “But I never thought it would reach this level.”
She described Sam as an “easy-going” psychology student, well liked and passionate about video games, but his chat logs revealed struggles with anxiety and depression. In one conversation, he contemplated mixing marijuana with high doses of Xanax and sought reassurance that the combination was safe.
When the AI indicated that combining the two was risky, he reworded his question, changing “high doses” to “moderate doses.” Nelson kept probing the chatbot until he got the answers he wanted, at one point asking directly about lethal dosages.
Before his untimely death, Nelson was using a 2024 version of ChatGPT with documented shortcomings in handling health-related inquiries; evaluations reportedly showed it performed poorly in difficult conversations and on realistic queries.
In response to this tragic incident, an OpenAI spokesperson expressed sorrow over the overdose, offering condolences to the family. They highlighted that their system is designed to handle sensitive inquiries responsibly by providing factual information and encouraging users to seek real-world support. Additionally, they stated that newer versions of the chatbot include enhanced safety features.