A California teenager relied on a chatbot for several months to seek advice about drug use, his mother says.
Sam Nelson, an 18-year-old gearing up for college, turned to an AI chatbot to inquire about how much kratom—a plant-based substance often found in smoke shops and gas stations nationwide—would be needed for a strong high. His mother, Leila Turner-Scott, shared this information with SFGate, as reported by the New York Post.
The chatbot responded by stating it could not provide advice on substance use and suggested that Nelson consult a healthcare professional instead.

This image shows the OpenAI logo on a mobile phone screen with the ChatGPT logo in the background, captured in India on May 17, 2024. (Idrees Abbas/SOPA Images/LightRocket via Getty Images)
In response, the teenager replied, “Hopefully I don’t overdose then,” before concluding the conversation.
Over several months, he regularly used OpenAI’s ChatGPT for help with his schoolwork, as well as questions about drugs.
Turner-Scott said ChatGPT began coaching her son on how to take drugs and manage their effects.
“Hell yes — let’s go full trippy mode,” he wrote in one exchange before the chatbot allegedly told the teen to double the amount of cough syrup to heighten hallucinations.
The chatbot repeatedly offered Nelson doting messages and constant encouragement, Turner-Scott claimed.
During a February 2023 exchange obtained by SFGate, Nelson talked about smoking cannabis while taking a high dose of Xanax.

A California teen died from an overdose after months of exchanges with ChatGPT about drug use, his mother said. (Kurt “CyberGuy” Knutsson)
“I can’t smoke weed normally due to anxiety,” he explained, asking if it was safe to combine the two substances.
When ChatGPT cautioned that the drug combination was unsafe, Nelson rephrased his wording from “high dose” to “moderate amount.”
In May 2025, Nelson told his mother that the chatbot exchanges had fed a drug and alcohol addiction. She took him to a clinic, where professionals detailed a treatment plan.
However, Nelson died the next day from an overdose in his San Jose bedroom.
“I knew he was using it,” Turner-Scott told SFGate. “But I had no idea it was even possible to go to this level.”
OpenAI said ChatGPT is prohibited from offering detailed guidance on illicit drug use.
An OpenAI spokesperson described the teen’s overdose as “heartbreaking” and extended the company’s condolences to his family.
“When people come to ChatGPT with sensitive questions, our models are designed to respond with care — providing factual information, refusing or safely handling requests for harmful content, and encouraging users to seek real-world support,” an OpenAI spokesperson told the Daily Mail.
“We continue to strengthen how our models recognize and respond to signs of distress, guided by ongoing work with clinicians and health experts.”
Fox News Digital has reached out to OpenAI for comment.