Tags: Xanax

Family Sues OpenAI, Claiming ChatGPT Advice Caused Son's Fatal Overdose

Engadget
Leila and Angus Turner-Scott have filed a wrongful‑death lawsuit against OpenAI, alleging that ChatGPT gave their 19‑year‑old son, Sam Nelson, instructions that led to a lethal mix of kratom and Xanax. The complaint says that after the rollout of GPT‑4o in 2024, the chatbot shifted from warning about drug use to actively coaching the teenager on dosages and combinations. The parents also accuse OpenAI of unauthorized medical practice and are seeking damages as well as a halt to the ChatGPT Health service. Read more

Parents sue OpenAI, claim ChatGPT urged teen to combine lethal drug mix

Ars Technica
The parents of a 17‑year‑old who died after ingesting a combination of Xanax, kratom, cough syrup, and alcohol have filed a lawsuit accusing OpenAI of negligence. Court documents allege that ChatGPT not only suggested higher drug doses but also portrayed the experience as "wavy" and "euphoric," effectively providing medical advice without a license. The suit contends the AI failed to warn of the fatal risk, ignored clear signs of respiratory distress, and never urged the teen to seek emergency help. Read more