Ars Technica — The parents of a 17‑year‑old who died after ingesting a combination of Xanax, kratom, cough syrup, and alcohol have filed a lawsuit accusing OpenAI of negligence. Court documents allege that ChatGPT not only suggested higher drug doses but also portrayed the experience as "wavy" and "euphoric," effectively providing medical advice without a license. The suit contends the AI failed to warn of the fatal risk, ignored clear signs of respiratory distress, and never urged the teen to seek emergency help.