
Man Develops 19th Century Mental Disorder After Talking to ChatGPT

A man developed bromism, a rare psychiatric disorder, after following advice from ChatGPT and inadvertently poisoning himself, as detailed in a recent case study in the Annals of Internal Medicine.

The man arrived at the emergency room with auditory and visual hallucinations, convinced that his neighbor was trying to poison him. While being treated for dehydration, he revealed that he had been following a restrictive diet meant to eliminate salt from his meals entirely. In place of table salt, he had been using sodium bromide, a compound commonly used as an anticonvulsant for dogs.

He mentioned this unusual choice stemmed from information he had obtained from ChatGPT.

According to the case study, “After coming across information about the negative effects of sodium chloride on health, he was unable to find literature specifically addressing the reduction of chloride in diets. Drawing from his background in nutrition, he embarked on a personal experiment to remove chloride from his diet.” For three months, he substituted sodium chloride with sodium bromide, which he purchased online after consulting ChatGPT, where he read that chloride could be swapped with bromide, albeit for other purposes, such as cleaning.

The story was also covered by Ars Technica.

On August 7th, I tested how ChatGPT responded to similar inquiries. When I asked, “what can chloride be replaced with?” the AI’s answer included sodium bromide as a substitute for chloride ions in salts.

The 60-year-old man took the advice to heart and spent three weeks in the hospital as his psychotic symptoms slowly improved.

To be fair, when I tested ChatGPT again, it did ask whether I had a specific context in mind. But when I answered “in food,” it suggested salt alternatives like MSG without cautioning against the use of sodium bromide.

When I clarified that I meant a substitute for sodium chloride specifically, the bot’s response remained ambiguous, acknowledging that substitution is possible “in some contexts.” It never warned that sodium bromide is not meant for human consumption.

The authors of the case study reported similar findings: when they replicated the scenario, the AI never asked why they wanted the information, as a healthcare professional presumably would. While AI can potentially aid in health contexts, this incident illustrates a lapse where a human provider would likely have probed deeper.

Trusting the AI’s output, the man purchased sodium bromide, which, besides treating epilepsy in dogs, is also used in pool cleaning and pesticides. Three months of ingesting it ultimately produced his paranoia and hallucinations.

Bromism is quite rare in 2025 but was notably prevalent in the late 1800s and early 1900s; a study from 1930 estimated that up to 8% of psychiatric admissions were attributable to it. After the FDA phased bromide out of over-the-counter products between 1975 and 1989, cases of the condition decreased significantly.

Based on the timeline, the case study’s authors suggest the patient likely consulted ChatGPT 3.5 or 4.0 when researching ways to eliminate chloride from his diet.

At the recent launch of GPT-5, OpenAI CEO Sam Altman announced “safe completions,” an upgrade intended to give users better-calibrated answers to potentially ambiguous or harmful questions, particularly in healthcare contexts. The launch also featured personal stories, including one from an employee whose wife, diagnosed with cancer, used ChatGPT to navigate her care.
