Diet Change Leads to Medical Emergency
A man sought advice from ChatGPT before altering his diet. After three months of strictly following the changes, he ended up in the emergency room with serious psychiatric symptoms, including paranoia and hallucinations.
The 60-year-old was diagnosed with bromism, a condition caused by prolonged exposure to bromide or bromine. In this case, he had been consuming sodium bromide purchased online.
The case, reported on August 5 in Annals of Internal Medicine: Clinical Cases, raised some eyebrows. When Live Science inquired further, an OpenAI representative noted that the company's services should not be relied on for diagnosing or treating health conditions, stressing that users should seek professional medical guidance.
“A Personal Experiment”
Bromide compounds were common ingredients in medications such as sedatives and sleep aids during the 19th and 20th centuries. Over time, it became evident that chronic exposure, often through overuse, produced the symptoms of bromism.
The condition can trigger a range of neuropsychiatric problems, including psychosis and impaired coordination, largely because bromide accumulates in the body and interferes with neuronal function.
During the late 20th century, numerous bromide compounds were removed from over-the-counter medications, which sharply reduced bromism cases. Still, recent cases have been reported, often linked to bromide-containing dietary supplements bought online.
Before his health scare, the man had been reading about the harms of consuming too much table salt (sodium chloride) and was surprised to find little information on reducing chloride specifically. Drawing on his past studies in nutrition, he decided to run a personal experiment: eliminating chloride from his diet.
He turned to ChatGPT (likely version 3.5 or 4.0) for guidance on replacing chloride. Although the original exchange is not available, the man said the AI suggested bromide as a substitute, a swap that makes sense in contexts such as cleaning products but not in a person's diet.
To probe this, the man's doctors asked ChatGPT themselves about potential chloride substitutes and received the same bromide suggestion, with no health warning and no question about why the information was being sought, context a medical professional would likely have asked for.
Recovering from Bromism
After three months of taking sodium bromide, the man arrived at the emergency room convinced that his neighbor was trying to poison him. Initial tests revealed high levels of carbon dioxide in his blood and increased alkalinity.
Interestingly, his chloride levels appeared elevated even though his sodium was normal. The chloride reading turned out to be false, a phenomenon known as pseudohyperchloremia, in which large amounts of bromide interfere with the laboratory's chloride measurement. After consulting the medical literature and Poison Control, his doctors settled on a diagnosis of bromism.
While being monitored to restore his electrolyte balance, the patient reported intense thirst yet grew suspicious of the water he was offered. His paranoia escalated into hallucinations and an attempt to leave the hospital, which led to an involuntary psychiatric hold and treatment with an antipsychotic medication.
His vital signs improved with hydration and electrolyte repletion, and as his mental state stabilized, he disclosed that he had consulted ChatGPT. He also described other symptoms, including skin problems and excessive thirst, that further pointed to bromism.
He was gradually tapered off the antipsychotic over three weeks, discharged, and remained stable at a follow-up visit two weeks later.
The report's authors concluded with a caution: AI tools can dispense health information stripped of the context a medical expert would provide. They underscored the need for healthcare providers to stay aware of how patients use AI tools to seek health information.
A separate study likewise found that several language models, including ChatGPT, are prone to producing misleading clinical information when evaluated on medical tasks, raising serious concerns about their reliability in healthcare settings.
This article serves only as informational content and should not be taken as medical or dietary advice.