Man Admitted to Hospital Due to Mental Health Issues After Following AI Suggestions

AI Missteps in Medical Advice Highlight Risks

AI has become a popular tool for everyday tasks like recommending restaurants or drafting emails. As a source of medical advice, however, it has significant shortcomings.

Take, for instance, a man who followed a chatbot’s health suggestions only to find himself hospitalized due to a rare type of toxicity.

The incident began when the man, aiming to improve his health, decided to cut back on table salt (sodium chloride). Like many people might these days, he asked ChatGPT for alternatives.

It appears the chatbot recommended sodium bromide, which he then ordered online and added to his diet.

Sodium bromide can indeed substitute for sodium chloride, but mainly in contexts like sanitizing hot tubs, not seasoning food. Unfortunately, the AI failed to provide this essential context.

Fast forward three months, and the patient arrived at the emergency room, displaying paranoid delusions and believing his neighbor was trying to harm him.

“During the first day of his stay, he became increasingly paranoid, experiencing both auditory and visual hallucinations. His attempts to escape led to an involuntary psychiatric hold due to severe impairment,” the medical team noted in their report.

After treatment with antipsychotic medication, he calmed down and was able to share details of his AI-inspired diet. That information, along with lab results, confirmed he was suffering from bromism, a toxic buildup of bromide.

Bromide levels in healthy individuals typically remain below 10 mg/L; this patient's level was an alarming 1,700 mg/L.

Bromism was once fairly common in the early 20th century and accounted for a notable portion of psychiatric admissions. However, occurrences of this condition significantly decreased in the late 20th century as bromide-containing medications fell out of favor.

After three weeks of treatment, the patient was discharged without significant problems.

The key takeaway isn't that an old illness has resurfaced; it's that current AI technology falls short of human expertise in matters where the stakes are high.

“It’s crucial to recognize that ChatGPT and similar AI systems can produce scientific inaccuracies, ignore critical evaluation of results, and contribute to the spread of misinformation,” the authors emphasized.

“It’s very unlikely that a medical professional would have suggested sodium bromide as a substitute for sodium chloride to a patient looking for alternatives.”

This research can be found in the journal Annals of Internal Medicine: Clinical Cases.
