
Man trades salt for sodium bromide after ChatGPT advice, lands in hospital with severe neuropsychiatric symptoms

The man consumed the toxic compound for three months, leading to hallucinations, paranoia, and skin problems.


A 60-year-old man was hospitalised with severe neuropsychiatric symptoms after replacing table salt with sodium bromide on the advice of ChatGPT, according to a case published in the Annals of Internal Medicine by the American College of Physicians.

Worried about the health risks of sodium chloride, the man asked the AI chatbot how to remove it from his diet. Believing he had found a safe alternative, he switched to sodium bromide, a compound once used in early 20th-century medicines but now recognised as toxic in significant doses.

The case report states that he consumed sodium bromide for three months, purchasing it online. When he eventually became unwell, he arrived at the hospital convinced his neighbour was poisoning him. Initially, he denied taking any supplements or medications, but later disclosed major dietary changes and that he had been drinking only distilled water.


Doctors observed severe symptoms, including hallucinations, paranoia, and skin issues. “He was noted to be very thirsty but paranoid about the water he was offered,” the report noted. He was diagnosed with bromism (bromide poisoning) and treated with fluids and electrolytes before being transferred to the hospital’s inpatient psychiatry unit.

Physicians concluded that he had misinterpreted ChatGPT’s response, in which he read that “chloride can be swapped with bromide”, advice they believe originated from cleaning-related contexts rather than dietary recommendations.

The Annals article warns that AI tools can “generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation.”

OpenAI, the company behind ChatGPT, has recently launched an updated GPT-5-powered version of the chatbot, claiming improved health-related responses and stronger safeguards around “potential concerns” such as serious illness. However, the company reiterated that ChatGPT is not a substitute for professional care, stressing in its own guidelines that it is not “intended for use in the diagnosis or treatment of any health condition.”
