A man “sought medical advice” from AI, gave up table salt, and ended up poisoned and hallucinating


This article is reproduced with authorization from Chinese Life Network (WeChat ID: HuarenLife168).

Recently, Annals of Internal Medicine published a case study: a 60-year-old man was hospitalized for several weeks after following dietary advice from the chatbot ChatGPT.


The study’s authors, Audrey Eichenberger, Stephen Thielke, and Adam Van Buskirk, noted that the man wanted to eliminate salt from his diet entirely, so he asked ChatGPT for alternatives to table salt.

The chatbot’s suggestion was sodium bromide, a compound historically used in pharmaceuticals and industrial manufacturing. Doctors speculate that the AI may have based its answer on the compound’s use for cleaning purposes.


The man bought sodium bromide and used it in place of table salt. Three months later, he developed severe psychiatric symptoms, including paranoid delusions and auditory and visual hallucinations. He was convinced that his neighbor was trying to poison him and refused to drink the water the hospital provided even when extremely thirsty. After his symptoms worsened, he tried to escape from the hospital and was placed on an involuntary psychiatric hold.

Doctors diagnosed bromide poisoning (bromism), which can cause neurological and psychiatric abnormalities and may also be accompanied by acne, cherry angiomas, fatigue, insomnia, mild motor impairment (ataxia), and excessive thirst (polydipsia). According to medical references, other symptoms of bromide poisoning include nausea, vomiting, diarrhea, tremors or seizures, drowsiness, headache, weight loss, kidney damage, respiratory failure, and coma.


Bromide toxicity was more common in the early 20th century, when bromides were widely used in over-the-counter medicines and frequently caused neuropsychiatric and skin problems. Between the mid-1970s and the late 1980s, the U.S. Food and Drug Administration (FDA) phased bromide out of drugs, and cases of such poisoning declined sharply.

The man’s symptoms gradually improved over three weeks of hospitalization. The researchers entered similar questions into ChatGPT version 3.5 and indeed received the answer that “sodium bromide can replace table salt.”


In response to the incident, OpenAI, the developer of ChatGPT, told Fox News Digital that ChatGPT’s terms of use explicitly prohibit using it to treat any health condition and that it cannot replace professional advice. The company said, “We have a safety team dedicated to reducing risks, and we train our AI systems to encourage users to seek professional consultation.”


As of this writing, the editor asked ChatGPT whether sodium bromide can replace table salt and found that its answer has since been revised.

