With the constant evolution and advancement of ChatGPT as a tool, many have come to rely heavily on it, not only to streamline their tasks and daily lives, but also for deeply personal matters. We have heard of bizarre cases of marriages ending after the tool was used to trace evidence of infidelity, and even of people turning to the chatbot for therapy, something Sam Altman himself has warned against. Now, a disturbing case has surfaced showing the risk of relying on the AI assistant for health advice. A 60-year-old man in New York reportedly developed a rare and dangerous condition after following dietary advice provided by the model.
The hidden dangers of AI health advice: bromism and a salt substitute that went too far
Although warnings against over-relying on AI models surface from time to time, a recent one comes from a US medical journal, which cautioned against seeking medical advice from ChatGPT after a man fell seriously ill by following the chatbot's dietary suggestion, as NBC News points out. The case was published in the Annals of Internal Medicine, where a detailed report describes a 60-year-old man who developed bromism, also known as bromide poisoning, after acting on the AI assistant's advice.
The man developed the rare and dangerous condition after replacing common table salt (sodium chloride) with sodium bromide, a substance he obtained online. ChatGPT had suggested it when he asked about a potential salt substitute. He consumed sodium bromide daily for three months, believing it to be a healthier alternative. The results, however, were devastating: he developed paranoia, insomnia, psychosis, and other serious physical symptoms, at one point becoming convinced that one of his neighbors was poisoning him.
The man was eventually hospitalized, where doctors diagnosed bromism, a toxic reaction to excessive bromide exposure. Once he stopped consuming bromide and received the necessary treatment, his symptoms began to resolve. Although the condition is rarely seen today because bromide salts are no longer widely available, the case serves as a reminder of the risks posed by substances that can still be purchased online with little oversight.
The patient's story also underscores how AI must be used with caution, especially in health contexts. The researchers even tested whether ChatGPT would give such answers, and it did in fact suggest bromide as an alternative to chloride, without any toxicity warning, which makes the responses all the more concerning. Chatbots cannot match the judgment and accountability of health professionals, and they should not be treated as experts in the field.
Meanwhile, AI models should ship with stronger safety guardrails, especially around sensitive topics. We live in an era of AI tools, but curiosity should not override caution, and it should never take precedence over professional advice.
