Man Follows Diet Advice From ChatGPT, Ends Up With Psychosis

A case study out this month offers a cautionary tale ripe for our modern times. Doctors detail how a man developed psychosis brought on by poisoning after he followed AI-guided dietary advice.

Doctors at the University of Washington documented the real-life Black Mirror episode in the Annals of Internal Medicine: Clinical Cases. The man reportedly developed poisoning from the bromide he had ingested for three months on ChatGPT’s recommendation. Thankfully, his condition improved with treatment, and he successfully recovered.

Bromide compounds were once commonly used in the early 20th century to treat various health problems, from insomnia to anxiety. Eventually, though, people realized bromide could be toxic in high or chronic doses and, ironically, cause neuropsychiatric issues. By the 1980s, bromide had been removed from most drugs, and cases of bromide poisoning, or bromism, dropped along with it.

Still, the ingredient remains in some veterinary medications and other consumer products, including dietary supplements, and the occasional case of bromism does happen even today. This incident, however, might be the first ever bromide poisoning fueled by AI.

According to the report, the man visited a local emergency room and told staff that he was possibly being poisoned by his neighbor. Though some of his physicals were fine, the man grew agitated and paranoid, refusing to drink water given to him even though he was thirsty. He also experienced visual and auditory hallucinations and soon developed a full-blown psychotic episode. In the midst of his psychosis, he tried to escape, after which doctors placed him in an “involuntary psychiatric hold for grave disability.”

Doctors administered intravenous fluids and an antipsychotic, and he began to stabilize. They suspected early on that bromism was to blame for the man’s illness, and once he was well enough to speak coherently, they found out exactly how it ended up in his system.

The man told the doctors that he started taking sodium bromide intentionally three months earlier. He had read about the negative health effects of having too much table salt (sodium chloride) in your diet. When he looked into the literature, though, he only came across advice on how to reduce sodium intake.

“Inspired by his history of studying nutrition in college,” the doctors wrote, the man instead decided to try removing chloride from his diet. He consulted ChatGPT for help and was apparently told that chloride could be safely swapped with bromide. With the all-clear from the AI, he began consuming sodium bromide bought online.

Given the timeline of the case, the man had likely been using ChatGPT 3.5 or 4.0. The doctors didn’t have access to the man’s chat logs, so we’ll never know exactly how his fateful consultation unfolded. But when they asked ChatGPT 3.5 what chloride can be replaced with, it came back with a response that included bromide.

It’s possible, even likely, that the man’s AI was referring to examples of bromide replacement that had nothing to do with diet, such as for cleaning. The doctors’ ChatGPT notably did state in its reply that the context of this replacement mattered, they wrote. But the AI also never provided a warning about the dangers of consuming bromide, nor did it ask why the person was interested in this question in the first place.

As for the man himself, he did slowly recover from his ordeal. He was eventually taken off antipsychotic medication and discharged from the hospital three weeks after admission. And at a two-week follow-up, he remained in stable condition.

The doctors wrote that while tools like ChatGPT can “provide a bridge between scientists and the nonacademic population, AI also carries the risk for promulgating decontextualized information.” With some admirable restraint, they added that a human medical expert probably wouldn’t have recommended switching to bromide to someone worried about their table salt consumption.

Honestly, I’m not sure any living human today would give that advice. And that’s why having a decent friend to bounce our random ideas off should remain an essential part of life, no matter what the latest version of ChatGPT is.
