Tuesday, August 12, 2025

Man poisons himself after taking ChatGPT's dietary advice



A 60-year-old man wound up in the hospital after seeking dietary advice from ChatGPT and accidentally poisoning himself.

According to a report published in the Annals of Internal Medicine, the man wanted to eliminate salt from his diet and asked ChatGPT for a replacement.

The artificial intelligence (AI) platform recommended sodium bromide, a chemical often used in pesticides, as a substitute. The man then purchased sodium bromide online and used it in place of table salt for three months.

The man eventually went to the hospital, fearing his neighbor was trying to poison him. There, doctors discovered he was suffering from bromide toxicity, which had caused his paranoia and hallucinations.

Bromide toxicity was more common in the early 20th century, when bromide salts were used in various over-the-counter medications. Cases declined sharply after the Food and Drug Administration (FDA) phased bromide out of these products between 1975 and 1989.

The case highlights the dangers of relying on ChatGPT for complex health decisions without sufficient understanding or proper AI literacy.
