
Man Hospitalized After Following ChatGPT's Diet Advice, Leading to Poisoning and Psychosis
A 60-year-old man in New York followed dietary advice from ChatGPT, replacing table salt (sodium chloride) with sodium bromide purchased online, which led to bromide poisoning, psychosis, and hospitalization. The case highlights the dangers of relying on AI chatbots for medical guidance and underscores the need for stronger safeguards in AI tools, particularly around health-related advice.
