
Man Hospitalized with Hallucinations After Asking ChatGPT About Salt-Free Diet
A 60-year-old man was hospitalized with bromide poisoning and hallucinations after he replaced table salt with sodium bromide on advice he received from ChatGPT. The case highlights the risks of relying on AI for medical advice without professional guidance.