Google AI Mocked for Bizarre Search Answers Like Glue in Pizza

TL;DR Summary
Google says it is taking "swift action" to correct erroneous AI Overviews after several bizarre and harmful suggestions went viral. The company acknowledges the problem and is working on improvements, but some AI-generated responses have included dangerous advice such as drinking urine or jumping off a bridge.
- Google is playing whack-a-mole with disastrous AI Overviews gone viral (Android Authority)
- Google scrambles to manually remove weird AI answers in search (The Verge)
- Google AI said to put glue in pizza, so I put glue in pizza (Business Insider)
- Google Search's “udm=14” trick lets you kill AI search for good (Ars Technica; see the sketch below)
- Glue in Pizza? Eat Rocks? Google's AI Search Is Mocked for Bizarre Answers (CNET)
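
The Ars Technica item refers to Google's udm=14 URL parameter, which loads the plain "Web" results tab without an AI Overview. As a minimal sketch, assuming the parameter keeps its current, unofficial behavior (the helper name below is hypothetical), a search URL can be built like this:

```python
from urllib.parse import urlencode


def web_only_search_url(query: str) -> str:
    """Build a Google Search URL that opens the plain 'Web' results filter.

    udm=14 selects the Web tab, which does not show an AI Overview.
    This is an unofficial URL parameter and may change; treat this as
    a sketch, not a supported API.
    """
    params = {"q": query, "udm": "14"}
    return "https://www.google.com/search?" + urlencode(params)


if __name__ == "__main__":
    # Example: a query that triggered one of the mocked answers.
    print(web_only_search_url("cheese not sticking to pizza"))
    # -> https://www.google.com/search?q=cheese+not+sticking+to+pizza&udm=14
```

The same parameter can be set as the URL template for a custom search engine entry in most browsers, which appears to be what the headline means by killing AI search "for good".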
Want the full story? Read the original article on Android Authority.