The Urgent Need for a Global AI Shutdown

TL;DR Summary
AI researcher Eliezer Yudkowsky, who has warned about the dangers of AI for over 20 years, has called for an "indefinite and worldwide" ban on AI development, arguing that the most likely result of building superhumanly smart AI is that everyone on Earth will die. He declined to sign an open letter calling for a six-month moratorium on AI development, saying it understated the seriousness of the situation. Yudkowsky, often described as an "AI doomer," has long warned of the possibility of an "AI apocalypse."
- Researcher warning about dangers of AI says: 'shut it all down' Business Insider
- AI expert warns Elon Musk-signed letter doesn't go far enough, says 'literally everyone on Earth will die' Fox News
- Elon Musk calls for artificial intelligence pause ABC News
- As Musk & Others Call For AI Pause, The Perils Of Rushing Development Are Clear The Drum
- Opinion | A Six-Month AI Pause? No, Longer Is Needed The Wall Street Journal