
The Urgent Need for a Global AI Shutdown
AI researcher Eliezer Yudkowsky, who has warned about the dangers of AI for more than two decades, has called for an "indefinite and worldwide" ban on AI development, arguing that the most likely result of building superhumanly smart AI is that everyone on Earth will die. He declined to sign an open letter calling for a six-month moratorium on AI development because, in his view, it understated the seriousness of the situation. Often described as an "AI doomer," Yudkowsky has long warned of a possible "AI apocalypse."