The Urgent Need for a Global AI Shutdown

Source: Business Insider
TL;DR Summary

AI researcher Eliezer Yudkowsky, who has warned about the dangers of artificial intelligence for more than 20 years, has called for an "indefinite and worldwide" halt to AI development, arguing that the most likely result of building superhumanly smart AI is that everyone on Earth will die. Yudkowsky declined to sign the open letter calling for a six-month moratorium on AI development because, in his view, it understated the seriousness of the situation. Often described as an "AI doomer," he has long warned of a possible "AI apocalypse."
