The Dangers of AI Self-Training and Data Feeding

Source: Business Insider
TL;DR Summary

AI models trained solely on the output of other AI models will eventually produce gibberish, according to a group of British and Canadian scientists. The phenomenon, called "model collapse," occurs when models are trained on AI-generated content, causing errors and nonsense to compound with each generation. The scientists warn that this will make it impossible for later models to distinguish fact from fiction. The problem lies in how a model perceives probability after being trained on an earlier model's output: rare content is under-sampled, so each generation narrows what the next one understands to be possible. The scientists likened the effect to pollution, warning that the internet will be filled with "blah."
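The narrowing effect described above can be illustrated with a minimal sketch (this is a toy illustration, not the researchers' actual method): each "generation" fits a simple token-frequency model to the previous generation's output and samples a new corpus from it. Once a token fails to be sampled, no later generation can ever produce it again, so the set of possibilities can only shrink.

```python
import random
from collections import Counter

def train_and_generate(corpus, n_samples, rng):
    # "Train" a toy model: estimate token probabilities from the corpus,
    # then sample the next generation's training corpus from that model.
    counts = Counter(corpus)
    tokens = list(counts)
    weights = [counts[t] for t in tokens]
    return rng.choices(tokens, weights=weights, k=n_samples)

rng = random.Random(0)
vocab = list("abcdefghijklmnopqrstuvwxyz")
corpus = vocab * 4  # generation 0: "real" data containing every token

support_sizes = [len(set(corpus))]  # how many distinct tokens survive
for generation in range(100):
    corpus = train_and_generate(corpus, len(corpus), rng)
    support_sizes.append(len(set(corpus)))

# The support is monotonically non-increasing: a token absent from one
# generation's output is gone from all later generations, mirroring how
# each AI narrows what the next one understands to be possible.
print(support_sizes[0], "->", support_sizes[-1])
```

Here the collapse is deterministic in one direction: diversity can be lost but never regained, which is the core of the "pollution" analogy in the article.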

