The Dangers of Training AI on AI-Generated Data

TL;DR Summary
AI models trained solely on the output of other AI models will eventually produce gibberish, according to a group of British and Canadian scientists. The phenomenon, called "model collapse," occurs when models are trained on AI-generated content, causing errors and nonsense to compound with each generation. The scientists warn that this will make it impossible for later models to distinguish between fact and fiction. The problem lies in how training on an earlier model's output distorts the next model's estimate of probability: rare events drop out of the training data, narrowing what the next model understands to be possible. The scientists likened the effect to pollution, saying that the internet will be filled with "blah."
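The mechanism is easy to see in a toy simulation. The sketch below is an illustration, not the researchers' code: it assumes a one-dimensional Gaussian as a stand-in for a model's learned output distribution, fits that distribution to a finite sample, and then trains each successive "generation" only on synthetic draws from the previous fit. Because a finite sample under-represents rare tail events, the estimated spread tends to shrink generation after generation, which is exactly the narrowing of "what the next AI understands to be possible" described above.

```python
import numpy as np

rng = np.random.default_rng(0)

SAMPLE_SIZE = 100   # each "model" sees this many training examples
GENERATIONS = 50

# Generation 0: "real" data from a wide distribution (mean 0, std 1).
data = rng.normal(loc=0.0, scale=1.0, size=SAMPLE_SIZE)

for gen in range(1, GENERATIONS + 1):
    # "Train" a model: fit a Gaussian by estimating the mean and
    # standard deviation of the previous generation's output.
    mu, sigma = data.mean(), data.std()

    # The next generation is trained purely on synthetic samples drawn
    # from that fit; no fresh human-generated data ever enters the loop.
    data = rng.normal(loc=mu, scale=sigma, size=SAMPLE_SIZE)

    if gen % 10 == 0:
        print(f"generation {gen:3d}: mean={mu:+.3f}, std={sigma:.3f}")
```

Run it and the reported standard deviation drifts toward zero: the fitted distribution forgets its own tails. Real language models are vastly more complex, but the study's warning is that the same feedback loop operates whenever model output replaces human data in the training mix.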
Topics: #technology #ai #artificial-intelligence #gibberish #internet-pollution #machine-learning #model-collapse
Related Coverage
- AIs trained on each other start to produce junk content: study (Business Insider)
- ChatGPT on Mechanical Turk: Teaching AI With AI Comes With Risks (Bloomberg)
- ChatGPT will make the web toxic for its successors (TechTalks)
- Analysis | The Ghost In The Machine Shouldn't Be AI (The Washington Post)
- AI models feeding on AI data may face death spiral (Tech Xplore)