
The Dangers of AI Self-Training and Data Feeding
AI models trained solely on the output of other AI models will eventually produce gibberish, according to a group of British and Canadian scientists. The phenomenon, called "model collapse," occurs when models are trained on AI-generated content, causing errors and nonsense to compound with each generation. The scientists warn that this could make it impossible for later models to distinguish fact from fiction. The problem lies in how each model learns probabilities from its predecessor: rare events are underrepresented in generated data, so every successive model's sense of what is possible grows narrower. The scientists likened the effect to pollution, warning that the internet risks being filled with "blah."
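The narrowing effect can be sketched with a toy simulation (a hypothetical illustration under simplified assumptions, not the researchers' actual experiment): repeatedly fit a simple model to samples drawn from the previous generation's fit, and the fitted spread drifts toward zero, erasing the rare "tail" events the original data contained.

```python
import numpy as np

# Toy sketch of model collapse: each "generation" is trained only on
# samples produced by the previous generation's model. Here the model
# is just a Gaussian (mean, std); the mechanism -- rare events vanish
# and the distribution narrows -- is what the scientists describe.
rng = np.random.default_rng(0)

mu, sigma = 0.0, 1.0      # generation 0: the "real" data distribution
n_samples = 20            # each generation sees only finite data
stds = [sigma]

for generation in range(1000):
    data = rng.normal(mu, sigma, n_samples)  # sample from current model
    mu, sigma = data.mean(), data.std()      # refit model to its own output
    stds.append(sigma)

print(f"initial std: {stds[0]:.3f}, final std: {stds[-1]:.6f}")
```

With finite samples each refit slightly underestimates the spread on average, and the losses compound: after many generations the fitted standard deviation collapses toward zero, mirroring how successive models come to treat ever less of reality as possible.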