
"Nightshade: Empowering Artists to Safeguard Art from AI Exploitation"
Nightshade, a new tool developed by researchers at the University of Chicago, aims to help artists protect their work from AI image generators by adding imperceptible pixel-level changes to images, effectively poisoning the training data of any model that scrapes them. The research behind the tool is currently under peer review. Nightshade alters how machine-learning models interpret data scraped from online sources, so that models trained on poisoned images produce outputs that differ wildly from what the original images actually depict. By combining Nightshade with Glaze, an earlier tool from the same team, artists can protect their images while still sharing them online. The hope is that widespread use of these tools will pressure larger companies to properly compensate and credit the original artists.
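
To make the underlying idea more concrete, the sketch below illustrates one common form of imperceptible, feature-space poisoning: an image is nudged, within a small per-pixel bound, so that a feature extractor "sees" a different concept. This is not Nightshade's published algorithm, only a minimal illustration of the general technique; the choice of ResNet-18 as the feature extractor, the file names, and the epsilon/step values are all assumptions for demonstration.

```python
# Conceptual sketch of an imperceptible, feature-space poisoning perturbation.
# NOT Nightshade's actual method -- it only illustrates the general idea:
# shift an image's features toward a different concept while keeping the
# pixel change small enough to be hard to notice.
# Assumptions: PyTorch + torchvision installed; ResNet-18 stands in for
# whatever extractor a real attack would target; eps/steps are illustrative.

import torch
import torchvision.models as models
import torchvision.transforms.functional as TF
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"

# Frozen feature extractor (penultimate-layer features of ResNet-18).
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()
backbone.eval().to(device)

def features(x: torch.Tensor) -> torch.Tensor:
    """Embed a batch of [0,1] RGB images shaped (N, 3, H, W)."""
    mean = torch.tensor([0.485, 0.456, 0.406], device=x.device).view(1, 3, 1, 1)
    std = torch.tensor([0.229, 0.224, 0.225], device=x.device).view(1, 3, 1, 1)
    return backbone((x - mean) / std)

def poison(src_path: str, target_path: str, eps: float = 0.03,
           steps: int = 100, lr: float = 5e-3) -> Image.Image:
    """Return the source image perturbed (each pixel shift bounded by eps)
    so its features move toward those of an unrelated target image."""
    src = TF.to_tensor(Image.open(src_path).convert("RGB").resize((224, 224)))
    tgt = TF.to_tensor(Image.open(target_path).convert("RGB").resize((224, 224)))
    src, tgt = src.unsqueeze(0).to(device), tgt.unsqueeze(0).to(device)

    with torch.no_grad():
        tgt_feat = features(tgt)

    delta = torch.zeros_like(src, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        # Pull the perturbed image's embedding toward the target concept.
        loss = torch.nn.functional.mse_loss(
            features((src + delta).clamp(0, 1)), tgt_feat)
        loss.backward()
        opt.step()
        # Keep the change imperceptible: bound every pixel shift by eps.
        with torch.no_grad():
            delta.clamp_(-eps, eps)

    return TF.to_pil_image((src + delta).clamp(0, 1).squeeze(0).cpu())
```

Used on a hypothetical pair of files, `poison("my_art.png", "unrelated_concept.png")` would return a version of the artwork that looks essentially unchanged to a human viewer but embeds closer to the unrelated concept, which is the property that corrupts a scraper's training data.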

