"Artists Empowered: Nightshade AI Tool Counters AI Image Scrapers and Protects Art"

Artists now have a tool, Nightshade, that lets them "poison" the training data of AI image models such as DALL-E, Stable Diffusion, and Midjourney. Nightshade makes invisible changes to the pixels of a digital artwork before it is posted online; if the altered image is later scraped into a training set, it exploits a security vulnerability that arises from generative models being trained on vast amounts of scraped data, and can corrupt the resulting model. The tool targets AI companies that train on copyrighted work without permission. Nightshade is planned to be integrated into Glaze, an earlier tool from the same team that masks an artist's personal style, so artists can choose between hiding their style and actively corrupting a model's training. Its creators describe it as a last line of defense against web scrapers that ignore opt-out rules. Meanwhile, copyright questions around AI-generated content and training data remain unresolved, with lawsuits still ongoing.
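To make the idea of "invisible changes to pixels" concrete, here is a minimal Python sketch that adds a small, bounded random perturbation to an image. This is purely illustrative and is not Nightshade's actual algorithm: Nightshade computes optimized, targeted perturbations against a model's feature space, whereas the noise below is random, and the function and file names are hypothetical.

```python
# Illustrative sketch only: bounded random pixel perturbation, NOT
# Nightshade's real method (which optimizes targeted perturbations).
import numpy as np
from PIL import Image

def perturb_image(path_in: str, path_out: str, epsilon: int = 4) -> None:
    """Add a perturbation bounded by +/- epsilon per channel, small enough
    to be invisible to humans but still altering the raw pixel values an
    AI scraper would ingest as training data."""
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)
    noise = np.random.randint(-epsilon, epsilon + 1, size=img.shape).astype(np.int16)
    perturbed = np.clip(img + noise, 0, 255).astype(np.uint8)  # keep valid pixel range
    Image.fromarray(perturbed).save(path_out)

# Usage (hypothetical file names):
perturb_image("artwork.png", "artwork_shaded.png")
```

The key design point this sketch shows is the bound: by clipping each channel change to a few intensity levels, the altered copy looks identical to a human viewer even though its underlying data differs from the original.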
- Artists can use a data poisoning tool to confuse DALL-E and corrupt AI scraping (The Verge)
- New Data 'Poisoning' Tool Enables Artists To Fight Back Against Image Generating AI (ARTnews)
- University of Chicago researchers seek to “poison” AI art generators with Nightshade (Ars Technica)
- Nightshade AI: Defending Art From AI (Dataconomy)
- Sabotage tool takes on AI image scrapers (Tech Xplore)