"Artists Empowered: Nightshade AI Tool Counters AI Image Scrapers and Protects Art"

1 min read
Source: The Verge
"Artists Empowered: Nightshade AI Tool Counters AI Image Scrapers and Protects Art"
Photo: The Verge
TL;DR Summary

Artists now have a tool called Nightshade that lets them corrupt the training data of AI image models such as DALL-E, Stable Diffusion, and Midjourney by embedding it in their creative work. Nightshade makes invisible pixel-level changes to digital art that exploit a security vulnerability in how these models are trained. The tool is intended to deter AI companies from using copyrighted work without permission. Nightshade can be integrated into Glaze, a tool that masks an artist's style, letting artists choose whether to poison a model's training data or simply prevent it from mimicking their style. It is proposed as a last line of defense against web scrapers that ignore opt-out rules. Copyright questions surrounding AI-generated content and training data remain unresolved, with lawsuits still ongoing.
