Protecting Artists' Work: Safeguarding Against A.I. Scraping and NSFW Image Generation
Originally published by Smithsonian Magazine

Researchers at the University of Chicago have developed a tool called Nightshade that allows artists to embed invisible "poison" into their work, misleading AI models and protecting the artwork from unauthorized use. Nightshade alters an image's pixels in a way that is imperceptible to humans but disrupts an AI model's ability to label the image correctly. When models are trained on these poisoned images, their performance breaks down and they begin to misclassify what they see. Nightshade is a step toward giving artists more power in the face of AI, although it may not be a long-term solution.
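
To make the general idea concrete, here is a minimal sketch of the kind of technique the article describes: shifting an image's pixels within a tiny, near-invisible budget so that a model reads the image differently. This is not Nightshade's actual algorithm; it is a basic targeted gradient attack against a surrogate classifier, and the model choice (`resnet18`), the ImageNet target class index, the perturbation budget `epsilon`, and the step count are all illustrative assumptions.

```python
# Illustrative sketch only: imperceptible pixel perturbation against a
# surrogate classifier. Nightshade's real method is different and targets
# generative models' training data.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image


def poison_image(path, target_class=963, epsilon=4 / 255, steps=20, lr=1 / 255):
    """Return a copy of the image whose pixels differ by at most `epsilon`
    per channel, chosen so a surrogate model leans toward `target_class`
    (an ImageNet index, chosen arbitrarily here) instead of the image's
    true content."""
    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()
    normalize = T.Normalize(mean=[0.485, 0.456, 0.406],
                            std=[0.229, 0.224, 0.225])

    preprocess = T.Compose([T.Resize(256), T.CenterCrop(224), T.ToTensor()])
    x = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)

    delta = torch.zeros_like(x, requires_grad=True)  # the invisible "poison"
    target = torch.tensor([target_class])

    for _ in range(steps):
        logits = model(normalize((x + delta).clamp(0, 1)))
        loss = torch.nn.functional.cross_entropy(logits, target)
        loss.backward()
        with torch.no_grad():
            # Step toward the target label, then clip the perturbation so it
            # stays within the imperceptibility budget.
            delta -= lr * delta.grad.sign()
            delta.clamp_(-epsilon, epsilon)
            delta.grad.zero_()

    return (x + delta).clamp(0, 1).squeeze(0)
```

Because every pixel change is clipped to the small `epsilon` bound, the returned image looks essentially identical to the original to a human viewer, while the model's prediction is pulled toward the attacker-chosen label; data-poisoning tools apply the same principle at the scale of a model's training set.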