Nightshade, a cutting-edge tool developed by researchers at the University of Chicago led by Ben Zhao, safeguards artists against AI companies' unauthorized use of their work. The tool lets artists make subtle changes to their digital art that confuse AI models trained on it, causing those models to produce unpredictable output. In effect, it "poisons" training data, disrupting models such as DALL-E, Midjourney and Stable Diffusion. The innovation addresses concerns over copyright infringement by AI companies and could empower artists whose work is used without permission by industry giants like OpenAI, Meta, Google and Stability AI.
Another tool, Glaze, complements Nightshade by allowing artists to mask their personal style from AI scraping. Glaze works in a similar way, altering images so that machine-learning models misread them. The research team plans to integrate Nightshade into Glaze, giving artists the option to use the data-poisoning feature, and is also releasing Nightshade as open source for wider use. Nightshade exploits a weakness in AI models: they are trained on data scraped from the internet. When artists apply these tools to mask their work before posting it online, the modified images end up in the datasets used to train AI models. Cleaning up the poisoned data is a difficult job for tech companies, because each corrupted sample has to be found and removed individually.
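The article does not spell out how the perturbations are computed, but the general idea can be sketched. Below is a minimal, hypothetical illustration in PyTorch of feature-space "cloaking": it nudges a source image's embedding toward a target concept while keeping the pixel changes small. The `cloak` function, the use of a ResNet-50 encoder as a stand-in feature extractor, the file paths, and all parameters are assumptions chosen for illustration; this is not the actual Nightshade or Glaze algorithm.

```python
# Hypothetical sketch of feature-space "cloaking". NOT the real Nightshade/Glaze
# method: real tools target the feature extractors used by text-to-image models.
import torch
import torch.nn.functional as F
from torchvision import models, transforms
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"

# Stand-in image encoder; its penultimate features serve as the embedding.
encoder = models.resnet50(weights=models.ResNet50_Weights.DEFAULT).to(device).eval()
encoder.fc = torch.nn.Identity()
for p in encoder.parameters():
    p.requires_grad_(False)

to_tensor = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
normalize = transforms.Normalize(mean=[0.485, 0.456, 0.406],
                                 std=[0.229, 0.224, 0.225])

def cloak(source_path, target_path, eps=4 / 255, steps=100, lr=0.01):
    """Perturb the source image (e.g. a dog photo) so its embedding moves toward
    the target concept (e.g. a cat photo), under an L-infinity budget `eps`."""
    src = to_tensor(Image.open(source_path).convert("RGB")).unsqueeze(0).to(device)
    tgt = to_tensor(Image.open(target_path).convert("RGB")).unsqueeze(0).to(device)
    with torch.no_grad():
        tgt_emb = encoder(normalize(tgt))

    delta = torch.zeros_like(src, requires_grad=True)  # the learned perturbation
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        emb = encoder(normalize((src + delta).clamp(0, 1)))
        # Pull the perturbed image's embedding toward the target concept.
        loss = 1 - F.cosine_similarity(emb, tgt_emb).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()
        with torch.no_grad():
            delta.clamp_(-eps, eps)  # keep the change visually negligible

    return (src + delta).clamp(0, 1).detach().squeeze(0).cpu()

# Hypothetical usage: the returned tensor looks like the original dog photo
# to a person, but its features resemble the cat photo.
# poisoned = cloak("dog_photo.jpg", "cat_photo.jpg")
```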
Testing Nightshade on recent Stable Diffusion models produced striking results. With only 50 poisoned dog images in the training data, the model's output turned bizarre, generating cartoonish creatures with extra limbs and distorted features. With 300 poisoned samples, the model began producing images of cats when prompted for dogs, showing how the tool can corrupt the associations a model learns between text and images.
While the tool could be misused, attackers would need thousands of tampered samples to significantly disrupt larger AI models, which underscores the need for robust defenses in the AI industry. Artists hope that Nightshade will push AI companies to respect their rights and offer fairer compensation. Some AI companies provide opt-out options, but artists often find them inadequate. Tools like Nightshade could give artists more control over their work and discourage unauthorized use by AI companies.