New tools are being developed to help artists and photographers protect their work from being used to train artificial intelligence (AI) models. However, researchers have found ways to get around these protections.
AI models that create images need to be trained on large amounts of visual data. Some people are concerned that these datasets often include copyrighted material used without permission. Artists worry that AI models will learn their style, copy their work, and potentially take their jobs.
In 2023, tools such as Glaze and Nightshade were released to protect online images by "poisoning" them: adding tiny perturbations that are barely noticeable to humans but confuse the AI models trained on them. Shawn Shan, who helped develop both tools, was named MIT Technology Review's Innovator of the Year in 2023 for this work.
However, a new tool called LightShed claims it can bypass these protections, making poisoned images usable for AI training again. Its developers say they are not out to steal artists' work; they want to show that these protections are not foolproof, and that artists should know companies may already have ways to strip the "poison" from their images.
Glaze works by causing AI models to misinterpret the style of an image, for example, seeing a photorealistic painting as a cartoon. Nightshade instead causes the model to misidentify the content of the image, such as seeing a cat as a dog. Glaze is defensive, built to protect an artist's individual style; Nightshade is offensive, intended to degrade models that train on scraped artwork without consent.
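Conceptually, both tools behave like adversarial examples: they search for a tiny pixel-level perturbation that stays below a perceptual budget while pulling the image's internal representation somewhere misleading. Below is a minimal sketch of that idea in PyTorch, using a pretrained VGG16 slice as a stand-in for a generative model's encoder; the file names, the budget, and the choice of encoder are illustrative assumptions, not Glaze's actual implementation.

```python
import torch
import torchvision.models as models
import torchvision.transforms.functional as TF
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"

# Stand-in feature extractor; real tools target a generative model's own encoder.
vgg = models.vgg16(weights=models.VGG16_Weights.DEFAULT).features[:16].to(device).eval()
for p in vgg.parameters():
    p.requires_grad_(False)

def cloak(image, decoy, budget=4 / 255, steps=100, lr=1e-2):
    """Add a perturbation bounded by `budget` per pixel that pulls the
    image's feature representation toward the decoy's style."""
    x, target = image.to(device), vgg(decoy.to(device)).detach()
    delta = torch.zeros_like(x, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = torch.nn.functional.mse_loss(vgg(x + delta), target)
        loss.backward()
        opt.step()
        with torch.no_grad():
            delta.clamp_(-budget, budget)             # keep the change imperceptible
            delta.copy_((x + delta).clamp(0, 1) - x)  # keep pixel values valid
    return (x + delta).detach().cpu()

def load(path):  # placeholder file names; both images resized to match
    img = TF.to_tensor(Image.open(path).convert("RGB"))
    return TF.resize(img, [512, 512]).unsqueeze(0)

protected = cloak(load("my_painting.png"), load("decoy_style.png"))
```

The `budget` parameter captures the tradeoff the article describes: the larger the perturbation, the more it disrupts AI models, but the more visible it becomes to human viewers.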
The team behind LightShed trained their tool on examples of images with and without the "poison," teaching it to recognize where Glaze and Nightshade place their perturbations and to remove them. LightShed is reportedly highly effective and can even generalize to protection tools it has never encountered before.
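As a rough illustration of how such a remover might work, the sketch below trains a small convolutional network on paired poisoned and clean images to predict the added perturbation, then subtracts the prediction from new images. This is an assumption-laden toy, not LightShed's published architecture, and the random "poison" here merely stands in for real Glaze or Nightshade output.

```python
import torch
import torch.nn as nn

class PerturbationEstimator(nn.Module):
    """Tiny CNN that maps a poisoned image to its estimated poison."""
    def __init__(self, channels=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, 3, 3, padding=1),
        )
    def forward(self, x):
        return self.net(x)

def train_step(model, opt, poisoned, clean):
    """Supervised target: the actual perturbation (poisoned - clean)."""
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(poisoned), poisoned - clean)
    loss.backward()
    opt.step()
    return loss.item()

def remove_poison(model, poisoned):
    """Subtract the estimated perturbation to recover a usable image."""
    with torch.no_grad():
        return (poisoned - model(poisoned)).clamp(0, 1)

# Usage with stand-in data; real training would use many (poisoned, clean)
# pairs produced by running Glaze/Nightshade over a corpus of images.
model = PerturbationEstimator()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
clean = torch.rand(8, 3, 64, 64)
poisoned = (clean + 0.03 * torch.randn_like(clean)).clamp(0, 1)
for _ in range(5):
    train_step(model, opt, poisoned, clean)
cleaned = remove_poison(model, poisoned)
```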
LightShed does have difficulty with very small doses of "poison," but doses that small usually don't do much to disrupt an AI model's ability to learn from the images either. Either way the model benefits, and the artists using the protection tools lose out.
Many people, including artists with smaller followings, have downloaded Glaze to protect their art. They see these tools as an important line of defense, especially while regulation of AI training and copyright remains unsettled. The creators of LightShed hope their work will serve as a warning that tools like Glaze are not permanent solutions.
The developers of Glaze and Nightshade seem to agree. They have acknowledged that their tools are not guaranteed to be future-proof. However, they believe that these defenses are still valuable as a deterrent, sending a message to AI companies to take artists' concerns seriously. The goal is to create enough obstacles to encourage companies to work directly with artists.
The hope is that the insights from LightShed will feed into new defenses for artists, such as robust watermarks that persist even after an image has passed through an AI model. No tool can shield artwork from AI forever, but the aim is to shift the balance of power back toward the creators.