New tools are being developed to help artists protect their work from being used to train artificial intelligence (AI) models. However, researchers have discovered ways to get around these protections.
AI models that create images need to be trained on a large amount of visual data. Some people worry that this training data includes copyrighted material used without permission. Artists in particular fear that AI could learn their style, imitate their work, and eventually take their jobs.
In 2023, tools emerged to help artists fight back. Programs like Glaze and Nightshade were designed to "poison" images, making them unusable for AI training. These tools work by making small changes to an image that are barely noticeable to humans but that confuse AI models.
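As a rough illustration of what "poisoning" means here, the sketch below adds a small, bounded perturbation to an image's pixels. Everything in it is an assumption for illustration: the function name is hypothetical, and the real tools compute carefully optimized perturbations against specific AI models rather than using random noise.

```python
# A minimal conceptual sketch of image "poisoning": adding a small,
# bounded change that humans barely notice. NOTE: this uses random
# noise as a placeholder; Glaze and Nightshade compute adversarial
# perturbations optimized against particular models, which is what
# actually makes the images misleading for training.
import numpy as np
from PIL import Image

def poison_image(path: str, epsilon: float = 4.0) -> Image.Image:
    """Return a copy of the image with a small per-pixel perturbation.

    `epsilon` bounds how far each pixel value may move (on the 0-255
    scale), keeping the change visually subtle.
    """
    pixels = np.asarray(Image.open(path).convert("RGB"), dtype=np.float32)
    perturbation = np.random.uniform(-epsilon, epsilon, pixels.shape)
    poisoned = np.clip(pixels + perturbation, 0, 255).astype(np.uint8)
    return Image.fromarray(poisoned)
```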
However, researchers have now created a method called Lightshed that can undo the effects of these protective tools. This means that images protected by Glaze and Nightshade could still be used for AI training.
The researchers behind Lightshed aren't trying to steal artists' work. Instead, they want to show that current protection methods may not be reliable. They believe it's important for artists to know that companies might have ways to remove the "poison" from their images.
Glaze works by tricking AI models into misinterpreting the style of an image, for example, perceiving a realistic painting as a cartoon. Nightshade instead causes the AI to misidentify the subject of the image, such as seeing a cat as a dog. Lightshed learns how Glaze and Nightshade add these changes to images and then removes them, as sketched below.
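Conceptually, a removal tool like this can be framed as a supervised denoising problem: train a network on pairs of protected and unprotected images so it learns to predict the added perturbation, then subtract that prediction. The PyTorch sketch below shows this general idea only; the architecture, training loop, and stand-in noise are assumptions for illustration, not the Lightshed authors' actual implementation.

```python
# Illustrative sketch of perturbation removal as supervised denoising.
# In practice the training pairs would be real images run through a
# protection tool alongside their unprotected originals.
import torch
import torch.nn as nn

class PerturbationEstimator(nn.Module):
    """Tiny conv net that predicts the perturbation added to an image."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 3, 3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

model = PerturbationEstimator()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Dummy training pair; random noise stands in for a real tool's output.
clean = torch.rand(8, 3, 64, 64)
poisoned = clean + 0.03 * torch.randn_like(clean)

for step in range(100):
    predicted = model(poisoned)                  # estimated perturbation
    loss = loss_fn(predicted, poisoned - clean)  # match the true one
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# "Cleaning" an image: subtract the predicted perturbation.
with torch.no_grad():
    restored = poisoned - model(poisoned)
```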
Lightshed is highly effective at removing these changes, and it can even generalize to anti-AI tools it has never seen before.
Lightshed can struggle to detect very small amounts of "poison," but perturbations that weak often aren't enough to stop AI models from learning from the images anyway. Either way, AI companies could still use the images for training, even when an artist has tried to protect them.
Many artists have already downloaded tools like Glaze to protect their art. They see these tools as an important defense, especially since the rules around AI training and copyright are still unclear. The creators of Lightshed hope that their work will serve as a warning that current protection tools are not perfect solutions.
The developers of Glaze and Nightshade acknowledge that their tools may not be foolproof. However, they still believe that these defenses are valuable because they can discourage AI companies from using artists' work without permission. The goal is to create enough obstacles that companies will be forced to work directly with artists.
Researchers hope that the lessons learned from developing Lightshed will help create new and better ways to protect artists' work, such as watermarks that remain detectable even after an image has been processed by an AI model. While no method can protect art from AI forever, the goal is to shift the balance of power back toward artists.