New data poisoning tool lets artists fight back against generative AI
Article: This new data poisoning tool lets artists fight back against generative AI (www.technologyreview.com)
The tool, called Nightshade, messes up training data in ways that could cause serious damage to image-generating AI models.
There is a discussion on Hacker News, but feel free to comment here as well.
I love how cyberpunk this is.