University of Chicago researchers release Nightshade to the public, a tool intended to "poison" images in order to degrade generative models trained on them.
https://nightshade.cs.uchicago.edu/
3 comments
Reminder that this is made by Ben Zhao, the University of Chicago professor who stole open source code for his last data poisoning scheme.

> Thank you for the background.
If this technology is so great, why does the site not show any before/after examples, let alone demonstrate that it does what he claims?