Image uploads are now disabled on lemm.ee due to malicious users
Sorry for the short post, I'm not able to make it nice with full context at the moment, but I want to quickly get this announcement out to prevent confusion:
Unfortunately, people are uploading child sexual abuse images on some instances (apparently as a form of attack against Lemmy). I am taking some steps to prevent such content from making it onto lemm.ee servers. As one preventative measure, I am disabling all image uploads on lemm.ee until further notice - this is to ensure that lemm.ee cannot be used as a gateway to spread CSAM into the network.
It will not be possible to upload any new avatars or banners while this limit is in effect.
I'm really sorry for the disruption, it's a necessary trade-off for now until we figure out the way forward.
I think this is a great move until we have something rock solid to prevent this. There are tons of image hosting sites you can use (most of which already have the resources to prevent this stuff), so it shouldn't really cause much inconvenience.
I'm sorry that you and the people on this instance are being subjected to that shit. It's always despicable but on top of that it just seems absurd to target lemm.ee -- a deliberately unprofitable platform -- with such illegal means.
I know there are automated tools that exist for detecting CSAM. Given the challenges the fediverse has had with this issue, it really feels like it'd be worthwhile for the folks developing platforms like Lemmy and Mastodon to start thinking about how to integrate those tools with their platforms to better support moderators and folks running instances.
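For illustration, here's roughly the shape such an integration could take: an upload hook that compares a perceptual hash of the new image against a denylist of known-bad hashes. To be clear, this is a hypothetical sketch; real deployments use vetted, non-public hash databases (for example PhotoDNA through industry programs), and the denylist and threshold here are made up.

```python
# Hypothetical sketch only: real CSAM detection uses vetted hash databases
# (e.g. PhotoDNA via industry programs), not a local list like this.
from PIL import Image
import imagehash

KNOWN_BAD_HASHES = set()  # hypothetical denylist of imagehash.ImageHash values
MATCH_THRESHOLD = 5       # hypothetical Hamming distance that counts as a match

def is_upload_allowed(path: str) -> bool:
    """Reject an upload whose perceptual hash is close to a known-bad hash."""
    candidate = imagehash.phash(Image.open(path))
    return all(candidate - bad > MATCH_THRESHOLD for bad in KNOWN_BAD_HASHES)
```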
This is really sad and disgusting. It affects the whole platform but especially smaller instances that can't keep up. Despite being a lemm.ee user, I was particularly upset about thegarden.land shutting down because of that spam. It had my favourite gardening community on here.
I really hope this gets sorted out, and the spammers end up where they belong.
This is foul and I am extremely sorry for the users and mods who were sent the CSAM. It isn't something they should expect to deal with in a voluntary role for their communities and it can be traumatic. I hope they are given time and space to process their emotions.
Perfectly fine. People can upload images elsewhere and then just link to them. Most image upload sites will have all those protections in place already. A good stopgap until Lemmy gets those mod tools.
If you're concerned about legal liability, I think it's worth noting that there is some protection for websites in this matter. For the most part, as long as you're taking "reasonable action" against it you're not liable, and most laws take into consideration the resources of the site dealing with the uploads.
Not pleasant for users, though, of course. And the speed at which it's handled is obviously a concern.
Thank you, sir. I appreciate your dedication to the community in subjecting yourself to the moderation burden. Hopefully we can squash this before it goes any further than it already has.
This is a very good decision. I've worried about this problem from the very moment I learned about the Fediverse.
Research must definitely be done to find CSAM detection tools that integrate into Lemmy; perhaps we could make a separate bridge repo that makes it easy to plug a tool like that into the codebase, along the lines of the sketch below.
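Purely as a sketch of that bridge idea, assuming a hypothetical Scanner interface (none of these names are real Lemmy APIs), so different detection backends could be swapped in behind a single call site:

```python
# Hypothetical sketch of a pluggable scanner "bridge": the names here
# (Scanner, ScanResult, handle_upload) are invented for illustration.
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class ScanResult:
    allowed: bool  # whether the upload should be accepted
    reason: str    # human-readable explanation for mod logs

class Scanner(ABC):
    @abstractmethod
    def scan(self, image_bytes: bytes) -> ScanResult:
        """Return a verdict for one uploaded image."""

class AlwaysAllowScanner(Scanner):
    """No-op default so instances can run with no backend configured."""
    def scan(self, image_bytes: bytes) -> ScanResult:
        return ScanResult(allowed=True, reason="no scanner configured")

def handle_upload(scanner: Scanner, image_bytes: bytes) -> bool:
    # The upload endpoint would call this before persisting the file.
    result = scanner.scan(image_bytes)
    if not result.allowed:
        print(f"upload rejected: {result.reason}")  # or a real mod-log entry
    return result.allowed
```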
I hope every disgusting creature that uploads that shit gets locked up
That's disgusting! You did the right thing. Sorry you admins and mods have to put up with that shit; I hope the instance owners that are being attacked are reporting it to local authorities.
Personally, I would love it if all NSFW pictures were banned. There's deepai.org's NSFW detection API. No idea if it costs money to use the API or not.
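For what it's worth, a minimal sketch of calling that detector, based on DeepAI's published example; the endpoint, header, and response fields should be checked against their current docs, an API key is required, and the 0.5 cutoff below is an arbitrary illustrative threshold:

```python
# Sketch based on DeepAI's documented example; verify the endpoint, response
# fields, and pricing against their current docs before relying on this.
import requests

def nsfw_score(image_path: str, api_key: str) -> float:
    with open(image_path, "rb") as f:
        resp = requests.post(
            "https://api.deepai.org/api/nsfw-detector",
            files={"image": f},
            headers={"api-key": api_key},
            timeout=30,
        )
    resp.raise_for_status()
    return resp.json()["output"]["nsfw_score"]

if nsfw_score("upload.jpg", api_key="YOUR_API_KEY") > 0.5:  # arbitrary cutoff
    print("flag for review")
```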
Seems like this is still disabled; is it now in place for the foreseeable future? What about requiring accounts to reach a certain age before they can post images? Or maybe require approval before an avatar or banner is added?
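That minimum-age idea could be a pretty simple gate; a rough sketch, with a made-up field name and threshold rather than Lemmy's actual schema:

```python
# Hypothetical sketch of a minimum-account-age gate; the created_at argument
# and the 7-day threshold are illustrative, not Lemmy's real schema or policy.
from datetime import datetime, timedelta, timezone

MIN_ACCOUNT_AGE = timedelta(days=7)

def may_upload_images(account_created_at: datetime) -> bool:
    """Allow image uploads only once the account is old enough."""
    return datetime.now(timezone.utc) - account_created_at >= MIN_ACCOUNT_AGE
```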
Better shut the internet down, then. This will only continue to worsen now that anybody can generate whatever images they want with AI assistance. Hashes of such images will not be in CSAM databases (if AI-generated imagery is even CSAM).