72 comments
  • Why do they only mention absolute numbers instead of comparing them to similar platforms? All they said was that there is CSAM on the Fediverse, but that's also true of centralized services and the internet as a whole. The important question is whether there is more or less CSAM on the Fediverse, no?

    This makes it look very unscientific to me. The Fediverse might have a CSAM problem, but you wouldn't know it from this study.

    • The Fediverse also makes it potentially easier to scan for this stuff. You can just connect a new server to the network, and if the material is on federated servers, you can (probably) find it. Whereas if it's some private forum or even the dark web, I assume it's a lot more difficult.

      The other thing is, most regular servers already defederate from suspicious stuff. Pretty much nobody federates with that one shota instance, for example, and they only serve drawn stuff (AFAIK). So I don't know if you can even say servers like that are part of the Fediverse in the first place.

      • That's what I thought as well. If the authors of this "study" were able to simply scan the Fediverse for it, then what's stopping law enforcement from doing the same? They could literally get a message every time someone posts something on a suspicious instance.
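
        For what it's worth, the mechanics are simple. Here's a minimal sketch of such a monitor, assuming a made-up watch list and skipping the actor discovery, HTTP-signature verification, and follow/relay plumbing a real ActivityPub server needs:

        ```python
        # Hypothetical ActivityPub "inbox" for a monitoring server. Once other
        # instances federate with it, their Create activities arrive here as
        # JSON POSTs. The host list and flagging logic are made up; a real
        # setup also needs HTTP-signature checks and actual follows/relays.
        from flask import Flask, request

        app = Flask(__name__)

        # Hypothetical set of instances an investigator wants to watch.
        SUSPICIOUS_HOSTS = {"bad.example"}

        @app.post("/inbox")
        def inbox():
            activity = request.get_json(force=True)
            if activity.get("type") == "Create":
                actor = activity.get("actor", "")
                host = actor.split("/")[2] if "//" in actor else ""
                if host in SUSPICIOUS_HOSTS:
                    # In practice you'd hand attachments to a hash-matching
                    # service and file a report, not just log the post ID.
                    obj = activity.get("object", {})
                    print("post from watched instance:", obj.get("id"))
            return "", 202

        if __name__ == "__main__":
            app.run(port=8080)
        ```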

  • Another scream of "BAN THE FEDIVERSE, IT'S DANGEROUS!!!". And then of course it's tagged "CSAM", lol. You want a (small) website taken down? Just accuse it of CSAM, and boom, the hoster and admins get raided.

  • yeahhh the free internet is a bit like a wild west. maybe ain't for children. maybe gotta keep em behind more secure digital walls while they're growing before letting em loose. but still gotta teach em media literacy and safe internet practices and such.

    but lbr this isn't gonna stop the more persistent kids

  • The suggestions in their directions-for-future-improvement section should be implemented sooner rather than later. There's no point in growing this platform if it's going to be left wide open for abuse like it is.

    I also think that, in lieu of the Lemmy devs making any improvements, another good solution would be for a third party to stand something up that scrapes every Lemmy post and runs it through an automated service for detecting known CSAM (see the sketch below). That third-party service would be forcing at least one of those future improvements on Lemmy as it exists today. Any known CSAM that's found would be automatically reported, and if the instance owners can't deal with it, then they would rightfully have to face the consequences of their inaction.
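
    As a rough sketch of what that scanner could look like: the endpoint and field names follow Lemmy's public HTTP API (api/v3) as I understand it, but treat them as assumptions, and the instance URL, hash list, and reporting hook are placeholders. Real known-CSAM matching goes through vetted perceptual-hash services (e.g. PhotoDNA), not plain SHA-256.

    ```python
    # Hypothetical Lemmy scanner: pull the newest posts, hash any linked
    # image, and flag matches against a known-hash list. KNOWN_HASHES and
    # report() are stubs; access to real hash lists is restricted to
    # vetted organizations.
    import hashlib

    import requests

    INSTANCE = "https://lemmy.example"      # placeholder instance
    KNOWN_HASHES: set[str] = set()          # would come from a vetted hash list

    def scan_new_posts(limit: int = 50) -> None:
        resp = requests.get(
            f"{INSTANCE}/api/v3/post/list",
            params={"sort": "New", "limit": limit},
            timeout=30,
        )
        resp.raise_for_status()
        for item in resp.json()["posts"]:
            url = item["post"].get("url")
            if not url:
                continue  # text-only post, nothing to hash
            image = requests.get(url, timeout=30).content
            digest = hashlib.sha256(image).hexdigest()
            if digest in KNOWN_HASHES:
                report(item["post"]["ap_id"])

    def report(post_ap_id: str) -> None:
        # Stub: a real service would report to NCMEC and the instance admins.
        print("match found:", post_ap_id)

    if __name__ == "__main__":
        scan_new_posts()
    ```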
