There’s another round of CSAM attacks and it’s really disturbing to see those images. They weren’t taken down immediately either. There was even a disgusting shithead in the comments who thought it was funny?? the fuck
It’s gone now, but it was up for like an hour?? This really ruined my day and now I’m figuring out how to download tetris. It’s really sickening.
No.
The nature of the checksums and perceptual hashing is kept in confidence between the National Center for Missing and Exploited Children (NCMEC) and the provider. If the “is this classified as CSAM?” service were available as an open source project, those attempting to circumvent it could test modified images against it repeatedly until the modifications were sufficient to produce a false negative.
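To illustrate the general idea, here's a toy "average hash" sketch. This is emphatically not the confidential NCMEC/provider algorithm (which the paragraph above notes is secret); the `average_hash` function and the fake pixel data are purely illustrative assumptions about how perceptual hashing works in general: similar images yield similar hashes, so matching uses a distance threshold rather than exact equality.

```python
# Toy perceptual hash ("average hash") -- illustrative only, NOT the
# confidential hashing used by NCMEC and providers.

def average_hash(pixels):
    """One bit per pixel: 1 if brighter than the image's mean, else 0."""
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(a, b):
    """Count of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

# Fake 8-pixel grayscale "images": an original and a lightly edited copy.
original = [10, 200, 30, 220, 15, 210, 25, 205]
tweaked  = [12, 198, 33, 219, 14, 212, 24, 204]

distance = hamming(average_hash(original), average_hash(tweaked))
print(distance)  # → 0: the small edits don't change the hash at all
```

Small edits leave the hash essentially unchanged, which is what makes the approach useful for matching known material. But it also shows the evasion risk: an attacker with access to the hashing code could keep tweaking an image and re-testing until the distance crossed the match threshold, which is exactly why the real algorithm is kept confidential.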
There are attempts to do “scan and delete,” but this may put server admins in even more legal jeopardy than not scanning at all, because admins are required by law to report CSAM and to preserve the images and associated log files.
I’d strongly suggest that anyone hosting a Lemmy instance read https://www.eff.org/deeplinks/2022/12/user-generated-content-and-fediverse-legal-primer
The reporting requirements for hosting providers are codified at 18 U.S.C. § 2258A: https://www.law.cornell.edu/uscode/text/18/2258A