• 0 Posts
  • 4 Comments
Joined 1 year ago
Cake day: June 12th, 2023


  • So the standard approach to this is so-called “perceptual hashing.” Cryptographic hashes (sha256, etc.) don’t really work well in this case: given a piece of illegal content, that content is likely to still be just as illegal with a single pixel changed, but it’ll have a completely different cryptographic hash. So instead you use a hash function that measures how “similar-looking” two images are, ignoring things like dimensions, color palette, JPEG compression artifacts, etc. This is obviously way fuzzier, and is prone to both false positives and false negatives.
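
    For illustration, here’s a minimal toy sketch of one simple perceptual hash (a “difference hash,” or dHash) in Python. To be clear, this is my own example, not how PhotoDNA or any real scanner works (those are proprietary), but the core idea is the same. Assumes Pillow is installed.

    ```python
    # Toy "difference hash" (dHash): similar-looking images get similar
    # hashes, unlike sha256, where one changed pixel scrambles everything.
    from PIL import Image

    def dhash(path: str, size: int = 8) -> int:
        # Shrink hard and drop color: this throws away exactly the stuff
        # we want to ignore (dimensions, palette, JPEG artifacts).
        img = Image.open(path).convert("L").resize((size + 1, size))
        pixels = list(img.getdata())
        bits = 0
        for row in range(size):
            for col in range(size):
                left = pixels[row * (size + 1) + col]
                right = pixels[row * (size + 1) + col + 1]
                # Encode the sign of each horizontal brightness gradient.
                bits = (bits << 1) | (left > right)
        return bits

    def hamming(a: int, b: int) -> int:
        # Number of differing bits: small distance = similar images.
        return bin(a ^ b).count("1")

    # A one-pixel edit flips at most a bit or two of the 64-bit hash,
    # so "matching" becomes a threshold (e.g. hamming(h1, h2) <= 10),
    # which is exactly where the false positives/negatives come from.
    ```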

    Because all this is inherently kinda fuzzy, the exact database of hashes is usually “secret sauce” if you will. If it were public, it would be super easy to circumvent. As an example, given an illegal image:

    1. Is the image’s hash in the DB?
    2. No? All done, you can post it with impunity.
    3. Yes? Change one random pixel, GOTO 1.

    As a result, even “public” databases are distributed under NDAs and the like. This obviously does not jibe well with an open-source, federated network like Mastodon, and I have my doubts as to how willing the relevant agencies would be to hand their databases to every rando with $5 to spin up a Pleroma instance on a VPS. A public DB might help in some cases, but unfortunately more illegal content is produced every day, so it would be extremely hard to keep up with the bad actors.


  • In my opinion the biggest issue the author points out is that cached materials are sometimes retained even after moderator action, which honestly sounds like a straight-up bug more than anything. Still, if I were running an instance, the feds showing up at my door with a warrant because I’d been accidentally distributing CSAM would be my nightmare scenario. And of course jurisdiction plays a part too: an American user on a Canadian server might see drawn depictions of sexualized minors, think “weird, but not illegal,” and now the Canadian admin is hosting content that’s illegal in Canada without even knowing it.

    IMO the best solution to this is something similar to what Renaud Chaput (Mastodon’s resident infra boffin) described in his recent blog post: give admins a way to hand this off to pluggable third-party services (see the sketch below). Admins who are worried about this sort of thing can then get some degree of safety via e.g. PhotoDNA, whereas others can take on additional risk and preserve additional privacy.
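
    To make that concrete, here’s a hypothetical sketch of what a pluggable scanner interface could look like. Every name here is made up by me; this is not Mastodon’s actual API, and Chaput’s post only describes the idea at a high level.

    ```python
    # Hypothetical sketch of a pluggable media-scanning hook. None of
    # these names come from Mastodon's codebase; the point is the shape
    # of the idea: the server defers the judgment call to a configurable
    # backend chosen by the admin.
    from typing import Protocol

    class MediaScanner(Protocol):
        def scan(self, media_bytes: bytes) -> bool:
            """Return True if the media matches known illegal content."""
            ...

    class NullScanner:
        """Default: no scanning, maximum privacy, admin accepts the risk."""
        def scan(self, media_bytes: bytes) -> bool:
            return False

    class PhotoDNAScanner:
        """Admins with access could delegate to a vetted third party."""
        def __init__(self, api_key: str, endpoint: str):
            self.api_key = api_key
            self.endpoint = endpoint  # hypothetical vendor endpoint

        def scan(self, media_bytes: bytes) -> bool:
            # Placeholder: a real integration would send the media (or
            # its hash) to the vendor and act on the verdict. The details
            # are under NDA, which is exactly the problem discussed above.
            raise NotImplementedError

    def on_media_upload(media_bytes: bytes, scanner: MediaScanner) -> bool:
        """Called by the (hypothetical) upload pipeline; True = accept."""
        return not scanner.scan(media_bytes)
    ```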

    All that said: yeah the headline makes it sound like .social is some 8chan-esque hellhole, whereas in reality my feed is 99% German programmers sharing milquetoast political takes.


  • glorbo@lemmy.one to 196@lemmy.blahaj.zone · rule

    My fave Lin-Manuel Miranda song is the one where he advocates, on behalf of the US government, for the privatization of the Puerto Rican electrical grid, claiming it will improve reliability, but then it actually becomes even less reliable. It’s really catchy 😊