Several months ago, Beehaw received a report of CSAM (Child Sexual Abuse Material). As an admin, I had to investigate it in order to verify the report and take the next steps. This was the first time in my life that I had ever seen images like these. Without going into great detail, the images were of a very young child performing sexual acts with an adult.

The explicit nature of these images, the gut-wrenching shock and horror, the disgust, and the helplessness were overwhelming. Those images are burnt into my mind and I would love to get rid of them but I don’t know how or if it is possible. Maybe time will take them out of my mind.

In my strong opinion, Beehaw must seek a platform where NO ONE will ever have to see these types of images: a software platform that makes it nearly impossible for Beehaw to host CSAM in any form.

If the other admins want to give their opinions about this, then I am all ears.

I simply cannot move forward with the Beehaw project unless this is one of our top priorities when choosing where we are going to go.

  • jarfil@beehaw.org · 10 months ago

    Those images are burnt into my mind and I would love to get rid of them but I don’t know how or if it is possible

    I’m very sorry this happened to you, and I wish I could offer you some advice… but that’s the main reason I stopped hosting open community stuff many years ago. I thought I was hardened enough, but nope; between the spam, the “shock imagery” (NSFL gore, CSAM), the doxxing, and toxic users in general… even having some ads was far from making it all worthwhile. There is a reason why “the big ones” like Facebook or Google churn through 3rd world mods who can’t take it for more than a few months before getting burnt out.

    I wish I could tell you that you’ll eventually forget what you’ve seen… but I still remember stuff from 30 years ago. Also don’t want to scare you, but it’s not limited to images… some “fanfiction” with text imagery is evil shit that I still can’t forget either.

    Nowadays you can find automated CSAM identification services, like Microsoft’s PhotoDNA, so if you integrated one of those, you could err on the side of caution and block any image it marks as even suspicious. This may or may not work in your jurisdiction, with some requiring you to “preserve the proof” and submit it to the authorities (plus different jurisdictions have different definitions of what is and what isn’t breaking the law, and laws against swamping them with false positives… so you basically can’t win). It will also do nothing for the NSFL or text-based imagery.
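    In case it helps, here is a minimal sketch (in Python) of what that “block anything suspicious” hook could look like. Everything in it is an assumption: scan_image stands in for whichever identification service you integrate, and the Verdict values are made up, since real services have their own response formats.

    ```python
    from enum import Enum

    class Verdict(Enum):
        # Hypothetical verdict set; a real scanning service defines its own.
        CLEAN = "clean"
        SUSPICIOUS = "suspicious"
        MATCH = "match"

    def scan_image(data: bytes) -> Verdict:
        """Placeholder for a call to an external identification service
        (e.g. Microsoft's PhotoDNA); this is not a real API."""
        raise NotImplementedError("wire this up to your scanning provider")

    def handle_upload(data: bytes) -> bool:
        """Return True only if the image may be stored.

        Anything short of an explicit CLEAN verdict is rejected before
        the file ever touches disk or a moderator's screen.
        """
        verdict = scan_image(data)
        if verdict is not Verdict.CLEAN:
            # Depending on jurisdiction, you may instead be required to
            # preserve the evidence and report it to the authorities;
            # consult a lawyer, not this snippet.
            return False
        return True
    ```

    The design choice here is simply that rejection happens before storage and before any human review, which is the “err on the side of caution” posture described above.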

    One way to “shield yourself” from all of this as an admin is to move to an encrypted platform where you can’t even see what’s getting posted, so you never run the risk of seeing that kind of content… but then you end up with zero moderation tools, pushing the whole burden onto your users, so it’s not suitable for a safe space.

    Honestly, I don’t think there is an effective solution for this yet. It’s been a great run abusing the good will of the admins and mods who’ve stayed on Beehaw, but if you can’t find a reasonable compromise… oh well.