Not a good look for Mastodon - what can be done to automate the removal of CSAM?

  • priapus@sh.itjust.works · 1 year ago

    Seems odd that this article mentions Mastodon as a Twitter alternative but makes no mention of the fact that Twitter is rife with the same problems, and increasingly so as it loses employees and, with them, moderation capacity. These problems have existed on Twitter for far longer, and not nearly enough has been done.

  • whatsarefoogee@lemmy.world · 1 year ago

    Mastodon is a piece of software. I don’t see anyone saying “phpBB” or “WordPress” has a massive child abuse material problem.

    Has anyone in history ever said “Not a good look for phpBB”? No. Why? Because it would make no sense whatsoever.

    I’m almost at a loss for words because of how obvious it should be. It’s like saying “paper is being used for illegal material. Not a good look for paper.”

    What is the solution to someone hosting illegal material on an nginx server? You report it to the authorities. You want to automate it? Go ahead and crawl the web for illegal material and generate automated reports. Though you’ll probably be the first to end up in prison.
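    For what it’s worth, the automation that platforms actually deploy is hash matching: each upload’s digest is compared against a vetted list of known material, and matches are blocked and reported. A minimal sketch of the idea, with a placeholder hash list and a hypothetical helper name (real systems use perceptual hashes such as PhotoDNA or PDQ rather than plain SHA-256, so that re-encoded copies still match):

    ```python
    # Minimal sketch of hash-list media screening. The block list below is a
    # placeholder (it contains only the SHA-256 digest of the empty byte
    # string); real deployments use vetted lists from child-safety
    # organizations and perceptual hashing, not exact SHA-256 alone.
    import hashlib

    # Hypothetical set of hex digests of known-bad files.
    KNOWN_BAD_HASHES = {
        "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
    }

    def should_block(file_bytes: bytes) -> bool:
        """Return True if the upload's SHA-256 digest is on the block list."""
        digest = hashlib.sha256(file_bytes).hexdigest()
        return digest in KNOWN_BAD_HASHES
    ```

    Exact-hash matching like this only catches byte-identical copies, which is why production systems layer perceptual hashing on top; the overall check-on-upload flow is the same.
    
    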

    • redcalcium@lemmy.institute · 1 year ago

      I get what you’re saying, but because of the federated nature of the network, CSAM can easily spread to many instances without their admins noticing. Hosting even one piece of CSAM is a huge legal risk for the server owner.

      • MinusPi (she/they)@pawb.social · 1 year ago

        I don’t see what a server admin can do about it other than defederate the instant they get reports. Otherwise how can they possibly know?

        • krimsonbun@lemmy.ml · 1 year ago

          This could be a really big issue though. People can make instances for really hateful and disgusting crap, and even if everyone defederates from them, it’s still giving them a platform: a tiny corner of the internet to talk about truly horrible topics.

          • priapus@sh.itjust.works · 1 year ago

            Those corners will exist no matter what software they use, and there is nothing Mastodon can do to stop this. There’s a reason there are public lists of instances to defederate from. This content can only be prevented by domain providers and governments.