Not a good look for Mastodon - what can be done to automate the removal of CSAM?

    • Lemdee · 61 · 1 year ago

      So if I’m understanding right, based on their recommendations this will all be addressed as more moderation and QOL tools are introduced as we move further down the development roadmap?

      • @fubo@lemmy.world · -79 · 1 year ago

        What development roadmap? You’re not a product manager and this isn’t a Silicon Valley startup.

        • Fuck Lemmy.World · 122 · 1 year ago

          What makes you think that development roadmaps are exclusive to Silicon Valley startup product managers, and not just a general practice in software engineering?

          Mastodon actually does have a roadmap, and you can find it here: https://joinmastodon.org/roadmap

          • As does most successful open source software. It’s more of a “this is where we’d like to see things go long term”, but that in no way restricts contributions; it merely helps communicate the ideas of the core contributors.

    • @duncesplayed@lemmy.one · 51 · 1 year ago

      If I can try to summarize the main findings:

      1. Computer-generated (e.g., Stable Diffusion) child porn is not criminalized in Japan, so many Japanese Mastodon servers don’t remove it.
      2. Porn involving real children is removed, but not immediately, since it depends on instance admins catching it, and they have other things to do. Also, when an account is banned, the Mastodon server software does not send out a “delete” for all of their posted material (which would signal other instances to delete it).

      Problem #2 can hopefully be improved with better tooling. I don’t know what you do about problem #1, though.
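
      For a sense of what that tooling could look like: the standard way large platforms automate removal of known CSAM is to hash every upload against an industry hash list before it federates. Below is a minimal sketch with hypothetical function names, assuming a locally available hash set; real scanners use perceptual hashing (e.g. Microsoft’s PhotoDNA or Meta’s PDQ) obtained through a vetted provider rather than plain SHA-256.

      ```python
      import hashlib

      def is_known_csam(media: bytes, known_hashes: set[str]) -> bool:
          """Check an upload against a known-bad hash list.

          Plain SHA-256 keeps this sketch self-contained; production
          scanners use perceptual hashes so re-encoded or slightly
          altered copies still match.
          """
          return hashlib.sha256(media).hexdigest() in known_hashes

      def handle_upload(media: bytes, known_hashes: set[str]) -> bool:
          """Reject flagged uploads before they ever federate.

          A real server would also file the legally required report
          (e.g. to NCMEC in the US) rather than silently dropping
          the file.
          """
          return not is_known_csam(media, known_hashes)
      ```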

      • @fubo@lemmy.world · 28 · 1 year ago

        One option would be to decide that the underlying point of removing real CSAM is to avoid victimizing real children; and that computer-generated images are no more relevant to this goal than Harry/Draco slash fiction is.

        • @PoliticalAgitator@lemm.ee · 1 · 1 year ago

          And are you able to offer any evidence to reassure us that simulated child pornography doesn’t increase the risk to real children as pedophiles become desensitised to the content and escalate (you know, like what already routinely happens with regular pornography)?

          Or are we just supposed to sacrifice children to your gut feeling?

          • @fubo@lemmy.world · 1 · edited · 1 year ago

            Would you extend the same evidence-free argument to fictional stories, e.g. the Harry/Draco slash fiction that I mentioned?

            For what it’s worth, your comment has already caused ten murders. I don’t have to offer evidence, just as you don’t. I don’t know where those murders happened, or who was murdered, but it was clearly the result of your comment. Why are you such a terrible person as to post something that causes murder?

            • @PoliticalAgitator@lemm.ee · 2 · 1 year ago

              I have no problem saying that writing stories about two children having gay sex is pretty fucked in the head, along with anyone who forms a community around sharing and creating it.

              But it’s also not inherently abuse, nor is it indistinguishable from reality.

              You’re advocating that people just be cool with photo-realistic images of children, of any age, being raped, by any number of people, in any possible way, with no assurances that the images are genuinely “fake” or that pedophiles won’t be driven to make it a reality, despite other pedophiles cheering them on.

              I was a teenage contrarian pseudo-intellectual once upon a time too, but I never sold out other people’s children for something to jerk off to.

              If you want us to believe it’s harmless, prove it.

              • @fubo@lemmy.world · -1 · 1 year ago

                You keep making up weird, defamatory accusations. Please stop. This isn’t acceptable behavior here.

                • @PoliticalAgitator@lemm.ee · 2 · 1 year ago

                  Awful pearl-clutchy for someone advocating for increased community support for photorealistic images of children being raped.

                  Which do you think is more acceptable to Lemmy in general? Someone saying “fuck”, or communities dedicated to photorealistic images of children being raped?

                  Maybe I’m not the one who should be changing their behavior.

                  • @fubo@lemmy.world · -2 · 1 year ago

                    FYI: You’ve now escalated to making knowingly-false accusations about a specific person.

      • CaptainBasculin · 11 · 1 year ago

        Such a signal exists in the ActivityPub protocol, so I wonder why it’s not being used.
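
        For reference, the signal is ActivityPub’s Delete activity. Here is a minimal sketch, with hypothetical URLs and unsigned requests, of broadcasting one Delete per post of a banned account; a real server must sign every request (HTTP Signatures) and deliver to every known peer inbox.

        ```python
        import json
        import urllib.request

        def build_delete(actor: str, object_id: str) -> dict:
            """An ActivityStreams Delete activity for a single post."""
            return {
                "@context": "https://www.w3.org/ns/activitystreams",
                "type": "Delete",
                "actor": actor,
                "object": object_id,
                "to": ["https://www.w3.org/ns/activitystreams#Public"],
            }

        def federate_account_purge(actor: str, post_ids: list[str],
                                   inboxes: list[str]) -> None:
            """Ask peer instances to drop every post of a banned account."""
            for post_id in post_ids:
                payload = json.dumps(build_delete(actor, post_id)).encode()
                for inbox in inboxes:
                    req = urllib.request.Request(
                        inbox,
                        data=payload,
                        headers={"Content-Type": "application/activity+json"},
                    )
                    urllib.request.urlopen(req)  # unsigned; illustration only
        ```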

      • @HughJanus@lemmy.ml · 5 · 1 year ago

        “I don’t know what you do about problem #1, though.”

        Well, the simple answer is that it doesn’t have to be illegal to remove it.

        The legal question is a lot harder, considering AI image generation has reached levels that are almost indistinguishable from reality.

        • @sugar_in_your_tea@sh.itjust.works · 3 · edited · 1 year ago

          In which case, admins should err on the side of caution and remove something that might be illegal.

          I personally would prefer to have nothing remotely close to CSAM, but as long as children aren’t being harmed in any conceivable way, I don’t think it would be illegal to post art containing children. But communities should absolutely manage things however they think is best for their community.

          In other words, I don’t think #1 is a problem at all; imo, things should only be illegal if there’s a clear victim.

    • @mindbleach@lemmy.world · 28 · 1 year ago

      “4.1 Illustrated and Computer-Generated CSAM”

      Stopped reading.

      Child abuse laws “exclude anime” for the same reason animal cruelty laws “exclude lettuce.” Drawings are not children.

      Drawings are not real.

      Half the goddamn point of saying CSAM instead of CP is to make clear that Bart Simpson doesn’t count. Bart Simpson is not real. It is fundamentally impossible to violate Bart Simpson’s rights, because he doesn’t fucking exist. There is nothing to protect him from. He cannot be harmed. He is imaginary.

      This cannot be a controversial statement. Anyone who can’t distinguish fiction from real life has brain problems.

      You can’t rape someone in MS Paint. Songs about murder don’t leave a body. If you write about robbing Fort Knox, the gold is still there. We’re not about to arrest Mads Mikkelsen for eating people. It did not happen. It was not real.

      If you still want to get mad at people for jerking off to the wrong fantasies, that is an entirely different problem from photographs of child rape.

      • @DrQuint@lemmy.world · 5 · 1 year ago

        Oh, wait, the “Japanese” part in the other comment, now I get it. This conversation is about AI loli porn.

        Pfft, of course. That’s why no one is saying the words they mean: it suddenly becomes much harder to take that stance, since hatred of loli porn is not universal.

        • I mean, I think it’s disgusting, but I don’t think it should be illegal. I feel the same way about cigarettes, 2 girls 1 cup, and profane language. It’s absolutely not for me, but that shouldn’t make it illegal.

          As long as there’s no victim, knock yourself out with whatever disgusting, weird stuff you’re into.

        • @mindbleach@lemmy.world · 3 · 1 year ago

          What does that even mean?

          There’s nothing to “cover.” They’re talking about illustrations of bad things, alongside actual photographic evidence of actual bad things actually happening. Nothing can excuse that.

          No shit they are also discussing actual CSAM alongside… drawings. That is the problem. That’s what they did wrong.

      • @balls_expert@lemmy.blahaj.zone · 1 · edited · 1 year ago

        Okay, thanks for the clarification

        Everyone except you still very much includes drawn & AI pornographic depictions of children in the basket of problematic content that should get filtered out of federated instances, so thank you very much, but I’m not sure your point changed anything.

        • @priapus@sh.itjust.works · 5 · 1 year ago

          They are not saying it shouldn’t be defederated; they are saying that reporting this to authorities is pointless and that considering it CSAM is harmful.

            • @priapus@sh.itjust.works · 3 · 1 year ago

              Definitions of CSAM definitely do not include illustrated and simulated forms. They do not have a victim and therefore cannot be abuse. I agree that it should not be allowed on public platforms, hence why all instances hosting it should be defederated. Despite this, it is not illegal, so reporting it to authorities is a waste of time for you and the authorities who are trying to remove and prevent actual CSAM.
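
              Defederating is also mechanically simple. A minimal sketch of inbox-level domain filtering follows, with hypothetical domain names; Mastodon and Lemmy expose the equivalent blocklist controls through their admin tools.

              ```python
              from urllib.parse import urlparse

              # Hypothetical blocklist; each admin decides what goes in it.
              BLOCKED_DOMAINS = {"bad.example", "worse.example"}

              def accept_activity(activity: dict) -> bool:
                  """Drop incoming federated activity from defederated instances."""
                  domain = urlparse(activity.get("actor", "")).hostname or ""
                  if domain in BLOCKED_DOMAINS:
                      return False
                  # Also reject subdomains of blocked instances.
                  return not domain.endswith(tuple("." + d for d in BLOCKED_DOMAINS))
              ```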

              • @balls_expert@lemmy.blahaj.zone · 1 · edited · 1 year ago

                CSAM definitions absolutely include illustrated and simulated forms. Just check the sources on the Wikipedia link and climb your way up; you’ll see “cartoons, paintings, sculptures, …” in the wording of the PROTECT Act.

                They don’t actually need a victim to be defined as such.

                • @priapus@sh.itjust.works · 1 · 1 year ago

                  That Wikipedia article is about CP, a broader topic. Practically zero authorities will include illustrated and simulated forms of CP in their definitions of CSAM.

            • What’s the point of reporting it to authorities? It’s not illegal, nor should it be, since there’s no victim, so all reporting it does is take up valuable time that could be spent tracking down actual abuse.

              • @balls_expert@lemmy.blahaj.zone · 1 · edited · 1 year ago

                It’s illegal in a lot of places, including where I live.

                In the US, you have the PROTECT Act of 2003:

                (a) In General.—Any person who, in a circumstance described in subsection (d), knowingly produces, distributes, receives, or possesses with intent to distribute, a visual depiction of any kind, including a drawing, cartoon, sculpture, or painting, that—
                (1)(A) depicts a minor engaging in sexually explicit conduct; and
                (B) is obscene; or
                (2)(A) depicts an image that is, or appears to be, of a minor engaging in graphic bestiality, sadistic or masochistic abuse, or sexual intercourse, including genital-genital, oral-genital, anal-genital, or oral-anal, whether between persons of the same or opposite sex; and
                (B) lacks serious literary, artistic, political, or scientific value;
                or attempts or conspires to do so, shall be subject to the penalties provided in section 2252A(b)(1), including the penalties provided for cases involving a prior conviction.

                Linked to the obscenity doctrine

                https://www.law.cornell.edu/uscode/text/18/1466A

                • @sugar_in_your_tea@sh.itjust.works · 1 · edited · 1 year ago

                  Wow, that’s absolutely ridiculous, thanks for sharing! That would be a very unpopular law to get overturned…

                  I guess it fits with the rest of the stupidly named bills. It doesn’t protect anything; it just prosecutes undesirable behaviors.

        • @mindbleach@lemmy.world · 0 · 1 year ago

          If you don’t think images of actual child abuse, against actual children, are infinitely worse than some ink on paper, I don’t care about your opinion of anything.

          You can be against both. Don’t ever pretend they’re the same.

                • @mindbleach@lemmy.world · 3 · 1 year ago

                  ‘Everyone but you agrees with me!’ Bullshit.

                  ‘Nobody wants this stuff that whole servers exist for.’ Self-defeating bullshit.

                  ‘You just don’t understand.’ Not an argument.

                  • @balls_expert@lemmy.blahaj.zone · 0 · edited · 1 year ago

                    Okay, the former then.

                    Let’s just think about it: how do you think it would turn out if you went outside and asked anyone about pornographic drawings of children? How long until you’d find someone who thinks like you, outside your internet bubble?

                    “Nobody wants this stuff that whole servers…”

                    There are also servers dedicated to real child porn with real children. Do you think that argument has any value with that tidbit of information tacked onto it?

            • @mindbleach@lemmy.world · 1 · 1 year ago

              Some confused arguments reveal confused people. Some terrible arguments reveal terrible people. For example: I don’t give two fucks what Nazis think. Life’s too short to wonder which subjects they’re not facile bastards about.

              If someone’s motivation for making certain JPEGs hyper-illegal is “they’re icky”, they’ve lost the benefit of the doubt. Because of their decisions, I no longer grant them that courtesy.

              Demanding pointless censorship earns my dislike.

              Equating art with violence earns my distrust.

              • Perhaps. But pretty much everyone has a stupid take on something.

                There’s obviously a limit there, but most people can be reasoned with. So instead of jumping to a conclusion, attempt a dialogue first, until they prove that they can’t be reasoned with. This is especially true on social media, where, even if you can’t convince the person you’re talking with, you may just convince the next person to come along.

                • @mindbleach@lemmy.world · 1 · 1 year ago

                  Telling someone why they’re a stupid bastard for the sake of other people is not exactly a contradiction. You know what doesn’t do observers any good? “Debating” complete garbage, in a way that lends it respect and legitimacy. Sometimes you just need to call bullshit.

                  Some bullshit is so blatant that it’s a black mark against the bullshitter.

                  • Sure, and I don’t think that’s the case here. If someone is literally arguing that a certain race should be exterminated, that’s one thing (report, downvote, block, and move on), but someone arguing that lolicon is just as bad as CP is something else entirely.

                    I’m just arguing that it’s generally better to have the conversation than to completely shut them out. I really hate cancel culture, so I will always call out anything that seems similar. I believe in letting people explain themselves, to an extent, and my limit is if they’re actively promoting real harm to actual people (e.g. encouraging violence against some group).

            • @balls_expert@lemmy.blahaj.zone · 0 · edited · 1 year ago

              He invented the stupid take he’s fighting against. Nobody equated “ink on paper” with “actual rape against children”.

              The bar to cross to be filtered out of the federation isn’t rape. Lolicon is already above the threshold; it’s embarrassing that he doesn’t realize that.

              • I don’t think the OP ever said the bar was rape; the OP said the article and the person they responded to are treating drawn depictions of imaginary children the same as depictions of actual children. Those are not the same thing at all, yet many people seem to conflate them (apparently including US law, as of the PROTECT Act of 2003).

                Some areas make a distinction (e.g. Japan and Germany), whereas others don’t. Regardless of the legal status in your area, the two should be treated separately, even if that means both are banned.

                • @balls_expert@lemmy.blahaj.zone · 1 · edited · 1 year ago

                  “treating them the same” => the threshold for being refused entry into mainstream instances is simply already crossed at the lolicon level.

                  From the perspective of the fediverse, pictures of child rape and lolicon should just both get you thrown out. That doesn’t mean you’re “treating them the same”. You’re just a social network. There’s nothing you can do above defederating.

                  • No, more like “treating them the same” => how the data is reported in the study. Whether they’re both against the TOS of the instance you’re on is a separate issue entirely; the problem is that the data doesn’t separate the two categories.

                    Look elsewhere ITT for that exact perspective. Even US law (the PROTECT Act of 2003) treats them largely the same (i.e. in the same sentence), and includes other taboo topics like bestiality, even if no actual animals are involved.

                    It’s completely fine for neither to be allowed on a social network; what isn’t okay is for research to conflate the two. An instance inconsistently removing lolicon is a very different thing from an instance inconsistently removing actual CP, yet the article combines the two, likely to make the problem seem much worse than it is.

              • @mindbleach@lemmy.world · 0 · 1 year ago

                We’re not just talking about ‘ew gross icky’ exclusion from a social network. We’re talking about images whose possession is a felony. Images that are unambiguously the product of child rape.

                This paper treats them the same. You’re defending that false equivalence. You need to stop.

                • @balls_expert@lemmy.blahaj.zone · 1 · edited · 1 year ago

                  Who places the bar for “exclusion from a social network” at felonies? Any kind of child porn has no place on the fediverse, simulated or otherwise. That doesn’t mean they’re equal offenses; you’re just not responsible for carrying out anything other than cleaning out your porch.

                  • @mindbleach@lemmy.world · 1 · 1 year ago

                    We’re not JUST talking about exclusion from a social network.

                    Do you speak English?

                    The subject matter is the part that’s a felony - so the glib inclusion of the part you just don’t like is dangerous misinformation.

                    I am calling out how this study falsely equates child rape and gross drawings, and your neverending hot take is ‘well I don’t care for either.’ There’s not enough ‘who asked’ in the world. One of these things is tacitly legal and has sites listed on Google. One of these things means you die in prison, anywhere in the world.

                    And here you are, still calling both of them “child porn.” In the same post insisting you’re not equating them. Thanks for keeping this simple, I guess.

      • Mark · 1 · 1 year ago

        Oh no, what you describe is definitely illegal here in Canada. CSAM includes depictions here. Child sex dolls are illegal. And it should be that way because that stuff is disgusting.

        • @mindbleach@lemmy.world · -1 · edited · 1 year ago

          “CSAM includes depictions here.”

          Literally impossible.

          Child rape cannot include drawings. You can’t sexually assault a fictional character. Not “you mustn’t.” You can’t.

          If you think the problem with child rape amounts to ‘ew, gross,’ fuck you. Your moral scale is broken, if there’s not a vast gulf between those two bad things.