• theherk@lemmy.world · 2 months ago

    I get this position, truly, but I struggle to reconcile it with the feeling that artwork of something and photos of it aren't equivalent. In a strictly binary sense they are, but looked at with more precision they're pretty far apart. I'm not arguing against it, though; I'm just not super clear how I feel about it yet.

    • Corkyskog@sh.itjust.works · 2 months ago

      It’s not a difficult test. If a person can’t reasonably distinguish it from an actual child, then it’s CSAM.

        • Madison420@lemmy.world · 1 month ago

        This would also outlaw "teen" porn, since those performers are explicitly trying to look more childlike, as well as adult models who merely appear to be minors.

        I get why people think it's a good thing, but all censorship has to be narrowly tailored to specific content, lest it be too vague or overly broad.

          • Corkyskog@sh.itjust.works · 1 month ago

          And nothing was lost…

          But in all seriousness, as you said, those are models who are in the industry, verified, etc. It's not impossible to maintain a whitelist of actors, and if anything there should be more scrutiny on the unknown "actresses" portraying teenagers…

            • Madison420@lemmy.world · 1 month ago

            Except jobs, dude. You may not like their work, but it's work. That law ignores verified age, which is a not-insignificant part of my point…

    • Madison420@lemmy.world · 1 month ago

      So long as the generation is done without model examples that are actual minors, there's nothing technically illegal about having sexual material of what appears to be a child. Prosecutors would then face a mens rea question and a content question: what actually defines, in a visual sense, a child? Could those same traits equally describe a person of smaller stature? And finally, could someone like Tiny Texie be charged with producing CSAM, since she, by all appearances out of context, looks to be a child?

      • Fungah@lemmy.world · 1 month ago

        It is illegal in Canada to have sexual depictions of a child, whether it's a real image or you've just sat down and drawn it yourself. The rationale is that the behavior escalates, and looking at images leads to wanting more.

        It borders on thought crime, which I feel kind of iffy about, but only pedophiles suffer, which I feel great about. There's no legitimate reason to have sexualized images of a child, whether computer generated, hand drawn, or whatever.

        • Madison420@lemmy.world · 1 month ago

          This article isn't about Canada, homeboy.

          Also, that theory is not provable and never will be. Morality crime is thought crime, and thought crime is horseshit. We criminalize criminal acts, not criminal thoughts.

          Similarly, you didn’t actually offer a counterpoint to any of my points.

        • Madison420@lemmy.world · 1 month ago

          The real images don't have to be CSAM, just images of children; it could theoretically be trained on legal sexual content and let the AI connect the dots.