• TheAnonymouseJoker@lemmy.ml · 5 days ago

      So if I draw a stick figure with two circles and call it 8 years old, is it CSAM? Will I be arrested for it? Do you see how that dumb logic doesn’t hold up?

      • ssj2marx@lemmy.ml · 5 days ago

        Hot take: yes. All art exists in a social context, and if the social context of your art is “this is a child and they are sexualized” then your art should be considered CSAM. Doesn’t matter if it’s in an anime style, a photorealistic style, or if it’s a movie where the children are fully clothed for the duration but are sexualized by the director as in Cuties - CSAM, CSAM, CSAM.

        • TheAnonymouseJoker@lemmy.ml · 5 days ago

          Glad that it will always remain a hot take.

          The problem with your argument is that no scale or spectrum can be developed to judge where the fake stops and the real starts for drawings or AI-generated media. And since they were not recorded with a camera in the real world, they cannot be real, no matter what your emotional response to such a deplorable act of defamation may be. It is libel of an extreme order.

          Cuties was shot with a camera in the real world. Do you see the difference between AI-generated media and what Cuties was?

          • ssj2marx@lemmy.ml · 5 days ago

            no scale or spectrum can be developed to judge where the fake stops and the real starts

            Ah, but my definition didn’t rely at all on whether the images were “real” or “fake”, did it? An image is not merely an arrangement of pixels in a JPEG, you understand; an image has a social context that tells us what it is and why it was created. It doesn’t matter whether there were real actors or not: if it’s an image of a child and it’s being sexualized, it should be considered CSAM.

            And yes, I understand that this will always be a subjective judgement with a grey area, but not every law needs to have a perfectly defined line where the legal becomes the illegal. A justice system should not be a computer program that simply runs the numbers and delivers an output.

            • TheAnonymouseJoker@lemmy.ml · 5 days ago

              An image is not merely an arrangement of pixels in a JPEG

              I am not one of those “it’s just pixels on a screen” people. But if it was not recorded in the real world with a camera, it cannot be real.

              Who will be the judge? If some automated AI is created to decide, who will be the one creating it? Will it be perfect? No. We will end up in the situation Google caused for its users, where doctors, married parents, and other legitimate people were labelled as pedophiles or CSAM users. It has already happened to me in this thread, and you said it yourself. The only accurate way to judge such media would be a very large team of forensic experts in image and video media, which is not feasible for the amount of data social media generates.

              not every law needs to have a perfectly defined line

              And this is where abuse by elites, politicians, and the establishment starts. Activists and dissidents could easily be jailed by planted CSAM, which in this case would be as simple as AI-generated pictures arriving as temporary drive-by downloads on a target’s device.

              • ssj2marx@lemmy.ml · 5 days ago

                Who will be the judge?

                The same people who should judge every criminal proceeding. Of course it’s not going to be perfect, but this is a case of not letting the perfect be the enemy of the good. Allowing generated or drawn images of sexualized children to exist has external costs to society, in the form of normalizing the concept.

                The argument that making generated or drawn CSAM illegal is bad because the feds might plant such images on an activist is incoherent. If you’re worried about that, why not worry that they’ll plant actual CSAM on your computer?

                • TheAnonymouseJoker@lemmy.ml · 5 days ago

                  Have you considered the problem of doctors, married parents, and other legitimate people being labelled as CSAM users and pedophiles? This is not a hypothetical scenario: it has already happened in the real world, and it has caused real damage to people who are not obligated to bear the brunt of misjudgement by the tools built to flag such media.

                  The argument about planted CSAM is not incoherent; it has already played out against many people, and it is one of the favourite tools of elites and ruling politicians. I am less worried about it only because, thankfully, no such law exists that would brutally misjudge the masses over fictional media.

                  • ssj2marx@lemmy.ml · 5 days ago

                    How many times can I say “social context” before you grok it? There’s a difference between a picture taken by a doctor for medical reasons and one taken by a pedo as CSAM. If doctors and parents are being nailed to the cross over totally legitimate images, then that strikes me as evidence that the law is too rigid and needs more flexibility, not the other way around.