A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting the danger and ubiquity of generative AI being used for nefarious purposes.

Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic footage as law enforcement led McCorkle, still in his work uniform, out of the theater in handcuffs.

  • BonesOfTheMoon@lemmy.world · 1 month ago

    Could this be considered a harm reduction strategy?

    Not that I think CSAM is good in any way, but if it saves a child, would it be worthwhile? Like, if these pedos were to use AI images instead of actual CSAM, would that be any better?

    I’ve read that CSAM sites on the dark web number into the hundreds of thousands. I just wonder if AI images would be the less harmful option, given the scale of the problem.

    • xta@lemmy.world · 1 month ago

      By the same metric, I wonder why we don’t let convicted murderers and psychopaths work at slaughterhouses.

  • macniel@feddit.org · 1 month ago

    I don’t see how children were abused in this case. It’s just AI imagery.

    It’s the same as saying that people get killed when you play first-person shooter games.

    Or that you commit crimes when you play GTA.

    • KillerTofu@lemmy.world · 1 month ago

      How was the model trained? Probably using existing CSAM images. Those children are victims. Making derivative images of “imaginary” children doesn’t negate the exploitation of children all the way down.

      So no, you’re making a false equivalence with your video game metaphors.

      • fernlike3923@sh.itjust.works · 1 month ago

        A generative AI model doesn’t require the exact thing it creates to be in its training data. It most likely just combined regular nudity with pictures of children.

        • finley@lemm.ee · 1 month ago

          In that case, the images of children were still used without their permission to create the child porn in question.

          • fernlike3923@sh.itjust.works · 1 month ago

            That’s a separate issue from the AI model being trained on CSAM. I’m currently neutral on this topic, so I’d recommend replying to the main thread.

              • fernlike3923@sh.itjust.works · 1 month ago (edited)

                It’s not CSAM in the training dataset; it’s just pictures of children/people that are already publicly available. That falls under the copyright side of AI rather than illegal training material.

                • finley@lemm.ee · 1 month ago (edited)

                  It’s images of children used to make CSAM. No amount of mental gymnastics can change that, nor the fact that those children’s consent was not obtained.

                  Why are you trying so hard to rationalize the creation of CSAM? Do you actually believe there is a context in which CSAM is OK? Are you that sick and perverted?

                  Because it really sounds like that’s what you’re trying to say, using copyright law as an excuse.

    • TallonMetroid@lemmy.world · 1 month ago

      Well, the image generator had to be trained on something first in order to spit out child porn. The training set may have been solely drawn/rendered images, but we don’t know that, and even if the output is in that style, it could very well have been generated from real child porn run through a filter.

  • JaggedRobotPubes@lemmy.world · 1 month ago

    Do we know that AI child porn is bad? I could believe it would get them in the mood for the real thing and make them do it more, and I could believe it would make them go “ok, itch scratched”, and tank the demand for the real stuff.

    Depending on which way it goes, it could be massively helpful for protecting kids. I just don’t have a sense for what the effect would be, and I’ve never seen any experts weigh in.

    • PhilMcGraw@lemmy.world · 1 month ago

      In Australia, cartoon child porn is prosecuted the same way as actual child porn. Not that it answers your question, but it’s interesting.

      I’d imagine the answer to your question is “it depends”: some people who would have acted on their urges may get their jollies from AI child porn, while others who had never considered being pedophiles might find the AI child porn (assuming it were legal) and realise it’s something they were into.

      I guess it may lower the production of real child porn, which feels like a good thing. I’d hazard a guess that there are way more child porn viewers than child abusers.

      • redfellow@sopuli.xyz · 1 month ago

        In Australia, a 30-year-old woman cannot be in the porn industry if she has small breasts. That and the cartoon ban both seem like overcompensating.

        • Queue@lemmy.blahaj.zone · 1 month ago

          Nothing says “we’re protecting children” like regulating what adult women can do with their bodies.

          Conservatives are morons, every time.

          • Cryophilia@lemmy.world · 1 month ago

            They’re not morons.

            Any time anyone ever says they want to do anything “to protect the children” you should assume it’s about control. No one actually gives a shit about children.

    • Maggoty@lemmy.world · 1 month ago

      You’re missing the point. They don’t care what’s more or less effective for helping kids. They want to punish people who are different. In this case, nobody is really going to step up to defend the guy, for obvious reasons. But the motivating concept is the same for conservatives.