This is really a two-part question. Should child porn that does not include a real child be illegal? If so, who is being harmed by it?

The other question is: does giving a pedophile access to “imitation” children give them an outlet for their desire, so they won’t try to engage with real children, or does it just reinforce their desire, helping them rationalize their behavior and making them more likely to harm real children?

I’ve heard psychologists discuss both sides, but I don’t think we have any real-world studies to go on because the technology is so new.

I’m just curious what the other thoughts out there are from people who are more liberty-minded.

  • LouNeko@lemmy.world · 1 year ago

    As for the first part: since the images AI generates aren’t photographs but somewhat artistic renditions, they can essentially be classified as art. And art containing nudity of minors is already regulated in many countries and a legal gray area in others.

    As for the second part: in my opinion, and the majority’s, no, you can’t have “imitation” children for pedophiles, for the same reason you can’t have “imitation” bodies for serial killers or “imitation” buildings for arsonists. It is not our social duty to provide an outlet to people who can’t function in society. Giving leeway to evil gives it a chance to bundle up and organize, making it harder to fight against - see, for example, drug cartels or the sexual misconduct of the church. Giving an outlet is one step closer to the normalization of any crime.

    I also know that we tend to close our eyes to some people’s pedophilic tendencies, as with Elvis Presley or Charlie Chaplin, because of their “social value”. That is probably why this question comes up far more often than something along the lines of “should it be okay to murder people?”. A good quote comes from the comedian Jim Jefferies: “How talented do you have to be to fuck a kid?” Most famous pedophiles are in part a product of their environment. The entertainment industry is basically crimes on top of crimes: drugs, union busting, fraud, money laundering, verbal, sexual, and physical abuse, and pedophilia. To bring this back to your question, let me paraphrase it: should we give leeway to pedophiles? I say no. I would gladly sacrifice a couple of songs from my playlist or a few DVDs if it meant that the crimes of those people would be undone.

    • MomoTimeToDie@sh.itjust.works · 1 year ago

      no, you can’t have “imitation” children for pedophiles, for the same reason you can’t have “imitation” bodies for serial killers or “imitation” buildings for arsonists. It is not our social duty to provide an outlet to people who can’t function in society.

      I mean, you literally can do those things. Make as many ballistic dummies as you like and do whatever you want to them. You’ll run into some regulatory issues burning down your own building, but that’s only because of concern about the fire spreading, not out of some noble concern for buildings. And if you have an old building on your land and call the fire department in advance, it’s not uncommon for them to arrange training fires, assuming there isn’t much risk to the surrounding property.

      And it’s hardly a “social duty to provide” to not arrest people for something. Nobody is arguing that the government should be wasting money to give these things away - just that individuals who obtain them on their own should not be criminalized, since they aren’t hurting anyone.

      • LouNeko@lemmy.world · 1 year ago

        I’m sorry, I must have misunderstood your question then. I thought by “giving access” you meant something like prescription drugs in health care - professionally made, but distributed with a government subsidy. Obviously I was against that.

        However, I’m against limitations on personal freedoms, especially for something you do on your own time and property. In that case you can go crazy and make your own AI model do the most degenerate thing imaginable, if it pleases you. But we all know that this behavior will not keep to itself. Somebody will start profiting from it, and that, in my opinion, is where I draw the line. As soon as there’s any form of distribution or second-party involvement, the generated “art” should be subject to the law. And by extension, I think any sort of suspicious behavior or expression toward the sexualization of minors should at least be flagged and taken into account should any actual criminal behavior arise.

        • MomoTimeToDie@sh.itjust.works · 1 year ago

          So what if someone starts selling it? There still isn’t an actual victim involved, so why should people be effectively punished for it? (Yes, I consider extra government scrutiny and/or surveillance beyond the norm to be a punishment.) Should we put everyone who buys a gun under extra surveillance because they might break some other law at some point in time? Install government-mandated GPS trackers in every car to monitor traffic violations?

          • LouNeko@lemmy.world · 1 year ago

            Here’s a very simple but likely scenario. Somebody who’s well versed in the AI field feeds their model a bunch of pictures from adult websites - a lot of different pictures of random actresses on PH, for example. This may already cause copyright issues, since none of the women explicitly agreed to their pictures being used that way, but that’s beside the point. Good, now the model knows what porn is. Next, that person takes pictures of children from their friends’ Facebook, focusing on only one child. The generated porn images will now heavily resemble that one child.

            If the model is trained well enough to generate convincing images, how is this a victimless crime?

            Right now there is no way to reliably determine whether an image is generated or genuine, and the quality of generated images will only increase with time. We can’t simply rely on the kindness of a person’s heart to watermark all the images as AI-generated. And even if the images are clearly marked as fake, nothing stops others from using them maliciously against the child and their family anyway. This isn’t hypothetical; it is actually happening right now - hopefully less with children, but definitely with celebrities who have a lot of public images available.

            The person generating their own porn won’t necessarily go out of their way to ensure the anonymity of the generated images. Just as I and many others are often interested in a specific adult actress or actor because they represent features we are attracted to, I’d expect that pedophiles are most likely also interested in specific children. This rather negates the “no victim” notion. While no direct harm is done to the victim, the consequences will likely still affect them and their family, mentally and financially, for the rest of their lives.

            That’s also the reason we have joyriding laws. Nobody is allowed to just get into your running car and go for a joyride, even if they fill up the tank at the end and bring the car back in perfect condition. Technically no harm was done to you, but what if you had an important appointment that you then missed - who would be liable? Eventualities are always something laws have to consider, because they have to serve as a deterrent to the actual act.

            • MomoTimeToDie@sh.itjust.works · 1 year ago

              In the first case you gave, the fact that it’s a child is hardly the relevant aspect; the problem is publishing false and misleading imagery of someone. To me, the issue with children being involved in sexual things is that children can’t properly give consent, and since we’re already looking at a situation without consent (regardless of the person’s age), nothing would change if a kid were involved, whether you think it should be legal or not.

              Personally, I lean toward the idea that it should be legal, since I don’t support the idea that someone “owns” their own image, and so long as it isn’t being presented as true information - which would be defamation - people are free to make whatever content they like featuring someone’s image, even if the subject doesn’t like it.

              Regarding the joyriding example, there is harm done. The joyrider deprived me of my rightfully owned property for some period of time and used it against my interests. That’s a specific and provable harm inherent to the crime. This is the entire principle behind the concept of “conversion”: even if you rightfully have possession of something I own, it’s still illegal for you to use it in a manner I have not approved of.

              • LouNeko@lemmy.world · 1 year ago

                Personally, I lean toward the idea that it should be legal, since I don’t support the idea that someone “owns” their own image, and so long as it isn’t being presented as true information - which would be defamation - people are free to make whatever content they like featuring someone’s image, even if the subject doesn’t like it.

                I guess this is where our opinions differ, because I lean towards the contrary.

                If you rephrase:

                The joyrider deprived me of my rightfully owned property for some period of time, and used it against my interests.

                To:

                The deepfaker deprived me of my rightfully owned property for some period of time, and used it against my interests.

                considering that I see images as intellectual property, you can see where my approach to this problem came from and why I specifically used joyriding as a fitting analogy.

                • MomoTimeToDie@sh.itjust.works · 1 year ago

                  True. Our difference in opinion largely stems from how we view intellectual property. Personally, I believe intellectual property should be extremely limited in scope, amounting only to a limited right to distribute works.

                  • LouNeko@lemmy.world · 1 year ago

                    I’ve already had this debate once on a similar topic regarding AI. There are certainly very good arguments for both points of view (especially when it comes to music, I’m more on your side). I’m ready to agree to disagree.