This is more of a two-part question. Should child porn that does not depict a real child be illegal? If so, who is being harmed by it?
The other question is: does giving a pedophile access to "imitation" children give them an outlet for their desires, so they won't try to engage with real children? Or does it just reinforce those desires, helping them rationalize their behavior and leaving them more encouraged to harm real children?
I've heard psychologists argue both sides, but I don't think we have any real-world studies to go on because the technology is so new.
I'm just curious what the other thoughts out there are from people who are more liberty-minded.
So what if someone starts selling it? There still isn’t an actual victim involved, so why should people be effectively punished for it (yes, I consider extra government scrutiny and/or surveillance beyond the norm to be a punishment)? Should we force everyone who buys a gun to be under extra surveillance since they might break some other law at some other point in time? Install government-mandated GPS trackers in every car to monitor traffic violations?
Here's a very simple but likely scenario. Somebody who's keen on AI feeds his model a bunch of pictures from adult websites: a lot of different pictures from random actresses on PH, for example. This may already raise copyright issues, since none of the women explicitly agreed to their pictures being used that way, but that's beside the point. Good, now the model knows what porn is. Now that person takes child pictures from their friend's Facebook, focusing on only one child. The generated porn images will now heavily resemble that one child.
If the model is trained well enough to generate convincing images, how is this a victimless crime?
Right now there is no way to reliably determine whether an image is generated or genuine, and the quality of generated images will only increase with time. We can't simply rely on the kindness of a person's heart to watermark all the images as AI-generated. And even if the images are clearly marked as fake, nothing stops others from using them maliciously against the child and their family anyway. This isn't a hypothetical; it's actually happening right now, hopefully less with children but definitely with celebrities who have a lot of public images available.
The person generating their own porn won't necessarily go out of their way to ensure the anonymity of their generated images. Just as I and many others are often interested in a specific adult actress or actor because they represent features we are attracted to, I'd expect that pedophiles are most likely also interested in specific children. This sort of negates the "no victim" notion. While yes, no direct physical harm is done to the victim, the consequences will likely still affect them and their family mentally and financially for the rest of their lives.
That's also the reason why we have joyriding laws. Nobody is allowed to just get in your running car and go for a joyride, even if they fill up the tank at the end and bring the car back in perfect condition. Technically no harm was done to you, but what if you had an important appointment that you then missed? Who would be liable? Eventualities are always something that laws have to consider, because laws have to serve as a deterrent to the actual act.
In the first case you gave, the fact that it's a child is hardly the relevant aspect; the issue is publishing false and misleading imagery of someone. At least to me, the problem with children being involved in sexual things is that children can't properly give consent, and since we're already looking at a situation without consent (regardless of the age of the person), nothing changes if a kid is involved, whether you think it should be legal or not.
Personally, I lean towards the idea that it should be legal since I don’t support the idea that someone “owns” their own image, and that so long as it isn’t being presented as true information, which would be defamation, people are free to make whatever content they like featuring someone’s image, even if the subject doesn’t like it.
Regarding the example of joyriding, there is harm done. The joyrider deprived me of my rightfully owned property for some period of time, and used it against my interests. That’s a specific and provable harm inherent to the crime. This is the entire principle behind the concept of “conversion”. Even if you rightfully have possession of something I own, it’s still illegal for you to use it in a manner I have not approved of.
I guess this is where our opinions differ, because I lean towards the contrary.
If you rephrase:
To:
taking into account that I see images as intellectual property, you can see where my approach to this problem comes from and why I specifically used joyriding as a fitting example.
True. Our difference in opinion largely stems from how we view intellectual property. Personally, I believe that intellectual property should be extremely limited in scope, such that it only amounts to a limited ability for distribution of works.
I've already had this debate once on a similar topic regarding AI. There are certainly very good arguments for both points of view (especially when it comes to music, where I'm more on your side). I'm ready to agree to disagree.